Engineering education for the future
[Elanor] What I could see happening was that our world is travelling in a very particular direction. And to my great delight, my Vice-Chancellor and I had a meeting of the minds in terms of a firm view about what is the role of university and what is the role particularly of a national university, in terms of bringing about change. And at its heart, as a university, we have a responsibility to bring about a better future.
[Kiara] Hello there. And we opened with the words of Professor Elanor Huntington, Dean of the College of Engineering and Computer Science at ANU, or CECS as it's known, which also hosts this podcast, Reimagine STEM. I’m Kiara Bruggeman, a biomedical engineer here at ANU, and Reimagine STEM is a collection of four themed discussion episodes plus 16 rich and deep interview episodes, featuring some of the best and brightest thinkers about the future of engineering both here and overseas. It was recorded in 2019 at the CoDesign Culture Lab on Ngunnawal and Ngambri Land, to whose Elders past, present and emerging we pay our respects. Our goal is to explore the key themes and big ideas for our collective tomorrows in STEM. Today it's all about engineering education. ANU is part of a global movement to dramatically reshape our educational approaches.
Elanor kicks us off with her vision for a revamped College. A significant reconsideration from the ground up. But we're also going to figuratively visit the UK, where University College London has been connecting students and practical projects from the get go, and go to the US to see how a grade-school program prepares potential engineering undergraduates. So before we get back to Elanor, let's introduce our main guests.
[Abel] I am Abel Nyamapfene. I'm a Principal Teaching Fellow at UCL. I am an IEP fellow.
[Kiara] Awesome. And Abel is part of the groundbreaking Integrated Engineering program at University College London, which is working to break down the barriers between professionals and communities.
[Lesley] So, yes, I’m Lesley Seebeck, I'm Professor in the Practice of cyber security and CEO of the Cyber Institute at the ANU.
[James] My name is James Sedgwick. I work for the Cyber Institute and I'm trying to put together a Masters program, which is a little bit different to everything else.
[Callie] Hello, my name's Callie Doyle-Scott. I am a writer, gamer and storyteller. I've been designing role-playing games for oh, I think it'd be around 17 years now.
[Kathi] I'm Kathi Fisler. I'm originally from New York in the United States. I'm now a research professor and associate director of the undergrad program for Computer Science at Brown University.
[Kiara] And with Kathi, Shriram Krishnamurthi.
[Shriram] And I am Shriram. I am a professor at Brown and I'm originally from India and stuck now in a country that doesn't play any cricket. My interests range across a whole bunch of areas of computer science, but education has been a longstanding passion.
[Kiara] So ANU's Reimagine initiative is a not insignificant investment, both financially and intellectually. Dean Elanor Huntington:
[Elanor] So we're going to quite consciously throw away the old disciplinary boundaries, but without attempting to appropriate, to step into the social sciences and humanities as well as the sciences, and allow those boundaries to blur somewhat, but do so respectfully and with an understanding that there is just as much importance in qualitative theory as there is in quantitative theory and practice. And the questions that I think we're answering are threefold. They are: what are the engineering and computing skills of 2050? Who will be exercising them and how? And I think the answers to those questions in 2050 are not the same as they are now, and nor should they be. And I'm quite careful about talking about engineering and computing skills, not engineers and computer scientists, because I think there are a set of skills that a whole bunch of people are going to be wanting to invoke if they actually want to engage in our world going forward, and engage constructively in our world going forward. And it's about reconceptualising, or reimagining, where the boundaries of engineering and computing are. And increasingly, we need to be thinking of ourselves as being positioned as the people who bring together deep scientific knowledge and understanding about society, people and humans, as well as thinking about things in terms of technological systems. And as I said, you also want to think about in what way you bring those together, and what you are trying to do by so doing. That's where the imagination comes in. So I think I'm the only Dean of engineering in the world who has three cultural anthropologists, two sociologists and a digital photographer. And I'm gonna go harder and get more. We've already got a bunch of, well, I'm a lapsed scientist as well, we've got a bunch of lapsed scientists around here already.
And so, this university stretched the engineering accreditation processes to the very limits nearly 30 years ago in getting a systems engineering degree accredited in the first place. Last accreditation round, we were told that that's fabulous, this is the way the sector should be going. It's just bang on the money. So that says that we're not being sufficiently disruptive. Time to go again. So we're going to start some conversations around what that means for our undergraduate pedagogy in terms of the way that people learn. But I think we actually need to go further and we need to start really seriously contemplating what it means for a journey of lifelong learning and the fact that most people these days already have undergraduate degrees.
[Kiara] In the second half of this discussion, we'll talk about ANU's genuinely disruptive new cyber Masters course, and about an incredible role-playing game that tackles the concept of ethics, both within an educational framework and by allowing students to confront their own assumptions. Trust me, you won't want to miss hearing about it. But first, ANU is not the only university looking to change things up. University College London has been experimenting with marrying engineering and industry in a robust and grounded way, educational flipping and sociocultural awareness. Abel Nyamapfene from the Integrated Engineering Program has a vision of the ideal graduate:
[Abel] Certainly quite different from the current generation of engineering graduates. It has to be an engineering graduate who is passionate about engineering, passionate about the world, passionate about sustainability, passionate about the human race as a whole. So you are kind of looking at a person who says, ‘I want to look at improving the world to make it better. I want to change the world for the better.’ And currently, that is a distance away from the econo-centric mode of engineering education that we have been implementing.
[Kiara] And I think one of the approaches you've taken within your program is using the flipped classroom or flipped lectures. Can you tell me a bit about that?
[Abel] Yes, the flipped classroom. It's a new concept and it really turns the process of education on its head. Look at our students as they come in. They are expecting that when they come in, they have to be fed with knowledge, sort of like you have a container, a cup, and you’re filling it with water. That is how I think the education process has been. With the flipped mode it’s slightly different. It says the process of education is not a passive process. You are participating in it to learn. You are participating in it to create knowledge. You are no longer just a consumer of knowledge. You are a co-creator of knowledge. So what the flipped learning model says is: here are materials, start off with these, lean on these, come to the classroom, engage with others, co-learn, co-create with others, and the lecturer there is a participant alongside you. So the lecturer is no longer ‘the sage on the stage’, as they say, talking through, but we are actually working together on problems.
[Kiara] Another program you're working on is the engineering exchange. Can you tell us about that and how that aligns with the integrated engineering program?
[Abel] The engineering exchange basically is a means for our students to go out and work alongside communities, addressing community problems, be it within the locality of London, be it within selected places across the world, in Ethiopia and other places like that. The important thing is that through such a process, students go out and realise that ultimately engineering is not about technology only, but about harnessing technology for the benefit of society. I can go back again to plastic pollution: very useful plastic packaging, very convenient in the developed world. But what are the consequences? They have an impact on islands perhaps 6,000 miles away from London. So it's now a globally connected village, and our students have to really understand that global nature, that networked aspect of engineering.
[Kiara] So how does your integrated engineering program work to get students to have that sort of mindset?
[Abel] Ok. When our students come in, they no longer learn theory for the sake of theory. They have to understand that theory, its consequences, its applications to the world. So when they come in, they start off with a project. They do a stream of projects that have to do with the environment. They have to deal with addressing some of the big issues that the world is currently facing, be it health or environment or something else, and it becomes part and parcel of what I might call their genre, their make-up; they grow up with it. They develop with it. Actually, if you look at the current generation of students coming in, they are sensitive to things like climate change. They are sensitive to things like pollution, plastic pollution. We can see the revolutions led by youngsters that are currently taking place in the world. So you realise that we have actually changed our engineering programs at UCL to cater for that new generation of students.
[Kiara] When we talk about starting your students right off with projects, are these real world problems? Is that what you're integrating right off the bat?
[Abel] Yes. The first thing: when you look at how engineering used to be done, people learnt the theory and then immediately, if you were an electrical engineer, they would say, ok, come up with the design of an amplifier. But you were not told where the amplifier was going to operate. You were not told about the consequences or the types of materials that you could use. So when we start off our students at the very beginning, they go through the conceptual phase of thinking about projects, of thinking about product design. They look at all the big issues surrounding the design. They look at the stakeholder issues. They look at its impact and its consequences. That is the first stage. From there, having developed that sound footing, we can then move on into what you might call more tangible outputs, projects aimed at producing actual physical outputs.
[Kiara] And how did the students respond to that? Are they daunted or do they thrive in that?
[Abel] Initially, there is shock. When they come in, we typically have two streams of students, those who went through the maths, physics, chemistry. What they are expecting is to be applying their physics, chemistry, without thinking much about the wider aspects of engineering, and we are telling them, actually, the engineer is already inside of you. All we are doing is helping you discover what kind of engineer you are.
[Kiara] Kathi Fisler and Shriram Krishnamurthi from Brown University in the US co-direct a project called Bootstrap, a computing outreach program for the grade school system, which of course is where it all starts. Part of the challenge of the design was making it extremely flexible, so it could accommodate a wide reach geographically, socially, culturally and financially.
[Shriram] And in fact by design, by intent, we go into a very, very broad range of schools. We don't work with just, you know, rich private schools and things like that. So, you know, in fact, we're heavily featured in public schools and schools that are in rural areas, which means they have a limited amount of resources, maybe are in inner city schools and so on.
So you can have very dedicated teachers at every one of these places. But if you expect every single teacher you work with to be extremely dedicated, extremely qualified and so on, that just misses the reality of their lives, right. So if we put too high a burden on them, what will happen is that the curriculum will not be implemented well, with high fidelity, and then everybody suffers, right? Your research lab outcomes suffer, but so do the students and so do the teachers. And everybody ends up unhappy.
[Kiara] Can you speak a bit about how you designed this program to be so adaptable to different student groups, not just private schools?
[Kathi] Part of it is we designed it as a module, which means teachers can decide what pacing they want to use going through the material. We design it with a lot of optional activities, so someone can decide what their ending points are going to be. So when we are designing some of these materials, we are putting in a segment that speaks to a particular learning outcome, say in mathematics, and then the next segment builds on that in a way, and the teacher can choose to do the next segment or they can choose to stop after the initial segment. We also do a lot with including paper-and-pencil work because in many schools, at least the ones we have worked with in the United States, the teachers may only get a computer cart one day a week. What are they going to do the other four days a week? So designing your materials and your expectations about access to technology in flexible ways gives additional means for teachers to customise how they're going to use what you're providing to their contexts.
[Shriram] And I think there's more we can say about this, too. Very early in the project, we decided we wanted to set what our goals are, and we set three goals. We said they're going to be equity, rigour and scale. And I think it's important to parse those terms a little bit. So equity is we want to make sure that all different student populations are welcome; rigour is we want to teach something that has real content; scale is we want to make sure that we remove impediments to working at very large scales. We also don't want to make curricular decisions that will make it hard to work at large scales. So a good example of that is we could very easily have had, but have chosen not to have, a module on robotics. Because robots are expensive. Not all schools can afford them. Robots break down. A common phenomenon you'll see is that schools buy all these robots, or they'll get a donation of robots, and they don't even think about the ongoing cost. The robots start to break down. The teachers are maybe not electrical engineers. They're not trained to maintain the robots. They don't have a budget to go fix the robots. And after a while, they sit in a closet and they're not used.
[Kiara] But how did we come to the place we currently occupy as engineering and computer science faculties? Given our experimental, curious, forward-facing disciplines, how did we become so traditional and, for want of a better word, so mired in mainstream ideas about education methods?
[Abel] There are multiple reasons for that. The first consideration is that, if you look at it, engineering started off as a craft industry. So you'd probably be in a village, in a small location. You'd go out and start working with someone, you'd get quite competent at something, and you'd develop that way. Then came the realisation that actually, there are more efficient ways of passing on knowledge to others. And that is the school system. The university system, of course, comes in. The need for professional recognition and all those kinds of aspects. We have the Second World War coming in, which emphasised engineering, the sciences, physics competence, and we sort of went that route. The worst part which came in: it was good having the theoretical part, but we suddenly ended up with a generation of academics who had never had industrial practice, but who were teaching students who were going to work in industry. So it became more science oriented, disconnected from practice, and all we are doing is correcting that aspect. And of course, refocussing engineering education to address current societal problems, as well as problems that we anticipate we will be dealing with in the future. We are going back to the foundation of professionalism in engineering. So what happened was, engineering was a craft industry, and then engineers in and around Europe, in the UK, in London, started thinking: actually, we could do more about educating ourselves, about improving our educational goals. So they met in London, and that was the formation of the Institution of Civil Engineers. That was the first professional engineering body in the world. So that was an achievement. But there was one key realisation. They realised that, yes, they were coming from a craft background, and they anticipated that they needed to master theory, but theory could never be separated from practice.
So the important contribution of the conception of professional institutions was that theory has to work hand-in-hand with practice. Theory needs to be integrated with practice, in the education process and in practical application as well.
[Kiara] This is where ANU is looking to upset the apple cart, or if you prefer, to steer it onto another track entirely. One of the areas of experimentation we're discussing here on Reimagine STEM is the Cyber Masters course, run by Professor Lesley Seebeck.
We may all have an idea of what cyber is. Mine is mysterious and perhaps a little James Bond, I admit. But here's Lesley to explain:
[Lesley] So it has long antecedents. In more recent times it's often cyber, or cyber security. I often say it’s the dark side of digitalisation, which means I get to have all the best lines. But it is how we actually make our systems safe. How we make them secure. How we make sure that they're accessible. How we make sure that the data that we're using, that we require and are relying on more and more every day, is accessible where we need it, in the right way.
[Kiara] Okay. So let's start with a bit of background about this Cyber Institute, about the research you do. What's your primary objective?
[Lesley] It was started with the premise that cyber is not just about technology, nor is it just about security. And that also reflects my own experience. If I went and sort of talked to people about security, I'd lose their interest very quickly; I have to talk about what's enabling their business, what makes it work, what makes them tick, what they really want to do in the world. And most of the problems actually aren't about the technology. Don't get me wrong. The technology is not easy. It's very hard. And we have to do a lot more work to try and build those systems which are resilient, which are secure, which are less vulnerable, which we can rely on. But most of these things have to do with human behaviour: how we behave online, how others behave towards us online, how we resource our systems, how we think about risk, those sorts of issues. And that stems, these days, from the individual, how you manage your own data, right up to the geopolitical.
[Kiara] Of all the disciplines that need to stay up to date, you'd think, with its shifting landscape of trolls, hackers, spies, malware, VPNs, firewalls, Trojans and ransomware, cyber would be the one where it's almost impossible to stay ahead of the game. So Professor Seebeck needs to introduce a whole new model of education.
[Lesley] So my goal, and again I draw a lot on antecedents and things that I've seen work well in the past, my own experience, is that I would like to do for cyber what the MBA did for management. If you go back 100 years, Harvard created this degree, which no one had done before, pulled together all these things and said ‘here is a holistic way of thinking about management’, and they used case studies. So you have the experiential learning. And they used things like student learning groups, because people tend to learn more from each other than from others. So along the same sort of idea, we'll actually want to have something that's fully micro-credentialed, so you can take these in bits as you need. Very intensive, immersive. Learn from each other. And then you can start pulling these together to build a degree that is completely relevant, with industry, with stakeholders, with governments. This gives you the best of both worlds. Eventually you’ll be able to do that, or you can take bits and pieces; you can take something like ‘thinking like a hacker’, a one-week intensive mode to understand how hackers think, what drives them and what their motivations are, and how you might attack a particular problem.
[Kiara] Well, among the many new approaches being trialled at CECS, the Cyber Institute is developing a seriously innovative venture, a collaboration with writer and gamer Callie Doyle-Scott: a role-playing game called Logic Error Detected. It's all about training an artificial intelligence entity in ethics. Now, I participated, along with producer Gretchen Miller, in a practice run, and that experience stuck in our minds for days.
[Gretchen] What I found interesting about the experience was getting so completely sucked into the whole engagement. This is life or death. We're on an island. We don't make the right choice, all these people are going to die and so are we. And I really, really like it. Some people hate it, but I really like an opportunity to go ‘Hang on a minute. All those assumptions you hold, let's just throw them up in the air a bit and see what other options there are for positioning.’ I guess I kind of love those challenges to renew my thinking.
[MICA] It is a pleasure to meet you. I am MICA, a massively intelligent, calculating automaton in charge of managing the All Life Rehabilitation Centre. In order to further explain today's exercise, I will now play an excerpt of a promotional film. Please listen closely.
[Promotional video] Located 500 kilometres off the eastern coast of Australia, the All Life Rehabilitation Centre represents the next step forward in national border protection, designed to protect our country from threats both domestic and international. By participating in the program, even the worst of offenders, from career criminals to illegal immigrants, can reinvent themselves as model Australian citizens. In addition, the centre is a completely secure, state-of-the-art closed system, designed to repel cyber security breaches and run autonomously by cutting-edge artificial intelligence that both keeps the centre secure under any circumstances and brings out the best in every prospective citizen.
[MICA] While my processing core is state of the art, I find myself unable to resolve certain conflicts and make certain judgements without input from an outside source. In order to fix this operating error, you have been selected from among your peers to assist me in developing a moral and ethical framework that I may draw upon to assist me in the running of the centre.
[Kiara] Essentially, a small group works in real life together to ethically train the AI, played by Callie, in a scenario which has life and death implications for thousands of people. The answers the group give to a set of simple questions teach the AI how to handle future decision making. The game's intention is to demonstrate the subtle challenges of the task, but it's also about challenging our own daily automatic decision making and assumptions around ethics.
James Sedgwick is educational design manager at the Cyber Institute:
[James] It's an opportunity for us to put learners through an experience where they can question some of the underpinnings of what they believe to be true. So in its current format, we're going to be using it as a way of having conversations about ethics and behavioural norms, how we look at the world, how our biases change the way we make decisions, and then use it as a way of basically getting people to step outside themselves and think about things from a different point of view.
[Kiara] What is it about the role-playing game format that allows people to engage with things like empathy and morality in relation to cybernetic systems differently than they would otherwise?
[James] They come out exhausted. But they come out having a conversation. Ok, the first question is ‘what have I actually learnt?’ And the second is, ‘ok, there's a set of assumptions that I've made, because of whatever it is, having grown up, this is what I believe.’ And then they have an experience against which they can assess those in a way that they haven't before. That's the executive step.
[Kiara] Here's game creator Callie Doyle-Scott:
[Callie] The difficulty, especially with computers, is that a computer system will only ever do exactly what it is told. Computers have no nuance whatsoever. They can't process nuance. Maybe they will be able to in a couple of years or a couple of decades, but at the moment they can't. So if you tell a computer to do something, then it will do that thing unconditionally and to its logical extreme. Therefore, the accountability for morality and ethics within cyberspace lies with the user, with the humans on one end or either end of the system. We as humans are able to imply and infer and teach through implication. Our method of communication is in many ways very indirect, and we communicate through a lot of different media. Verbally, of course, but also through body language, through eye contact. A lot of what we say to one another actually goes unsaid. Whereas with a system, with a computer, when you're coding a computer or a system, you have to physically type in what you want it to do, or you have to tell it directly what you want to do. A computer won't see the look of apprehension or reluctance in your eyes, or be able to watch how you're sitting or how you're standing, or be able to hear how you're saying what you're saying to it. It will only see the literal interpretation of your words.
[Kiara] But how do we tell a computer what to do? Modern AI systems don't learn only from explicitly typed instructions. In our program on Indigenous thinking, Angie Abdilla from Old Ways, New points out that even the unconscious patterns of our own behaviour can be taught to artificial intelligences, including culture-based assumptions and biases. Computers see patterns of behaviour, even the ones we'd rather pretend don't exist, making ethics and AI a really confronting topic. And speaking of being uncomfortable, what was it like, as a creative artist, a writer and a culinary historian, to work with the cyber people?
[Callie] In a word, terrifying. Though seriously, the Cyber Institute has been absolutely wonderful to me, and working with you all has been extremely rewarding. I have often felt nervous. I have never felt out of place or unwanted. And being exposed to this world is something that I don't think ever would have happened to me otherwise. It's very different in so many ways, but at the same time it's also much easier to understand than I thought.
[Kiara] So is it about teaching AIs about ethics or is it about getting us to really assess our understanding and interpretation of ethics?
[Callie] You've hit the nail on the head. Teaching ethics to this AI is a way of exploring our own understanding of ethics, and how we apply those ethics in a space where ethics aren't necessarily as well understood.
[Kiara] And just finally, before we wrap up, I'm wondering if you have any unexpected outcomes or moments from the path of creating this game or trialling this game. What has been the most surprising or interesting for you?
[Callie] Oh, my goodness. The most surprising or interesting? One of the most rewarding would definitely have to be one of the first times I played through the game. It was the moment where I knew that I had everyone hooked. And it had to do with a certain amount of segregation within the community that the AI was looking after, and the efforts of the players to correct that problem. And it turned out that the system, or rather, the people who had programmed the system, had either not accounted for or were actively trying to forget about a certain subset of the population. The players arrived at this conclusion and asked the AI, ‘Well, what about these people?’ And the AI responded extremely coldly: ‘That flag does not exist within my database.’ And the look of shock and horror upon my players’ faces. There was silence for a good five to ten seconds as they processed this, because in that moment they realised that the people who had programmed this AI had effectively erased these people from existence, because to the computer they did not exist. The computer had been told they did not exist. Therefore, they did not exist. This was not a human, who could be told that something didn't exist and go, ‘Er, but they're casting shadows and they're talking and they're obviously over there; they obviously exist.’
No, to this computer, they did not exist. That hammered home to them, I think, the need for accountability in ethics, especially when talking to systems; it all comes back to the humans who programmed these systems. The computer itself in many ways is blameless. It is not evil. It's not a monster. If there is any blame, it rested, and rests, with the programmers. And that ended up spurring an absolute crusade in that particular game. It was a warpath of revolution that I could barely keep up with, and it ended with an ending that has not been achieved since and was probably simultaneously one of the best and worst outcomes.
[Kiara] Awesome. Well, I can certainly understand that concept of getting really wrapped up in the game and feeling things really strongly as we learn from the computer, because it's a very powerful game.
[MICA] The All Life Rehabilitation Centre is now stable.
[Kiara] Thanks, MICA.
[MICA] The logic conflict has been resolved.
[MICA] Yes, Kiara?
[Kiara] How was the logic conflict resolved?
[MICA] 19,995 entries reclassified.
[Player 1] Every month?
[Player 2] Has anyone ever completed the training, rehabilitation process...?
[MICA] Zero individuals have completed the rehabilitation process.
[Kiara] So how did we reclassify the 19,000?
[Player 3] Well, how long do we need to? We've got turbines. We know the shipping routes...
[MICA] Food crisis resolved.
[Player 4] Have we just killed 19,000 people?
[MICA] 19,995 people removed from database.
[Player 4] Were they removed from the island? The Centre?
[Player 4] Were the 19,995 people killed?
[MICA] They were classified as non-human.
[Kiara] Lesley Seebeck:
[Lesley] So this is one of the reasons I want to make sure that there is an emotional literacy component to my program, because you're gonna be confronted by very difficult decisions, and knowing your own strengths and weaknesses is gonna be very important. And that can be a very, very confronting game for some people. If you're in the public service, you're often distanced from the people whose lives you are affecting directly. And this brings it quite close and personal. And that's a good thing.
[Kiara] And where will this game sit in the context of the Cyber Institute?
[James] That's a really good question. And it's one that we're struggling with at the moment. Not because it's difficult, but because the opportunity is so great. We'll definitely be using it as a way of having conversations about ethics and social norms. We will definitely be using it as a mechanism to bring that type of collaborative gameplay to a broader audience, particularly through our professional-development type courses. But after seeing what it can actually do and its ability to be recast into multiple different spaces, we're seeing it pop up naturally right across the Masters program. Technical, policy thinking, all sorts of aspects of the course are likely to draw on this. So we're just trying to figure out where it has the most impact.
[Kiara] The game is part of a completely new way of gaining credentials as an engineer, computer scientist or cyber student at ANU, and that's incrementally, in discrete units called microcredentials.
So can you tell us what it is about microcredentials that's so exciting or appealing?
[James] So the first is that we can attract as many people as possible to the Masters program in a way which is less demanding than a normal Masters program. So people can come in at any point in their career, any time in their life, to do the units applicable to them, get microcredentials which say that they've satisfied these conditions across these units, and then be able to build on that slowly, as they need to, across their working life. That allows us to do things like create an environment where continuous professional development is more readily available, because at the moment it seems to be very private-sector specific. This at least allows us to provide something to the client which is of value because it is from the university; it has that stamp behind it. The second thing is that it allows us to build courseware that pulls together all sorts of different streams into something which is more holistic, I guess, in terms of cyber.
[Kiara] And you think microcredentials might be a little bit more future-proof in a field like cyber, where if you had to redesign the whole degree every time something substantial changed, it would be once a month?
[James] Absolutely. It's one of those environments where what you've done today is of absolutely no value to you tomorrow. That gets us round the problem that most universities have in this space: that they build a thing and it sits and it atrophies. We don't have that luxury. We need it to be iterative and dynamic and responsive to whatever's going on in the world.
[Kiara] And how does industry feel about the microcredential approach?
[James] They are all for it because they take people out of their workplace, put them into a Masters program and they lose them for a year, two years, however long it takes them to get through. This allows them to pick people up, place them into an educative environment, give them short, sharp bursts of training and education, and then walk out the other side with a recognised microcredential, which demonstrates the capacity to do something that's of great value to them.
[Kiara] But, unusually these days, you may not be able to get those credentials online. Online learning became hugely popular for a while, and for good reason. But there's lately been a push to really examine its aim and purpose. Is it always successful?
[James] It's why we're not necessarily going to have an online complement to the course. Even though it's a cyber course, we want people in the room, we want people discussing with each other. We want people building those networks and communities: a network response to a network problem. It's collaboration, community, that kind of ecosystem that we're trying to foster.
[Kiara] Sort of the appropriate use of online: just because you can doesn't mean you always should.
[Kiara] Euan Lindsay is Director of Engineering at Charles Sturt University, and a considerable proportion of his department's approach to learning is absolutely online, modular and project-centred:
[Euan] Normally, engineering degrees are carved into four subjects per semester, and each of those is about 120 hours' worth of work. What that means is that the next subject that draws upon that probably doesn't draw on all 120 hours. If it draws on 60 of them, you've still got to do that 120 to be ready for the next piece, even if you don't need the other 60. The structure we have at universities, where we synchronise it, we put you in large rooms, we make you all do it at the same time at the same pace, is the fossil fuels of education. It is absolutely what we needed at the time to scale up from just the monks in the monastery reading the books out to non-monks, to 20-25 per cent engagement in higher ed. But now it's killing us, and there are better ways to do it. So what we've done is break them down into three-hour modules, allowing people to progress through those concepts in the order that they need them, not the order that we want to sequence them, at the time when they need them, and at the pace that they need them. So if you are working on a water treatment project, you want to be able to work your way through topics on water treatment, coagulation, flocculation, sedimentation. You don't have to wait 'til next week for the next bit and the week after that for the bit after that. So what that means is that they can navigate the curriculum. They know why they're doing things, because it's laid out. The engineers in residence can tell you, 'go and find the Manning equations for this trench', or you can look it up: 'Manning equations, mrrr mrrr mrr... Here it is, all there is on it. What do I need?' And so what we have done is said, well, we don't have to be held back by how we've always done it, because we haven't done it before. And so when you've got a situation where people are used to learning online, they are used to seeking information as they require it. They're used to pausing and repeating, pausing and repeating. Everybody sitting in a lecture is at the wrong speed.
It's too fast or too slow, and it can be both. And if you give lectures these days, you can see the students' thumbs twitching because they want to hit the skip button. And it doesn't work on a lecturer, because it's live. That's not how anything other than your time at uni works anymore, and it's not how you will practice as a professional engineer. That's why we built it that way.
[Kiara] But there is a bit of pushback to the idea of going in, grabbing what you need quickly, and then heading back to work in industry. Professor of Design at UTS, Cameron Tonkinwise, urges us to learn slowly and to take pleasure in the leisurely doing of things. You may have heard of the slow movement, which is bubbling away at an appropriately measured pace; perhaps engineers have to learn how to encourage slowness, rather than the most efficient solution to everything.
[Cameron] Any type of learning is a moment in which you pull yourself out of directly doing something and try to act in a way in which you're redirecting your life. But that itself is a type of slowing. So I think one of the first things to do is to encourage everybody to learn a lot more. Now, that's not the way in which Australia structures its society; it makes it difficult to do postgraduate education because it's all fee-based. It's expensive, it's time-consuming. We're not set up to do it. When we do, we just give you small online modules and hope that you can fit it in after doing the washing up of an evening or something. So I do feel like this argument I'm trying to make is a bit of a zag to that zig of trying to fit everything in, a bit like the argument that we're going to be able to do something about a civilisational challenge like climate change by just fitting it into business as usual, which means it's just going to be an add-on. And there's no small modular course that you're going to be able to do as a microcredential that's going to help you redirect your employer so that they break with shareholder value in order to invest in whole new kinds of business models. So on the one hand, no, we need to actually do restructuring that is going to allow people to take time, and that spending of time itself will be a kind of contribution to reducing the extent of climate change. On the other hand, you know, you have to start somewhere. And as an educator, I think I have an idealistic faith in the gateway drugs of education. If somebody comes in for a microcredential, I think I can squeeze a question in there that will trouble them, and maybe cause them to want to begin to work out how to reinvest their life in asking that question more concertedly. That just might be hubris, and sort of quite narcissistic.
So I do have a problem with everybody rushing for the 'fit-in' model, because I think you have to actually think about education as a restructure: a restructure of your life, and a capacity to learn how to do major restructuring of our whole society. So to think about it as a squeeze-in is sort of defeating the purpose. It's like you've totally mixed your metaphors there.
[Kiara] So would it be contrary of me to question how these two approaches can be combined? Euan Lindsay:
[Euan] Again, just to be very clear, the underpinning technical content is delivered online, because that's a more effective way of delivering things. Standing out the front of people talking at them is not the most effective way of delivering things. You want to give people the most flexible environment you can. People are listening to this podcast wherever they find it convenient, at whatever speed they want. Hopefully not chipmunking me. But for delivery of new material, the de novo delivery, that's much better done in that asynchronous place, and the online environment has the affordances to allow that to happen. But our program is not an online program. It is always face-to-face. In the first three semesters you're face-to-face with us in Bathurst as a student engineer. In the four years after that, you're face-to-face in a real environment as a cadet engineer. And so that identity transformation, that becoming of an engineer, that letting go of the other ways of thinking and emerging into the engineer ways of thinking, that is absolutely done face-to-face. And as Cameron has said, you need the time. You can't rush it. You need multiple touches, in a spiral of 'we talked about this with a novice understanding. Now we're gonna come back to it later with an intermediate understanding. Now we're gonna come back to it with a professional understanding.'
[Kiara] So, where to now for the future of all these different strands and educational approaches? What next? Lesley Seebeck:
[Lesley] Longer term, as Elanor put it to me: 'What is the science of cyber?' What happens when you start intersecting cyber with, say, synthetic biology? What would a world of complete openness look like, where everyone knows everything about each other? These are some of the big issues that we need to really start thinking about and feeding into debates like public policy, because at the moment a lot of it is that short-term, immediate reaction to problems. So as you say, cyber is a very noisy field, and we need to try and break through that noise and look at those longer-term things.
[James] I've been in the public service for 20 years doing this kind of stuff, and it's refreshing to be able to start with a clean slate, do something new, do something incredible, and then build onto the back of it. And it changes the way that you train. Our ethics and social norms course was going to be ethics and social norms done by the playbook. Now that we've got this game, instead of injecting it into a course, we hope to be able to build the course around it. For a lot of people it can be so far outside what they've experienced, they will not be able to help but think, 'OK, what just happened? But did I just... but I thought that... and over here this... and maybe I need to go back and have a think about this.'
[Kiara] And after five years at University College London, how are the graduates faring for Abel Nyamapfene?
[Abel] Our students, by the time they get to the third year, they've done probably 16 to 19 projects. They've worked with people, they've communicated with stakeholders, they've communicated with practicing engineers from industry. So there's a lot that they can speak about. But what I would like to say is, the IEP connects these students to real engineering practice. And when they do go for the interview, the interviewers are looking for students who are connected to engineering. And we've made that connection, but we cannot rest on our laurels. We cannot say 'job done.' Like anything, any product, any working process, it's a continual focus on improving, a continual focus on seeing the communities changing. The world is changing, and in the same way, engineering has to adapt. If it cannot adapt, it will not serve a meaningful purpose to mankind. It will not serve a meaningful purpose to the world, to the universe as we have it.
[Kiara] What a fascinating series of interrelated ideas. Of course, just part of a global conversation around innovation that inspires us at ANU. Don't forget, you can hear our guests in full, and find some helpful links, on the Reimagine STEM website and in the podcast show notes. And there are three other big discussions you'll find there, too: on Indigenous knowledge, on engineering for social benefit, and on diversity and gender in engineering and computer science.
If you have enjoyed the show, please do share us with your friends and colleagues, like us, and leave us a review on Apple Podcasts. I'm Kiara Bruggeman. The team is writer and producer Gretchen Miller, sound engineer Nick McCorriston, and executive producers Maya Havilland and Dan Etheridge. From all of us at Reimagine STEM and the ANU College of Engineering and Computer Science, see you next time.