This is the second in a series about AI questions that have come up while I’ve been exploring another facet of the topic. In an effort to make sure I am focusing on the question I am meant to be pursuing, I’m writing up some short pieces with other questions. I might come back to them some day, but for now it’s working to get them off my chest. You can read the first one about visual representation of AI here.
AI. You know, for kids.
In exploring the public perception of AI, one of the many things that comes up is ‘teach kids as soon as possible how to code’. Something I broadly agree with, if younglings seem inclined. But can this really stretch to AI?
One nice aspect of teaching code to kids is that there are lots of online places now where they can learn either safely and independently or with adult guardians. They’re pretty cool and having witnessed a six-year-old knock up a simple game, the results are fun enough that they could spark an interest that stays with young people into higher education.
The issues that come up when talking to teachers are often based around background and opportunity. Basically, check your coding privilege. Personally, I was really lucky: my dad taught me to code in BASIC when I was around seven, and it meant it was never something I thought of as ‘too hard’ or ‘not for me’ based on gender or age. So far, so nerdy.
But not all parents are as nerdy as my dad (sorry Dad), nor do they all have the time and equipment at home to teach a kid to code out of hours. There could also be a skills challenge. Even having done degree-level mathematics, I’ve seen kids’ maths homework that made me nervous. So that could mean training parents as well as kids. Which could be nice, but are there enough hours in the day?
Shipping code in schools
So, if we take this to schools, there’s plenty of reporting on changes in the UK that bring code to kids and it all sounds very encouraging. But it also seems that finding teachers can be a tough issue. Not just great teachers who can engage a class of bright minds, but those who are also trained to code and can present that in a way that would be inclusive and exciting.
Many schools have their work cut out for them getting their kids to achieve in English, maths and history. When I was at school, passionately pursuing art and dance (things change – unless I can think of a fun way to choreograph a SQL injection), my favourite subjects seemed a little ‘b-stream’. You were required to get your As in the core subjects and the rest was probably cake. That’s a personal experience of course, and not representative of all. But is coding set to be one of those less important subjects?
None of this is setting out to be critical of teachers or parents; I think they are all to be admired for trying to provide as many opportunities as possible for kids. I also think that anyone who has to formulate a curriculum has a tough job balancing kids’ needs, abilities, parents and politics.
Beyond this is also the question of ethics (more on this in a separate column for ethics fans). Playing with Siri, Cortana and Google would have pretty much blown my tiny mind as a nipper. But, at the pace at which kids get used to things, I reckon it would also become pretty normal before long.
So where would the topic of ethics and technology be best placed? Should there be a primer for ‘you and your robot’ in social and health education? Or does it need to be introduced in the IT lab?
I am also aware that when I have worked on kids’ STEM events, these are the kids who want to go; they are already sort of primed to want to work with code. Or their parents are in a place where they hear about these events and are in a position to take them there. Not all of these conditions happen at the same time, and it’s important to recognise that yes, I’ve seen kids code, but those were the ones with the opportunity and inclination to do so. This is a bit different to trying to get kids in general to learn about the ethics of AI as part of double maths on a Wednesday afternoon.
Thinking critically about what Siri tells us and how
Talking to parents and teachers, they have told me that when it comes to AI, and especially AI assistants, kids love this stuff and it’s fun to do. But that doesn’t mean for a second that they understand how it works or think about it critically.
Fair enough, I’ve been utterly stumped by a seven-year-old with a weird proclivity for physics a few times, but I don’t really expect 11–16-year-olds, or younger, to be too critical in the way they think about the technology they use – or, when they ask an electronic assistant a question, to wonder where the answers come from.
Beyond this, understanding or thinking about which search results come up top and why, or whether autocomplete answers are culturally problematic – I would be pleasantly surprised by kids opening that particular box of tricks.
Maybe I underestimate kids. Maybe they should spend more time playing outside and not worry about this until they are older. But when it is suggested that AI could be taught in schools, I still feel sceptical about how that would work, and I have yet to meet a kid under 16 who would prefer to build her own neural network than use one that was ready-made and did something cool.
Maybe it’s just a good idea to show kids how it can be used, and then offer open doors to creating their own someday, so they know that the future can be theirs instead of run by the big tech companies. I’ll keep thinking.
Thanks for reading.