AI is increasingly prevalent in daily life, including in education. Educators need to embrace AI, teach students foundational AI skills, and engage in discussions about its ethical use and potential impact on society.
It’s hard to navigate the modern world without encountering some form of AI countless times a day. Siri, Alexa, facial recognition, GPS, and voice-to-text are tangible examples of AI that cell phone users rely on to stay productive at work and in touch with friends and family. Other forms, like the systems that create insurance policies and set prices based on data such as age, gender, and ethnicity, are less noticeable, but they are there nonetheless. AI is now helping healthcare providers screen patients into and out of treatment based on data. “It lets you know whether a tumor is benign or malignant,” says Kiki Huckaby, Chief Impact Officer at MindSpark, a non-profit that offers teachers professional development on integrating AI in their classrooms.
The Office of Educational Technology defines AI as “automation based on associations.” Its May 2023 report, called Artificial Intelligence and the Future of Teaching and Learning, states that when computers automate reasoning based on associations in data (or associations deduced from expert knowledge), two shifts occur that move computing beyond conventional EdTech. In the first shift, computers don’t merely capture data but begin to detect patterns in that data. The second shift occurs when instead of providing access to instructional resources, the technology automates decisions about instruction and other educational processes.
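The two shifts the report describes can be made concrete with a toy example (not from the report; the spam/ham training data and word-counting approach here are illustrative assumptions). The program is never given a rule; it detects associations between words and labels in example data, then uses those associations to automate a decision:

```python
from collections import Counter

# Hypothetical training data: short messages a human has already labeled.
examples = [
    ("win a free prize now", "spam"),
    ("free money click now", "spam"),
    ("meeting moved to noon", "ham"),
    ("see you at the meeting", "ham"),
]

# Shift 1: the computer doesn't just store the data -- it detects a
# pattern, here by counting how often each word co-occurs with each label.
counts = {"spam": Counter(), "ham": Counter()}
for text, label in examples:
    counts[label].update(text.split())

# Shift 2: the technology automates a decision -- a new message is
# labeled according to which associations it matches most strongly.
def classify(text):
    scores = {label: sum(c[w] for w in text.split())
              for label, c in counts.items()}
    return max(scores, key=scores.get)

print(classify("claim your free prize"))  # words associated with "spam"
```

Real systems use far richer statistics, but the principle — reasoning automated from associations found in data rather than from hand-written rules — is the same one the report describes.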
AI is no longer futuristic, says Huckaby. Seventy percent of industries are using AI, which means education has to embed it in the curriculum, too. “Artificial Intelligence can seem scary to people, but it’s important that teachers embrace it and think about how they can use it for their own benefit and also how they can expose their students to it, no matter what grade or content they teach,” says Huckaby. “The workforce is dictating the need for it.”
Teaching Students Foundational Skills
Huckaby encourages teachers to view AI as a tool and find ways to help students use it as a thought partner rather than using it to just write an essay. Students are naturally curious and will explore on their own. It’s better that they explore in the classroom under the supervision of a teacher who can lead a conversation on the technology and guide them down paths that result in positive outcomes.
The schools that MindSpark works with don’t just use AI; they also think about how it can improve society. For example, a teacher may present students with a real-life problem in the community and allow them to explore ways they can use AI to solve it. Huckaby says that schools are embedding it into their classrooms because they are being forced to talk about it.
Most recently, schools are reaching out to Huckaby and her colleagues at MindSpark seeking ideas on how they can tell if their students are using ChatGPT to write essays, and how they can frame the discussion about such behavior in a positive manner. “Ultimately we need to teach students foundational skills (while embedding AI into the classroom),” says Huckaby.
Currently, a number of websites, www.tutoreva.com being one, offer tutoring services 24 hours a day. The AI on one such site is so advanced that it recognizes the numbers and math symbols in a handwritten equation held up to a device’s camera and returns a step-by-step procedure for solving the problem. “That’s a real concern in the math arena because now a student can show their work very tangibly, but is not actually doing the work,” says Huckaby. On the flip side, these tutoring websites can serve as resources when a student is working on a problem at home and cannot reach the teacher.
The End of Coding?
In the past 10 to 15 years, school districts have stressed the importance of learning how to code through their STEM programs. “Every student needs to know how to code and foundational block coding. There’s a million programs and things that exist out there for coding,” says Huckaby. But AI, she speculates, will render coding irrelevant in the next three years. “AI can code. You can prompt and engineer it to create code for you,” says Huckaby. Still, code generated by AI is far from perfect; it must go through rigorous quality assurance and be scrutinized by human reviewers and debuggers.
As the need for software coders and programmers diminishes, opportunities for those who can program and engineer AI will spike. “The jobs would stay in the same realm, but they will not be as hard-skill specific,” says Huckaby. As AI becomes even more ubiquitous, the policy side of AI will evolve, laying the groundwork for a whole new set of jobs. The Biden administration has put some regulations in place, but, Huckaby asks, how will individuals be upskilled in the policy realm?
Huckaby envisions a time when AI technologies will be regulated, much the way the FDA regulates food and drugs, and that time can’t come soon enough for her. “When we work with educators our focus is how we use AI in a positive way, but I’ve seen a data visualization of predicted fake tweets that have been put out there by AI bots and what are real tweets put out by people. It’s alarming,” says Huckaby.
Proceed with Caution
Teachers, students, and parents all have a responsibility regarding the ethical use of AI. Parents must be aware of the data their children are accessing through the websites and apps they’re using, and of course, what type of data is being collected from them and how it’s being used. Teachers have a responsibility to discuss the ethics of AI and engage their students in conversations about it, asking questions such as: Where is AI going, and how can it help advance society in a positive manner? “Every single industry in the next couple of years is going to use AI,” says Huckaby. The teacher’s role will be to upskill students so they’re curious, explore questions, and understand the foundational pieces so they can make smarter decisions. “There’s a lot of AI out there that is really biased. We know that it’s biased, and it’s still being used in systems. How do we create critical consumers and producers of information that are also diverse?” asks Huckaby. Tech in general is not a very diverse field, and if AI is being used by the entire population, she says, then it needs to be representative of the entire population.