AI in Education: Implications & Opportunities for Schools
As with all Artificial Intelligence (AI), the data used to train Generative AI models and the choices made by those developing them directly impact their output. This talk explains some technical aspects of Large Language Models such as ChatGPT to a non-technical audience. This elementary technical knowledge allows you to form an opinion on whether ChatGPT ‘understands’ anything, whether it plagiarises, and how reliable it is. I will also show that AI is cyclical, and that boom is sometimes followed by bust.
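One technical aspect worth grasping is that models like ChatGPT generate text by next-token prediction: given the text so far, the model assigns a probability to each possible continuation and samples one. The toy sketch below (a simple bigram lookup table, emphatically not a real LLM, with invented counts for illustration) shows the same generation loop in miniature:

```python
import random

# Toy "language model": for each word, counts of the words observed to follow it.
# A real LLM learns billions of parameters rather than a lookup table, but the
# generate-one-token-at-a-time loop is conceptually the same.
bigram_counts = {
    "the": {"cat": 3, "dog": 1},
    "cat": {"sat": 2, "ran": 1},
    "sat": {"on": 3},
    "on": {"the": 2},
    "dog": {"ran": 2},
    "ran": {"away": 1},
}

def next_word(word, rng):
    """Sample the next word in proportion to how often it followed `word`."""
    options = bigram_counts.get(word)
    if not options:
        return None  # no known continuation; stop generating
    words = list(options)
    weights = [options[w] for w in words]
    return rng.choices(words, weights=weights, k=1)[0]

def generate(start, max_words=8, seed=0):
    """Generate a short word sequence, one sampled word at a time."""
    rng = random.Random(seed)
    out = [start]
    while len(out) < max_words:
        word = next_word(out[-1], rng)
        if word is None:
            break
        out.append(word)
    return " ".join(out)

print(generate("the"))
```

Nothing here ‘understands’ cats or dogs; the model only reproduces statistical patterns in its training data, which is precisely why questions about understanding, plagiarism and reliability arise.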
A parallel can be drawn with the emergence of search engines more than two decades ago. They didn’t by themselves make us smarter, but they did give us quicker access to information. The human skill lies in sifting through that information, choosing the right sources and rejecting what is irrelevant, not merely in using the tools.
I’d say we have a moral obligation to learn as much as we can as quickly as possible, as the technology is accelerating rapidly and we’re liable to get left behind unless we’re on the front foot. For that potential to be realised, we – the government, our schools, colleges and universities – need to understand those opportunities, as well as the real risks new technology brings. Furthermore, the introduction of AI presents an array of new vulnerabilities, making the protection of sensitive data, resources, staff and learners a priority. The excitement and allure of AI-generated content should not overshadow the importance of safeguarding personal and sensitive information. As prudent institutions within education, we should exercise caution over the data that is input into generative AI tools. One of the more promising opportunities afforded by AI integration, not just in education but across all its facets, is its potential to alleviate workload burden.
Developing and honing these soft skills will be crucial for individuals to thrive in the evolving job market, where automation and AI technologies are reshaping the nature of work. Several natural language processing AI models have come to prominence in recent months, such as generative AIs like ChatGPT. These models demonstrate a huge step forward in accessible AI, which will develop substantially and quickly, likely growing to become something we use frequently in our everyday lives. This may be appropriate in two circumstances: either generative AI is incapable of answering the question, or you actively want candidates to use it in their research to improve their submission. A digital environment, rather than paper, is natively where a candidate would use generative AI and other tools. While you may be doing that already, our flexible digital assessment ecosystem doesn’t constrain you to just coursework or take-home assignments.
Jisc has also recently released a primer on generative AI, which staff and students may find useful to consult. As a final thought, two other considerations arise when considering how generative AI could fit within your assessment practice.

Professor Adam A. Stokes is a Full Professor and Chair of Bioinspired Engineering in The School of Engineering at The University of Edinburgh. He is the Co-Lead of The National Robotarium, the UK centre of excellence in robotics, and Deputy Director of the Edinburgh Centre for Robotics. Before joining the faculty at Edinburgh, he was a Fellow in the George M. Whitesides group at Harvard University, one of the most innovative and entrepreneurial labs in the world. He is enthusiastic about translating innovation out of the lab and into people’s lives.
Students seek clearer guidelines to understand when this line is crossed. Where use of generative AI would be counterproductive to the aims of your assessment, you can prohibit the use of AI without having to return to the labour-intensive complexities of pen and paper. Inspera allows you to run your assessment in your context and then layer in levels of integrity as appropriate. This can include a lockdown browser, proctoring ranging from screen-only to full audio/visual monitoring, and the use of similarity and AI-detection capabilities. Importantly, AI isn’t deciding what is and is not permitted; it is there to assist you in deciding.
As long as we approach the use of generative AI in education with a thoughtful and nuanced perspective, it has the potential to revolutionise the learning experience. Our work at Cambridge English is built on a communicative approach to language learning and assessment. We are committed to using technology to enhance, and not limit, this approach. This means that we need to take account of the social and emotional aspects of learning, and that genAI (and AI more broadly) needs to support the development of these aspects of learning, as well as the acquisition of knowledge and skills. UNESCO held its first global meeting of Ministers of Education to explore the immediate as well as far-reaching opportunities, challenges and risks that AI applications pose to education systems. Over 40 Ministers shared their policy approaches and plans on how best to integrate these tools into education.
Smartphones use machine learning to provide timely, situation-relevant services. Real-time video communication tools are exploring AI to enhance the experience, and AI is even being built into autonomous cars. These are all forms of artificial intelligence, some more obvious than others. In the near future, there will be very few tasks that professionals who work with computers will do without consulting an intelligent machine. But universities and colleges still need a legitimate way to verify student learning.
And copying and pasting content is never acceptable; students should already know this but it won’t hurt to remind them in the context of AI generated material. Having these conversations with your students will also provide you with a great opportunity to turn the classroom into a collaborative learning space. In admitting we’re not experts in generative AI, we show our students that it’s okay not to know everything and that learning is a continuous process.
We will consider both how AI has been used over the past few years and the impact that the rise of GAI may have on education and the wider society. The Socratic method of inquiry in education is a form of cooperative debate between individuals, predicated on asking and answering questions to stimulate critical thinking. In this adapted use-case, the generative AI becomes a participant in the discussion alongside a small group of students (optimally three). The students are directed to interrogate the AI on a specific topic (for example, the concept of entropy) until they can identify an error in the responses, which they then highlight and discuss with an instructor (tutor or lecturer). Rather than finding answers, this activity frames the AI as an unreliable participant, and students are primed to search for errors (to proofread, rather than blindly generate content). Building on this idea, we designed an asynchronous, ChatGPT-assisted code review process for software engineering students at The University of Melbourne.
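The source does not describe how the Melbourne review pipeline is implemented, but the general pattern – wrap each student submission in a structured prompt and queue it for asynchronous review by an LLM – can be sketched as follows. The function names and prompt wording here are hypothetical illustrations, not the actual system:

```python
# Hypothetical sketch of the prompt-construction step in an asynchronous,
# LLM-assisted code review. All names and wording are illustrative assumptions.

REVIEW_TEMPLATE = """You are reviewing a student's code submission.
Comment on correctness, readability and style, but do not rewrite the code.
Frame at least one observation as a question the student must answer.

File: {filename}

{code}
"""

def build_review_prompt(filename: str, code: str) -> str:
    """Assemble the review prompt for one submission."""
    return REVIEW_TEMPLATE.format(filename=filename, code=code)

def enqueue_reviews(submissions):
    """Build prompts for a batch of (filename, code) pairs.

    In a real asynchronous pipeline these prompts would be posted to a job
    queue and sent to an LLM API by a background worker, with the responses
    returned to students later; here we only construct the prompts.
    """
    return [build_review_prompt(name, code) for name, code in submissions]

prompts = enqueue_reviews([("fizzbuzz.py", "for i in range(1, 16):\n    print(i)")])
print(prompts[0])
```

Note the design choice echoed from the Socratic activity above: the prompt instructs the model to ask questions rather than supply fixes, keeping the student, not the AI, responsible for the final code.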
AI in schools: what are the risks to teachers?
We would also like to hear your views on where using it could benefit education, and about the risks and challenges of using it. In each episode, our thought leaders and sector influencers will delve into the most pressing issues facing the FE sector, offering their insights and analysis on the latest news, trends and developments. We are experimenting with Artificial Intelligence to make our exclusive articles even more accessible while also automating the process for our team of project managers. That needs to come from the university, but we can also do that in faculties and in individual courses and modules. In the broader information environment, there will be a steep rise in deepfakes, scams, pranks, political mischief and spammy internet content.
Nonetheless, generative AI is here to stay and we cannot avoid or delay adapting our approaches to accommodate it. This starts with the conversations we have with our students on how they can use AI responsibly in their studies. Foteini Spingou is an Education Adviser for the Faculty of Sciences at the University of York and an Honorary Research Fellow at the School of History, Classics and Archaeology at the University of Edinburgh.
- Adjusting learning based on an individual student’s particular needs has been a priority for educators for years, but AI will allow a level of differentiation that’s impossible for teachers who have to manage 30 students in each class.
- Most forms of generative AI, including ChatGPT, are trained using vast amounts of data from the internet and other – unspecified – sources.
- Students will find it really difficult to distinguish between what problems are good to solve quickly with AI, and which problems are more valuable to solve themselves.
- While generative AI isn’t a new concept, recent breakthroughs and public accessibility to this technology have opened doors for the general public to harness the capabilities of AI-generated content.
Second, timetabling needs to adjust to the ways in which students engage with technology. Shifting timetabling away from blocks of time will be very difficult to do but something has to give. What disciplines can be learned in a shorter time because of generative AI platforms?
Finally, we will consider the notion of originality in a post-ChatGPT world and how we will need to revise our standards for what counts as original in research as well as in student work. Just as calculators changed the way we calculate, Large Language Models will change the way we write. Students and educators alike must embrace perpetual learning in this dynamic AI era. Lifelong learning thrives as AI delivers tailored content, fuels curiosity, and aids skill growth. In a tech-driven world, nurturing teacher-student connections is crucial. AI offers insights into progress and preferences, enabling educators to personalise interactions.