The final report has now been published by the Department for Education (DfE) and can be found here.
The EPC’s Education, Employability and Skills Committee responded to the Department for Education (DfE) position on generative AI in education and its subsequent call for evidence. The response was informed by a full member consultation – led by Manish Malik (Canterbury Christ Church University) and Paul Greening (Coventry University) – to enable the EPC to draw together evidence from across the Engineering Academics Network and to remind the DfE that universities are leading the way in the development of generative AI in education.
We highlighted evidence from the network on the ways AI is used in Engineering higher education; the positive outcomes of greater use and sophistication of generative AI; ways in which the DfE could support the possibilities; and challenges that need efforts from all involved.
In summary
We would expect to see the following positive outcomes as a result of the greater use and sophistication of generative AI:
- An increase in productivity for staff and students, which would extend to the future workforce. This means companies would expect expert use of these systems and better outcomes from their employees, with implications for higher education institutions and lifelong learning pathways. For example, staff may use AI for assessment work, may even automate vivas, and may instead take on the role of ensuring that interactions between learners and the systems are meaningful and that students are actually learning, by acting as the human in the loop.
- A beneficial effect of the above could be to improve Just, Equitable, Diverse and Inclusive practices in higher engineering education – for example, more compassionate, relatable and accessible feedback, content and assessments.
- Better decision-making support for staff and students, and better student outcomes (retention, progression, closing of attainment gaps, etc.) as Learning Analytics and AI come together.
The DfE could support the investigation of the possibilities and benefits of:
- An increase in personalised learning opportunities with AI chatbots and AI tutors. Students in future may learn and be assessed in different settings from those they currently experience. Personalised tests may become more common, but care should be taken to ensure they are equitable and assess the learning outcomes for which the course is designed. Staff must play the human-in-the-loop role here to ensure this is what is happening.
- AI in design tools. AI is increasingly present in many design tools, and this may hinder the development of critical skills needed by engineers. Provided engineers have a full appreciation of the design, its underlying theories and the limits of the tools, however, its use should be encouraged. In fact, many virtual and simulation tools may become a useful source of data for AI systems to model human interactions with the machines these tools simulate.
- An increase in support for the development of self-, co- and shared regulation of learning, with AI-led orchestration of individual, pair and small-group dynamics and learning within engineering education. If AI is tapped into correctly, it could add to what has already been achieved and extend it to many other scenarios. Staff’s role here would be to set up such a system to handle what they would normally focus on when orchestrating class interactions without the technology, freeing them to do what the technology cannot.
Challenges that need efforts from all involved:
- Ensuring the data used in such systems is of good quality and free from bias.
- Addressing data and privacy concerns, and ensuring AI systems are secure so they are not misused.
- Staff and students need to improve their awareness of what behaviours, skills and knowledge will be in demand for their current and future jobs, respectively.
- AI raises concerns about academic integrity, but these can be mitigated, as mentioned above, by AI assessing students multiple times and academics always acting as humans in the loop.
- Trust in such systems can come too easily for some, owing to their seemingly magical feel, and too reluctantly for others; something needs to be done to moderate both extremes.
- The growing chasm between Humanities and Technology proponents needs to be bridged, so that systems are more ethical, sustainable and understood both in terms of what is possible and in terms of the consequences for society.