TL;DR

Universities must move from squeezing AI tools into existing practices towards AI-first workforce planning, with clear AI competency expectations for every role, argues Janice Kay, former Provost at the University of Exeter. The University of Exeter’s Digital Strategy assessment achieved over 41% staff engagement (778 self-assessments), establishing a baseline for building digital confidence. Successful implementation requires role-specific skills development, the identification of AI leaders, and systems built into workflows rather than bolted on.

From Piecemeal Adaptation to Strategic Transformation

The Office for Students recently encouraged universities to make “bold changes” in a challenging environment, yet the sector continues to squeeze AI tools into existing practices rather than fundamentally rethinking teaching, student support, and assessment. Janice Kay, Director at Higher Futures and former Provost at the University of Exeter, argues that institutions need a deliberate, AI-first approach to workforce planning.

Every role should carry clear AI competence expectations, with staff supported to achieve them. This means embedding AI capability as a core institutional priority rather than an afterthought, and accepting that some traditional roles will change dramatically or disappear through automation. The starting point is understanding current capabilities: not everyone needs to become a data scientist, but staff should grasp AI basics, the capabilities and limitations of large language models, analytics functionality, and prompt engineering from simple to sophisticated requests.

Role-Specific Skills and Development Infrastructure

The University of Exeter’s Digital Strategy assessment demonstrates practical implementation: over 41% staff engagement produced 778 self-assessments, establishing a foundation for building digital confidence. However, the exercise also reveals the need for specificity: the skills educators require differ from those needed by programme administrators or student welfare advisors.

Development programmes must address role-specific requirements. Educators might learn automated feedback tools, discussion forum analysis, or student engagement prediction systems. Institutions should incentivise skills development through micro-credentials, workload allocation, and promotion criteria, whilst providing time for experimentation and learning. AI proficiency should be integral to each role, not an optional extra.

Institutions must intentionally develop AI leaders—academics and professional staff who critically evaluate technologies and embed them ethically with pedagogical soundness and discipline specificity. AI fluency requires combining technical knowledge with learning science, assessment integrity, and data ethics.

Structural Integration Over Bolt-On Solutions

The transformation demands structural change where AI systems integrate into academic and student workflows rather than being added peripherally. The Kortext-Saïd Business School partnership exemplifies this approach, embedding AI assistants directly into virtual learning environments to reshape module design, materials, and assessments.

Mark Bramwell, Saïd Business School CDIO, describes the partnership as “empowering our faculty and learning designers to create smarter, data-driven courses and giving our students a more adaptive, hyper-personalised and engaging learning environment.” Such bold partnerships enhance teaching whilst building workforce AI skills and confidence.

Looking Forward

The question shifts from whether to adopt AI technologies to how adoption occurs, particularly regarding workforce transformation. Moving beyond broad debates requires institutions to establish competency baselines, create comprehensive development programmes, identify and empower AI leaders, and pursue structural integration that embeds capabilities throughout organisations rather than treating AI as supplementary tooling for unchanged practices.


Source: HEPI
