How Learning Teams Should Approach The Next 12 Months With AI

Tips For Preparing For The Next 12 Months With AI
In the last two years, AI moved from “interesting experiment” to something your organization quietly depends on. McKinsey’s 2025 State of AI survey found that about four in five organizations now use AI in at least one business function, and over 70% regularly use generative AI in their work. Yet fewer than one-third follow most scaling best practices, and fewer than one in five track clear KPIs for generative AI solutions.
Inside learning teams, the story is similar. A recent ATD/Clarity study reports that 80% of Instructional Designers already use AI tools, and nearly two-thirds only started doing so in the past year. Clarity Consultants, a training consultancy, says AI has arrived faster than the governance, workflows, and skills needed to use it effectively.
So I don’t think the question for the next 12 months is “What’s the next big AI trend?” The better question for learning leaders is: how do we turn all this experimentation into sustained value for learners, for the business, and for the people on our teams?
Set A Clear AI Ambition For Learning, Not Just “Experiments”
Most organizations are experimenting with AI, but few are clear on what “good” looks like. McKinsey’s research shows that 88% of respondents say their organizations use AI, yet only about 39% report any impact on EBIT (earnings before interest and taxes) from those efforts. Many are stuck in pilot mode: enthusiastic trials without a clear destination.
For learning leaders, this is the year to decide what AI is for in your function. Is your primary ambition efficiency (faster course development, reduced Subject Matter Expert (SME) time), reach (serving more learners with the same headcount), quality (better personalization and practice), or a combination? High-performing organizations in the McKinsey survey don’t just chase efficiency; they explicitly link AI to growth and innovation goals as well.
In practice, this means setting two or three concrete 12-month outcomes. For example: “cut average development cycle time by 30%”, “launch AI-assisted support for your top five critical workflows”, or “upskill 70% of managers in responsible AI use.” A clear ambition gives your team permission to say “yes” to the right pilots, and “not now” to shiny distractions.
Redesign One Workflow At A Time
One of the strongest findings in McKinsey’s 2025 survey is that redesigning workflows, not just dropping AI into existing ones, is the practice most correlated with bottom-line impact. Yet only about one in five organizations that use generative AI say they’ve fundamentally redesigned even some workflows.
In learning, this is your sweet spot. Course development and maintenance are structured, repeatable processes: exactly where AI can help if you’re intentional. Start by mapping one high-value workflow end-to-end: for example, “from SME intake to published eLearning” or “from policy change to updated microlearning.” Identify every step where people currently copy-paste, reformat, summarize, or reword. Then choose a small number of AI interventions that change the flow, not just decorate it; a minimal sketch of the first option appears after the list below. That might mean using AI to:
- Turn SME interview transcripts into structured outlines.
- Generate first-draft storyboards and quiz banks.
- Propose variants for different audience segments.
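To make the first of these concrete, here is a minimal sketch that turns an SME interview transcript into a structured outline. It assumes the OpenAI Python SDK, an API key in the environment, and a hypothetical transcript file; treat it as a starting point to adapt to your organization’s approved tooling, not a finished pipeline.

```python
# Minimal sketch: SME interview transcript -> structured course outline.
# Assumes the OpenAI Python SDK (pip install openai) and OPENAI_API_KEY
# set in the environment; swap in whatever model your organization allows.
from openai import OpenAI

client = OpenAI()

def transcript_to_outline(transcript: str) -> str:
    """Ask the model for a module/lesson outline with learning objectives."""
    prompt = (
        "You are an instructional designer. From the SME interview "
        "transcript below, produce a course outline with modules, lessons, "
        "and one learning objective per lesson. Flag any claims that need "
        "SME verification.\n\nTranscript:\n" + transcript
    )
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumption: replace with your approved model
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    with open("sme_interview.txt") as f:  # hypothetical transcript file
        print(transcript_to_outline(f.read()))
```

The same pattern, with a different prompt, covers first-draft storyboards and quiz banks.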
Run one or two of these as genuine pilots with clear before/after metrics (cycle time, SME hours, revision counts). The goal for the next 12 months isn’t to “AI-ify everything”; it’s to prove, in one or two workflows, that AI plus redesign genuinely improves speed and quality.
Raise AI Fluency Across The Learning Team
AI is now part of the day-to-day toolkit for most learning professionals, but fluency is uneven. The ATD study found that while 80% of Instructional Designers use AI, many cite gaps in skills and uncertainty about how to use it appropriately. In other words, adoption has outrun confidence. As a learning leader, treat AI fluency as a core capability, not an optional experiment. Over the next year, consider three layers:
- Foundations for everyone: shared language around what AI can and can’t do, basic prompting patterns, and clear rules about data privacy, confidentiality, and bias.
- Role-specific patterns: designers need patterns for storyboarding, assessment design, and adaptation; facilitators need patterns for session design, reflection prompts, and feedback; coordinators need patterns for communication and logistics.
- Quality and ethics guardrails: simple checklists for reviewing AI outputs (accuracy, inclusivity, alignment with learning science) and clear examples of where human judgment is mandatory.
You do not need a perfect “AI academy” to start. A few short internal clinics, shared prompt libraries, and side-by-side examples can move your team from hesitant dabbling to confident, critical use.
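To illustrate, a shared prompt library can start very small: named, reusable templates with explicit placeholders, so the team draws on vetted patterns instead of improvising. The template names and wording below are illustrative assumptions, not a standard.

```python
# Minimal sketch of a shared prompt library: vetted templates with
# explicit placeholders. Names, wording, and fields are illustrative.
PROMPT_LIBRARY = {
    "storyboard_draft": (
        "Act as an instructional designer. Draft a storyboard for a "
        "{duration}-minute module on {topic} for {audience}. Include "
        "screen-by-screen text, notes on visuals, and one knowledge check."
    ),
    "reflection_prompts": (
        "Write {count} reflection questions for a workshop on {topic}, "
        "each tied to a specific on-the-job behavior."
    ),
}

def render(name: str, **fields: str) -> str:
    """Fill a template; raises KeyError if a placeholder is left unfilled."""
    return PROMPT_LIBRARY[name].format(**fields)

# Example: render("storyboard_draft", duration="10",
#                 topic="data privacy basics", audience="new hires")
```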
Share Creation With The Business, Safely
One of the biggest shifts I’m seeing is that learning content no longer starts only inside L&D. Subject Matter Experts, operational leaders, and even frontline managers now have access to the same AI tools as designers. External data shows the same pattern: a Business Insider report on recent Gallup polling found that about 23% of US workers now use AI at least a few times per week, nearly double the rate from mid-2024. Rather than fighting this, learning leaders can harness it with structure. Over the next 12 months, think about how to:
- Provide simple, AI-enhanced templates for SMEs to draft scenarios, SOP explainers, or quiz questions.
- Define what a “good enough first draft” looks like when it arrives from the business (see the sketch after this list).
- Keep L&D in charge of alignment, consistency, and pedagogy—the layers AI and SMEs can’t fully handle.
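One way to make “good enough” explicit is a lightweight check that lists gaps before a draft reaches L&D review. The field names and thresholds below are assumptions to adapt, not a recommended standard.

```python
# Minimal sketch of a "good enough first draft" check for SME-submitted
# scenarios. Field names and thresholds are assumptions; the point is
# that the bar is explicit and shared, not that these exact rules are right.
from dataclasses import dataclass, field

@dataclass
class ScenarioDraft:
    title: str
    audience: str
    situation: str                             # the situation the learner faces
    decision_points: list[str] = field(default_factory=list)
    correct_actions: list[str] = field(default_factory=list)
    source_reviewed: bool = False              # SME confirms facts are current

def readiness_gaps(draft: ScenarioDraft) -> list[str]:
    """Return a list of gaps; an empty list means ready for L&D review."""
    gaps = []
    if len(draft.situation.split()) < 40:
        gaps.append("Situation needs more context (aim for 40+ words).")
    if len(draft.decision_points) < 2:
        gaps.append("Add at least two decision points.")
    if not draft.source_reviewed:
        gaps.append("Confirm facts against the current SOP or policy.")
    return gaps
```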
This is not “crowdsourcing training” in an uncontrolled way. It is acknowledging that others will create content anyway, and giving them a safer, more effective path to do it. Done well, it shrinks the backlog, brings expertise closer to the learner, and frees your specialists to focus on the high-stakes pieces only they can design.
Evolve How You Measure Value, Beyond Completions
Executives are increasingly asking hard questions about AI investments. McKinsey’s 2025 research shows that while AI adoption is widespread, only a minority of organizations see a clear EBIT impact, and very few track specific KPIs for generative AI. Learning teams are not exempt from those expectations. For the next 12 months, I’d focus on three measurement shifts:
- From volume to capacity: track not just how many courses you shipped, but how AI changed your capacity, including cycle times, SME hours per project, number of assets maintained per designer, and responsiveness to urgent requests (a small sketch follows this list).
- From satisfaction to readiness: keep your traditional metrics (NPS, completions), but add simple indicators of time-to-competence or performance in key tasks, especially in AI-touched programs.
- From “AI usage” to AI impact: resist tracking vanity stats like “number of prompts used.” Instead, tie AI involvement to concrete outcomes: reduced update time after a policy change, faster ramp for new hires, or fewer errors in critical processes.
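As an illustration, capacity tracking can be as simple as comparing AI-assisted and traditional projects on a couple of fields. The record structure and sample numbers below are assumptions; real data would come from your own project tracker.

```python
# Minimal sketch of "volume to capacity" tracking: compare cycle time and
# SME hours for AI-assisted vs. traditional projects. The fields and the
# sample records are assumptions; pull real data from your project tracker.
from statistics import mean

projects = [
    {"name": "Onboarding refresh", "ai_assisted": False, "cycle_days": 45, "sme_hours": 20},
    {"name": "Policy microlearning", "ai_assisted": True, "cycle_days": 28, "sme_hours": 9},
    # ...more project records
]

def capacity_summary(records: list[dict], ai_assisted: bool) -> dict:
    """Average cycle time and SME hours for one group of projects."""
    subset = [r for r in records if r["ai_assisted"] == ai_assisted]
    return {
        "projects": len(subset),
        "avg_cycle_days": mean(r["cycle_days"] for r in subset),
        "avg_sme_hours": mean(r["sme_hours"] for r in subset),
    }

before = capacity_summary(projects, ai_assisted=False)
after = capacity_summary(projects, ai_assisted=True)
print(f"Cycle time: {before['avg_cycle_days']:.0f} -> {after['avg_cycle_days']:.0f} days")
print(f"SME hours:  {before['avg_sme_hours']:.0f} -> {after['avg_sme_hours']:.0f} per project")
```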
The goal is not to back your team into a corner. It’s to create a narrative you can share with senior leaders: here is how AI changed our capacity, and here is what that allowed the business to achieve.
Make Learners AI-Ready, Not AI-Dependent
While we focus on our own tools and workflows, learners’ realities are changing as well. Surveys in education and the workplace show rapid adoption of AI tools by teachers, students, and employees, often without clear guidance. TALIS 2024 data, for example, suggests roughly one in three teachers worldwide are already using AI in their work, mainly to summarize topics and create lesson plans. In higher education, multiple studies report that more than half of students and faculty have used tools like ChatGPT for academic tasks. For learning leaders, this is an opportunity and a responsibility. Over the next year, consider where you can:
- Build AI literacy into leadership, onboarding, and professional development programs.
- Model “good AI use” inside your own learning experiences (for example, showing how to verify AI output rather than presenting it as truth).
- Teach learners how to use AI for reflection, planning, and practice, not just for shortcuts.
The aim is not to police every AI interaction. It’s to help people build habits that make AI a thoughtful co-pilot instead of a crutch—so that critical thinking, ethical judgment, and domain expertise stay firmly in human hands.
Bringing It Together In The Next 12 Months With AI
If last year was about discovering what AI could do in learning, the next 12 months are about deciding what you want it to do. That means setting a clear ambition, redesigning at least one critical workflow, raising AI fluency across your team, sharing creation with the business in a controlled way, measuring what actually matters, and giving learners the skills to navigate AI themselves.
None of this requires a moonshot implementation. It does require leadership: the willingness to move beyond scattered pilots and treat AI as part of how your learning organization runs. If you can use this year to make even two or three of these shifts real, you’ll be in a much stronger position when the next wave of tools, agents, and platforms arrives, because your foundation and your people will already be ready.

LEAi by LearnExperts
Drawing on decades of experience in building training programs, LearnExperts offers LEAi, an AI-enabled tool that helps clients quickly and efficiently create learning and training content, as well as exam questions, that inform and develop skills.

