AI summit: Why education needs to be rethought for AI rollout


The recently concluded India AI Summit 2026 has raised the stakes for AI in education. The energy at the Summit was palpable, and education was a major undercurrent throughout. To properly realise the benefits of AI, we must first engage much more deeply with the challenges of education!

Sovereign AI

There is a strong focus on where data resides and how it is used for inference. Most governments are rightly concerned about this, not just for security reasons, but also to enable far more accurate and contextual uses of AI. This includes not just the models, but also the infrastructure required to operate at scale in India.

The vision is to secure data, use voice as the primary access channel, build domain-specific stacks, and establish AI as a multilingual, inclusive, responsible, omnipresent, bias-free, and contextual digital public infrastructure. The need is to build indigenous capability at every level of the stack, from hardware to software, from models to GPUs to ubiquitous agents. Interestingly, mechanisms are being built to include private copyrighted content in the training of foundational models, while protecting the revenue rights of the copyright owners. The operational levers to enhance innovation and entrepreneurship are also being put in place – AI in the curriculum, scholarships for students, local/national centres of excellence, funding for research and startups, and many others.

Safe, equitable & responsible use of AI

This is without doubt going to be a long and complex journey for policy makers, legal experts and educationists. Many countries have created their own frameworks, but going from those to practice will need hard work and changes in legal and implementation frameworks. Good starting points are more education around AI risks, the implementation of the DPDP Act, effective guardrails and stated guidelines around the use of AI.

The Bharat EduAI stack

This is something I have been espousing for a while now, even when I was providing inputs for the NEP in 2018. India needs an EDU Stack. Early work done by EkStep on DIKSHA built a foundational platform and architecture for content, and now SAMARTH is in place for education management. But beyond that, very little has been done. An EDU stack needs to focus on secure and reliable core capabilities for content, teaching, learning, assessment, grading, certification and allied activities. Planned as a digital public good, it needs to have built-in safeguards and barriers to misuse.

IIT Madras has been tasked by the Ministry of Education with building the Bharat EduAI stack. The Centre of Excellence in AI for Education has been incubated with the aim of “building sovereign, context-aware AI tools for learning and teaching, ensuring no demographic is left behind in India’s digital transformation”. Bodhan.ai will cover the learning workflow, build foundational assets, and scale the infrastructure, with a focus on widespread, long-term, evidence-based deployment to public educational institutions. It has posed more than 200 research problems, including around LLMs, language, agents and responsible AI, and is soliciting active participation in solving some of them.

Re-envisioning education systems

There was a lot of focus on how our systems need to change. Educational governance is one of the obvious areas of major change, but a deep focus is also required on how students across all disciplines get skilled in AI and prepare for a post-AI world of jobs. The ambition is to deepen our vertically integrated education system through the use and development of AI. With agentic capability increasing quickly, many aspects of administration may switch to AI, and institutions may also feel equipped to generate software platforms for their own specific needs. However, the broad consensus is that the need for these institutions will not fade – AI may be regarded as an adjunct power, rather than a replacement of any kind – with the human given precedence.

In STEAM (Science, Technology, Engineering, Arts and Mathematics) education, the government is planning to introduce 50,000 more Atal Tinkering Labs in secondary schools to foster innovation, and to roll out a computational thinking and experiential AI curriculum from Grades 3 to 12. Starting early may help bridge the knowledge and skills gap perceived at higher levels of the system.

In the creative or orange economy space, an initiative is already underway to establish AVGC Content Creator Labs across 15,000 schools and 5,000 colleges, spearheaded by the Indian Institute of Creative Technologies and supported by companies such as CODE (Centre for Originality, Design and Expression) and the Lorraine Music Academy that are building unique ecosystems of learning, teaching, working, mentoring and certifying creative talent.

In schools, the AI curriculum will be introduced from Grade 3, with an initial focus on computational thinking, progressing to active skilling in the use of AI, then to deeper thinking about AI as a career or specialisation, and subsequently linking to higher and tertiary education opportunities. Higher, professional, and vocational education institutions will include AI in their curricula and practice as well.

Beyond the hype

I think it is important to consider that in our pursuit of scale, we will also multiply our mistakes at scale. How do we let this proverbial genie out of the bottle safely?

I think it is fair to say that change is the only constant. The last gold rush to change education was in the 2010s, when MOOCs were touted as the higher education killer. Universities, it was said, would soon be extinct if learning from experts and certification became free for all. Before that we had Wikipedia, which purported to make teachers irrelevant.

AI is not just more of the same. It is disrupting domains so quickly that our education systems are being forced to adapt and re-architect their approaches to teaching, learning and assessment.

Having spent 30+ years in edTech, spanning school, higher ed, publishing, test prep, vocational, knowledge/libraries, and corporate learning, here are a few observations that we may want to consider. In fact, many of these remain the same as those I once suggested to the Planning Commission for the 12th Plan.

Scale and glocalise the narrative

We need a deeper narrative in edTech. It is not about throwing technology at our problems. It is about solving our problems with core technology as one ingredient. Digging deeper, we do not yet have a second line of depth in our analysis, or at least it is not articulated well enough.

Scaling the narrative means really appreciating the fact that there is no one single conception of a teacher, student, parent, educational system, policy maker, leader or teacher educator; no single conception of what a school or a college or a state is; no monolithic conceptions that we can target single monolithic solutions at.

For example, about 7% of our schools are single-teacher schools. Only 34.9% have functional desktops/PCs, 19.8% have a functional laptop and 26% have a functional tablet. About 57.8% (8,50,511 schools) have fewer than 100 students. As per UDISE 2015-16, between 37 and 44% of students in classes 1-8 in India scored less than 60% in their exams, figures which I daresay would still be similar in 2026. Large numbers of teachers do not have the necessary skills and qualifications, and so on. There is no single India, and therefore no single ubiquitous AI solution is possible for all needs.

We need to meet scale with scale. We have to encourage diverse attempts at solving the same thing for different stakeholders and local contexts. We must steer clear of magic wand solutions that claim to erase systemic issues being faced by us.

Building an AI story in Indian education needs to look beyond the hype of big tech and promises of technology to solve anything.

Rather, we must look deeper into our diversity, encourage local solutions and adaptations, and reward the local alongside the global/national perspectives we have. Governments can scale this change effectively if there is a multiplicity of ideas and technologies, not just one.

System in the loop

We have talked about humans in the loop, and that is vital. But have we talked about educational systems in the loop? That is, how do educational systems adapt to the insertion of edTech, and how do they accommodate edTech so that it can have optimal impact?

As Sarason stated: “It is a system with a seemingly infinite capacity to remain the same in the face of obvious inadequacies, unmet goals, and public dissatisfaction. It is a system in which accountability is so diffused that no one is accountable. It is a system that has outlived all of its reformers, and will outlive the present generation of reformers.”

Through the years we have spoken about the same set of issues that confound the implementation of edTech, somewhat tempered by the shifts we saw in tech use during COVID (which I hope will catalyse this upcoming change to some level).

For example, the issues facing teachers range from lack of motivation and capability, teaching diverse multi-grade, multi-level classrooms, administrative tasks (inside and outside school), lack of respect, and limited access to technology and bandwidth, to fragmentation, discrimination, absenteeism and so many others.

Similarly, each stakeholder, whether an official, teacher educator, policy maker, student or parent, comes with their own set of challenges. These challenges occur within the context of a system of education, whether at the state or national level, in urban or rural settings, or at large or small institutions.

Irresponsible edTech thinking attempts to paint a unified picture – the one size that fits all. No agency is given to the system or its incumbents to make choices, retain control, or influence and own the implementation. It is no different with non-edTech initiatives such as OTBA or CCE.

Just as there is a gap between policy and practice, there is a gap between edTech solutioning and on-the-ground systemic realities. So we end up building things first and thinking about their implementation at scale later.

How can we start initiatives on AI in education without rethinking what that education should be? Essentially, research in technology does not automatically imply innovation in teaching and learning. I think we have got that mixed up a bit.

Systems and capabilities need to be adapted to the upcoming changes. This is what we need to overcome first. The LLMs, AI, EduAI stack – all come a little later, or need to be designed in context.

Infrastructure

This is a serious enough problem to consider. If we expect AI to reach every classroom, we need to figure out our basic infrastructure first, and then how to scale to millions of classrooms and ~350 mn residences. And this must be reliable and of reasonable quality, given that education is mission critical. Of course, this infrastructure requires resources that go beyond our current abilities, and the impact on the environment cannot be overstated. So it is a difficult problem to solve – again, perhaps “small is beautiful” may work better as a strategy.

Research

This one piece confounds me. What is the research happening in edTech? What do we need? How many Ph.Ds do we have in edTech? Apart from the work being done by Prof. Jayashree Shinde and Prof. Vasudha Kamat at SNDTWU and others at IIT Bombay, where is the hardcore research into personalisation and assessment happening? If we have a Principal Scientific Advisor, why don’t we have a Chief Learning and Research Officer for India?

And do we think that, without Indian research on the most confounding education and technology problems, we will be able to implement AI in education? It is not about the technology. Today, teaching, learning, assessment and digital pedagogies themselves need to be rethought or enhanced to accommodate the new realities.

Community

I think we underestimate the power of individuals and communities to make a difference at scale. Some interventions, like ITforChange, have tried this successfully for some time now, and it needs to become a structured intervention at scale.

A systematic approach for the adoption of any new change has to include champions that provide local support for stakeholders aligned to national priorities. A large community of champions can include trained professionals (I proposed we identify around 170K such individuals, and maybe even revitalise the idea of an Indian Education Services cadre, first proposed by Anil Bordia, in the Civil Services), hundreds of thousands of mentors (civil society, retired experts, professionals) who could handhold the implementation on the ground.

Content

This is a difficult problem to handle. Contrary to popular edTech belief, chat conversations, audio podcasts and slides are not the be-all and end-all of content. In fact, in the age of AI, to limit ourselves to those older artifacts would be to do a disservice to the power of the new technologies. We need more interactive, versatile and reliable content. Perhaps it is time for content to be thought of as intelligent as well – the shapes this can take could actually pave the way for true personalisation. I had written in 2007 about how, through Content Equivalence, we could achieve easy morphing from one content type to another – something that is now possible with AI!

While popular technology talks about voice, multilingual, video, podcasts, and other multimedia elements, we have to talk about other and newer forms of content that were not possible to create at scale earlier – for example simulations and games, interactives, explorations and so on. Content itself must be re-imagined. Why should tech only look at recasting existing textbooks?

Personalised adaptive learning

This is the elephant in the room. The story of personalised learning is a long one, and under-appreciated by most technologists seeking to superimpose technology on education. Personalisation is the holy grail of our educational systems, which are so heavily focused on teaching to the middle. It promises to level the playing field and provide access to high-quality, technology-mediated education at scale. However, it is also the most challenging problem to solve.

The earliest approaches were around teaching machines (see Skinner, 1958 and the awesome book by Audrey Watters, Teaching Machines). Then came expert systems and the Intelligent Tutoring Systems of the 1980s, chronicled by Etienne Wenger (Artificial Intelligence and Tutoring Systems, 1987). We can also look at the work on Ms Lindquist at CMU, which tried to model actual teaching-learning interactions in Algebra. Also the work by companies like Knewton, SmartSparrow, Squirrel AI, our own Educational Initiatives’ EI Mindspark, Vedantu’s Wave 2.0 platform and a host of others in the first part of this century. These predated the GenAI and agentic age, but have a lot of learning to offer current efforts. Knewton, I remember, had over 40 different parameters to define a learner profile, and I had built similar personalisation frameworks and ideas as well.

The point I am making is that all these systems sought a deeper understanding of the interplay between content, teacher and learner, and built informed frameworks around them. The reality was that there are just too many parameters in this human endeavour of learning and teaching for there to be any one solution.
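To make the idea of such a framework concrete, here is a deliberately tiny, hypothetical sketch (none of these names or numbers come from any actual product): a learner profile with just three parameters instead of the 40+ reportedly tracked by systems like Knewton, a mastery estimate nudged after each response, and an item selector that matches difficulty to estimated mastery.

```python
from dataclasses import dataclass, field

# Hypothetical, highly simplified learner profile; real adaptive systems
# track far more dimensions (pace, misconceptions, engagement, etc.).
@dataclass
class LearnerProfile:
    mastery: float = 0.5                # estimated skill level, 0..1
    pace: float = 1.0                   # relative speed of progress
    history: list = field(default_factory=list)

def update_mastery(profile: LearnerProfile, correct: bool, lr: float = 0.2) -> None:
    """Nudge the mastery estimate toward 1 on a correct answer, toward 0 otherwise."""
    target = 1.0 if correct else 0.0
    profile.mastery += lr * (target - profile.mastery)
    profile.history.append(correct)

def pick_next_item(profile: LearnerProfile, difficulties: list[float]) -> float:
    """Choose the item whose difficulty is closest to current mastery,
    i.e. neither trivial nor out of reach."""
    return min(difficulties, key=lambda d: abs(d - profile.mastery))

p = LearnerProfile()
update_mastery(p, correct=True)          # mastery moves from 0.5 to 0.6
print(pick_next_item(p, [0.2, 0.5, 0.8]))  # selects 0.5, the closest match
```

Even this toy version hints at why the problem is hard: the update rule, the item model and the selection policy are all contestable design choices, and each real classroom adds parameters the model does not capture.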

With AI, this challenge is deeper. We have somehow succumbed to the idea that AI can solve the personalisation problem just because it can handle billions of parameters using largely consumer-grade AI. LearnLM from Google attempts to take a crack at this problem – they started with a small RCT (a randomised controlled trial with 165 students, which also included 17 human tutors who reviewed and approved AI responses before they reached the students) that showed positive results. Now it is a full-fledged service from Google.

Assessments and grading

Robograding (or automated grading) is the other use case, one that has been actively explored for a long time now (code graders and essay graders have been available for quite some time). AI promises to help automate most of the standard formats. But the real question is whether those standard formats are the ones that will stay, or whether they have to be reinvented altogether to be AI-ready. For example, teachers are struggling with the validity of homework submissions, given the ubiquity of AI tools available to students.

Capability and capacity development

No large-scale change can happen without both. As I have said before, mindsets, capabilities and enabling environments are perhaps the most important ingredients in all of this. We should surely make this a mission-mode initiative with the right budgets. With the right kind of capability and capacity, these AI initiatives can percolate deep into our education system.

Rethinking systems

While we may tinker by adding curricula, teacher training and tools, fundamental shifts in education systems often follow systemic structural shifts. So, it may be a good idea to consider those shifts. For example, in WhatIfEDU, I argued that we need to rethink our conception of educational time. For example, we divide the entire syllabus into chunks over the academic calendar and mandate even the number of periods/hours devoted to each subject. This is irrespective of the teaching and learning frictions occurring at the level of an individual classroom. In the age of AI, how will this concept of educational time be rethought by our education authorities?

Similarly, what of the idea of grouping students into classes, sections and batches, or of imposing subject boundaries? How will these need to be reconceptualised?

AI and neuroscience

What was missing from this AI Summit, I think, was a serious discussion around the singularity of humans and machines. While there was discussion of NEURODx as a technology and model (“ChatGPT for the brain”) for clinical diagnosis in mental health, I am talking more about how, in the future, our brains will be wired into the technology.

As I wrote as part of a series of articles on Medium a couple of years back: “There will be a time when our education systems will be natively intelligent — as systems that self-correct, extend, innovate, be efficient and humane — as systems that have space for human creativity, passion, and ingenuity as well as the intellectual power of artificial machines.”

What will these ecosystems look like? Consider what and who they will serve.

(Viplav Baxi is the founder of AmplifiU)
