Back in October 2024, I wrote on these pages about a group of 12-year-olds who had figured out an ingenious shortcut to finish their homework: 40% ChatGPT, 40% Google, and 20% their own words. At first, it looked like cheating. But with the benefit of distance, I think it was something else entirely.


These children had understood a basic truth: in today's world, what matters most is the result. Not the process. Not the effort. Ask the right question and let the machine find the answer. Don't worry too much about what happens in between. This way of thinking isn't limited to schoolwork anymore; it is showing up in the way digital systems are being built the world over, India included.

Over the last few years, India has quietly built one of the most impressive pieces of public digital infrastructure anywhere. Aadhaar, UPI, DigiLocker, CoWIN, Bhashini, and ONDC, collectively called IndiaStack, are now used by hundreds of millions of people. They help people prove their identity, send money, download documents, get vaccinated, translate languages, and access other public services.

But here’s what makes India’s system different from those in most other countries: it doesn’t keep your data.

In countries like the United States or across Europe, tech companies track what people do online. Every search, every click, every purchase is saved and studied. That information is then used to target ads, recommend content, and even train artificial intelligence systems. That is why The New York Times is now suing OpenAI, the maker of ChatGPT, alleging that its news articles were used to train a system without permission.

This is why regulators in Europe are telling Meta (which owns Facebook, Instagram and WhatsApp) not to use user data to train AI unless people clearly agree to it.

In India, the rules, and the values, are different. The digital systems here were built with public money and designed to serve everyone, not to spy on anyone. They were created to work quickly, fairly, and without remembering too much.

Take Aadhaar. It is built to do one thing: confirm that a person is who they claim to be. It is not designed to track where you go.

Or DigiLocker. It doesn’t keep copies of your CBSE marksheets, PAN cards, or insurance papers. It simply fetches these documents from the source when you ask. That’s all. It’s a messenger, not a filing cabinet.

UPI moves money between people. But it doesn’t remember what you spent it on.

Long story short, these systems were built to function like light switches. They work when needed, and switch off when the job is done. The builders insisted that these systems not hold on to your personal information for longer than necessary.

That’s why India’s digital model is being noticed around the world. It’s open, fair, and inclusive. But now, with the rise of artificial intelligence, a new kind of problem is emerging.

AI tools need a lot of data to learn how to speak, listen, write, or make decisions. In India, companies are beginning to train AI using public systems. Language tools learn from Bhashini. Health startups are using patterns from CoWIN to build diagnostic tools. Fintech firms are using transaction frameworks to refine how they give loans. This isn’t illegal. It was even expected. These public systems were built to encourage innovation.

But here’s the problem: the public helped create these systems, and now private companies are using them to build powerful new tools—and may be making money from them. Yet the public might not see any of that value coming back.

This is where the story of the 12-year-olds we started with becomes relevant again.


Just like those students who used machines to do most of the work, there’s now a larger system that is also skipping the middle part. People provide the inputs—documents, payments, identities. The machines learn from them. And then private players build services or products that benefit from all that learning. The people who made it possible? They are left out of the conversation.

In other countries, the debate is about privacy. In India, the debate must now shift to fairness.

It’s not about stopping AI. It’s not about banning companies from using public tools. It’s about asking for transparency and accountability.

If a company is using data or tools from public systems to train its AI, it should say so clearly. If it benefits from public data, it should give something back—like sharing improved datasets, or allowing its models to be audited. If it’s building a commercial product on public infrastructure, it should explain how it used that infrastructure in the first place.


This is not regulation for the sake of it. It’s basic respect for the public that made the system possible in the first place.

India’s digital platforms were built to serve everyone. They were not designed to store people’s information, and that’s what makes them special. But that openness can be misused if those who build on top of it forget where their foundations came from.

It’s easy to be dazzled by AI. But intelligence—human or machine—shouldn’t come without responsibility.

So here’s the question worth asking: If the public built the digital tools, used them, trusted them, and helped them grow—why aren’t they part of the rewards that artificial intelligence is now creating?


