How We Built an AI Learning Assistant – Approved by Teachers

Jelena Matecic

Textbooks are great, but let’s be honest – sometimes students need a study buddy with a sense of humor and a knack for explaining photosynthesis. This is where our AI assistant enters the scene.

Textbooks are full of rich knowledge, but let’s face it: students often miss the good stuff. Important facts get buried in small print, side notes, or skipped pages, and curiosity can fade fast.

That got us thinking: what if an AI assistant could make learning more interactive, personal, and fun?

In this article, I’ll share how the AI Base Engineering team at Infobip – of which I’m a member – built a prototype AI tutor for biology. I’ll explain why we chose this subject, how we tested it, and the key lessons we learned along the way.

Spoiler: it’s not about replacing teachers – it’s about helping students learn in new and meaningful ways.

So… Why an AI study buddy?

Our challenge was simple but ambitious: make textbook content more accessible, engaging, and curiosity-driven.

Biology was the perfect testing ground: it’s well-structured, widely taught, and available in digital form. For our prototype, we used official Croatian school textbooks, spanning 7th grade through the second year of high school.

The goal? To support every kind of learner – those falling behind, those racing ahead, and everyone in between. The AI assistant acts like a responsive study buddy: highlighting overlooked facts, answering questions from verified sources, and adapting explanations to each student’s level of understanding.

And just to be clear: this was never about replacing teachers. Our vision is to help students learn and engage more deeply, while keeping teachers central to the process.

Our AI tutor explains, not just defines

To build a reliable assistant, we grounded everything in the curriculum. Using trusted digital textbook content, we crafted precise prompts to guide the assistant toward clarity, simplicity, and curiosity-driven learning.

One of our favorite tactics? The assistant doesn’t just dump definitions. Instead, it might explain photosynthesis like this:

Plants are like little factories; sunlight turns water and air into sugar. Do you want to know more about how that happens?

We also trained the assistant to ask questions back – a Socratic approach that encourages critical thinking. It doesn’t just answer; it engages.
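We haven’t published our production prompts, but the approach above can be sketched as a system prompt that bakes in the grounding, tone, and question-back behavior. This is a minimal illustration (the wording and the `build_messages` helper are mine, not Infobip’s):

```python
# Illustrative sketch of a curiosity-driven tutor prompt.
# The prompt text here is an assumption, not the production prompt.

def build_messages(question: str, grade_level: str = "7th grade") -> list[dict]:
    """Assemble a chat-style message list for an LLM call."""
    system_prompt = (
        f"You are a friendly biology tutor for Croatian students at the "
        f"{grade_level} level. Explain concepts with simple, everyday "
        "metaphors before giving formal definitions. Answer only from the "
        "textbook excerpts provided. End each answer with one short "
        "follow-up question that invites the student to think further."
    )
    return [
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": question},
    ]

messages = build_messages("What is photosynthesis?")
```

The key design choice is that the follow-up question is part of the assistant’s standing instructions, so every answer ends with an invitation to keep exploring rather than a dead end.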

We taught our AI to talk the talk

Designing the assistant’s tone in Croatian was no small feat. The language includes formality distinctions and gendered grammar, so we had to strike a delicate balance: friendly, but not too casual; professional, but not robotic.

We also taught it to respond to tricky situations – from inappropriate language to sensitive topics like human reproduction – with calm professionalism and respect. When students pushed boundaries, the assistant didn’t scold; it simply guided them back toward curious, respectful inquiry.

To meet students where they already are, we brought the assistant to WhatsApp. And with Infobip’s Voice API, they can ask questions or get answers as voice messages. The result? A judgment-free, always-available biology buddy – just a tap (or a voice note) away.

When AI gets creative (and sometimes WRONG)

Let’s address the elephant in the room: hallucinations.

Like all LLMs, ours sometimes got a bit too creative. Ask for an example? It might cheerfully invent one from thin air. Say hi? You could end up with a TED Talk on evolution. Ask who’s stronger, a lion or a wolf? You might get a philosophical journey through mammal diets, fur types, and migration patterns.

These hallucinations were part of the process – and even charming at times – but accuracy is essential in education. To fix this, we refined our prompts and curated the assistant’s knowledge base more tightly. Hallucinations might not disappear entirely, but we learned how to keep the assistant on track.

After all, when a student asks about mitosis, they shouldn’t end up hearing about whales.

Test drive, phase by phase: first staff, then students!

Phase 1: Internal Pilot

Our first testers were internal education and tech staff. They knew what to look for and how to break things. Their feedback helped iron out glitches and set a strong foundation.

Phase 2: Teacher Feedback

Next, we brought in real teachers. They tested the assistant against real student questions. Could it explain clearly? Did it stay age-appropriate? Was it pedagogically sound?

The feedback surprised us in a good way. Teachers appreciated the assistant’s thoroughness. When students asked if they could use the assistant during tests, it responded with integrity:

That wouldn’t be correct. But I can help you prepare by giving you 10 questions and evaluating your answers.

Not hardcoded, just good training.

Phase 3: Student Trials

Finally, students tried the assistant in a classroom setting, treating it like a study buddy – asking it to quiz them or explain tricky terms. The results? Excited engagement.

They loved the follow-up questions that kept the conversation going. They liked the longer answers.

The only complaint? Voice messages sounded robotic! And yes, it sometimes read formatting symbols out loud (literally saying “star” instead of bolding the text).
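The “star” problem has a simple shape: the LLM emits markdown, and the speech synthesizer reads the markers verbatim. A small sanitizer between the two fixes it. This is a hedged sketch, not Infobip’s actual pipeline:

```python
import re

# Strip markdown markers from an answer before handing it to
# text-to-speech, so symbols like ** are not read aloud as "star".
def strip_markdown_for_tts(text: str) -> str:
    text = re.sub(r"\*\*(.+?)\*\*", r"\1", text)         # **bold** -> bold
    text = re.sub(r"\*(.+?)\*", r"\1", text)             # *italic* -> italic
    text = re.sub(r"^#{1,6}\s*", "", text, flags=re.M)   # heading markers
    text = re.sub(r"^\s*[-*]\s+", "", text, flags=re.M)  # bullet markers
    return text

print(strip_markdown_for_tts("**Mitosis** is *cell division*."))
# -> Mitosis is cell division.
```

The same idea extends to tables, links, and emoji – anything that reads well on a screen but badly in a voice note.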

How AI Can Help Students Learn and Engage

Here’s what we saw, again and again: AI can help students learn and engage by providing:

  • Instant help – Students can ask questions privately, anytime, without fear of judgment.
  • Personalized explanations – If one metaphor doesn’t work, the assistant tries another.
  • Active learning – With questions like “Can you think of household acids?”, the assistant nudges students to connect concepts to real life.
  • A safe space – For shy students, the AI is a no-pressure place to be curious.

Notably, the assistant always encouraged students to verify with their teacher and the textbook. Teachers remain the core of the classroom, and the assistant is just that: an assistant.

Lessons Learned and the Road Ahead

This project started with a simple goal: help students get unstuck. Along the way, it became a deeper exploration of what AI can do in education. What we discovered is this: with careful design and clear boundaries, AI can enhance learning and engagement – complementing, not replacing, human teaching.

Success comes down to the details: prompt phrasing, tone, voice, UX, and content quality. Teacher and student feedback proved invaluable, showing how much students respond when learning feels personal, responsive, and judgment-free.

Next steps? We’ll improve voice UX, expand to new subjects, and keep gathering feedback to make the experience even better.

And to educators and tech innovators alike: building an AI assistant isn’t just a coding exercise; it’s a collaborative effort between tech and teaching. Done right, it becomes more than a tool – it becomes a trusted companion in the learning journey.

And if one more student walks away thinking, “Hey, biology is kinda cool,” then we know we’ve done something right.
