NaviCare — How a Simple Gen AI Prototype Can Help Newcomers Feel at Home


By Zara Palevani | April 2025

Course dates: March 31 – April 4, 2025

Prepared for Kaggle Host: Google

Kaggle Notebook: Link

Competition page: Link

“Human beings are members of a whole,
In creation of one essence and soul.
If one member is afflicted with pain,
Other members uneasy will remain.”
— Saadi Shirazi, 13th-century Persian poet (inscribed on the UN building)

In today’s fast-moving world, people are constantly on the move — fleeing war, seeking better opportunities, or starting over. Whether they’re refugees, international students, or recent immigrants, many arrive in unfamiliar cities with a simple question:

“Where do I go for help?”

Food banks. Legal clinics. Mental health services. Language classes. Even in a welcoming country like Canada, accessing these resources can feel overwhelming, especially if you’re navigating in a second (or third) language.

As someone who once stood in those very shoes, I asked myself:
Can Generative AI help newcomers feel less lost and more at home?

That question wasn’t just technical — it was personal.

🍁 Where This Idea Came From: A Cold Winter, a Warm Crowd

Nineteen years ago, I landed in Canada in the middle of winter. I had missed orientation week, was two weeks late for classes, and had no idea what “Frosh Week” meant. It was a lonely, snowy start.

Months later, I stumbled upon a group of newcomers being led around campus. A few stragglers at the back looked lost. Instinct kicked in — I started pointing things out, telling stories, guiding them. Before I knew it, I had my own mini-tour group following me.

Then a man tapped me on the shoulder and asked, “Whose team are you on? Why aren’t you wearing an orientation leader shirt?”
I blinked. “What team?” I had no idea what he was talking about — I was just helping.
He laughed and said, “Okay, I see what’s happening. You know you could get paid for this, right? Come see me at the Student Union Building.”

That’s how I got my first job on campus — by simply remembering what it felt like to be lost, and choosing to help someone feel found.

The Solution
My prototype, NaviCare, is a Retrieval-Augmented Generation (RAG) assistant that helps immigrants and refugees discover local services in Canada using natural language. It uses embeddings for semantic search, FAISS for fast retrieval, and Gemini for grounded response generation.

How NaviCare Works: AI for Newcomers

NaviCare is powered by Google’s Gemini API, which I prompt-engineered to deliver clear, kind, and multilingual responses—perfect for newcomers who may prefer English, French, or their native language. But what makes NaviCare special is its Retrieval-Augmented Generation (RAG) design. Using a technique called embeddings (think of it as turning text into numbers a computer can compare by meaning), NaviCare searches a curated database of Canadian community services to find the most relevant resources for your city. For example, if you ask for legal help in Vancouver, it retrieves details about MOSAIC’s free legal aid programs and feeds them to Gemini for a tailored answer. This ensures responses aren’t just generic—they’re grounded in real, local data.

Meet Maria: A Newcomer’s Journey with NaviCare

Imagine Maria, a single mother who just arrived in Calgary with her toddler and elderly father. She’s overwhelmed, juggling housing applications, doctor visits, and job searches—all in a new country. With NaviCare, Maria types, “Where can I find affordable housing in Calgary?” and instantly gets a recommendation for the Calgary Housing Company, complete with a website and phone number. Next, she asks for pediatric care, and NaviCare points her to a local clinic accepting new patients. For her father, it suggests senior-friendly community centers. In minutes, Maria has a plan—without endless Google searches or hotline wait times. NaviCare’s localized, AI-driven answers give her the confidence to settle in.

Environment Setup

This project runs entirely inside a Kaggle notebook using freely available tools. To set up the environment, I installed the necessary libraries and configured the Gemini API key using Kaggle Secrets.

!pip install -q faiss-cpu sentence-transformers

import numpy as np
import pandas as pd
import faiss
from sentence_transformers import SentenceTransformer

from kaggle_secrets import UserSecretsClient
import google.generativeai as genai

# Load Gemini API key
user_secrets = UserSecretsClient()
GOOGLE_API_KEY = user_secrets.get_secret("GOOGLE_API_KEY")
genai.configure(api_key=GOOGLE_API_KEY)

# Instantiate Gemini model
gemini_model = genai.GenerativeModel("gemini-1.5-pro-latest")

Step 1: Creating the Knowledge Base

I started with a simple block of service descriptions — each one separated by two newlines. This code turns that text into a list of entries I can later embed and search.

chunks = canadian_services_text.strip().split("\n\n")
chunk_texts = [chunk.strip() for chunk in chunks if chunk.strip()]
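For reference, here is the kind of text `canadian_services_text` is expected to hold: one service per paragraph, separated by blank lines. The entries below are illustrative placeholders rather than the notebook's actual data.

```python
# Illustrative placeholder knowledge base; the real notebook uses a
# longer curated list with verified contact details
canadian_services_text = """
MOSAIC (Vancouver, BC): Free legal advocacy and settlement
services for immigrants and refugees.

Calgary Housing Company (Calgary, AB): Affordable housing
programs for low-income individuals and families.

Daily Bread Food Bank (Toronto, ON): Free food assistance
across the Greater Toronto Area.
"""

# Same chunking as above: split on blank lines, drop empty strings
chunks = canadian_services_text.strip().split("\n\n")
chunk_texts = [chunk.strip() for chunk in chunks if chunk.strip()]
print(len(chunk_texts))  # 3 service chunks
```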

Step 2: Converting Text to Embeddings

I used the MiniLM model from HuggingFace to convert each service entry into a numerical vector. These embeddings allow me to compare queries and services based on meaning, not keywords.

embedding_model = SentenceTransformer('all-MiniLM-L6-v2')
embeddings = embedding_model.encode(chunk_texts)
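What "comparing by meaning" actually involves is measuring the distance between these vectors. Here is a minimal numpy sketch of cosine similarity, using made-up 3-d vectors in place of MiniLM's real 384-dimensional embeddings:

```python
import numpy as np

def cosine_similarity(a, b):
    # 1.0 means same direction (similar meaning); near 0.0 means unrelated
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Toy 3-d vectors standing in for real 384-d MiniLM embeddings
query = np.array([0.9, 0.1, 0.0])          # e.g. "free legal help"
service_legal = np.array([0.8, 0.2, 0.1])  # a legal-aid entry (close)
service_food = np.array([0.0, 0.1, 0.9])   # a food-bank entry (far)

print(cosine_similarity(query, service_legal))  # close to 1.0
print(cosine_similarity(query, service_food))   # close to 0.0
```

A query about "legal help" and a service description about "lawyers" end up near each other in this vector space even though they share no keywords, which is exactly what makes semantic search work.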

Step 3: Building a FAISS Index

Once I had the embeddings, I created a FAISS index. This acts as my semantic search engine — allowing the assistant to retrieve the most relevant service entry for any user question.

dimension = embeddings[0].shape[0]
index = faiss.IndexFlatL2(dimension)
index.add(np.array(embeddings))
chunk_id_to_text = {i: chunk for i, chunk in enumerate(chunk_texts)}
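For intuition, `IndexFlatL2` is an exact nearest-neighbour search over (squared) L2 distances. A brute-force numpy equivalent, shown here on toy vectors rather than real embeddings, looks like this:

```python
import numpy as np

def search_l2(embeddings, query_embedding, k=1):
    # Squared L2 distance from the query to every stored vector,
    # which is exactly what faiss.IndexFlatL2.search ranks by
    dists = np.sum((embeddings - query_embedding) ** 2, axis=1)
    top = np.argsort(dists)[:k]
    return dists[top], top

# Toy 4-d vectors in place of real MiniLM embeddings
embeddings = np.array([[1.0, 0.0, 0.0, 0.0],
                       [0.0, 1.0, 0.0, 0.0],
                       [0.9, 0.1, 0.0, 0.0]])
query = np.array([1.0, 0.0, 0.0, 0.0])

dists, idx = search_l2(embeddings, query, k=1)
print(idx[0], dists[0])  # closest chunk is index 0, distance 0.0
```

FAISS does the same computation, just with heavily optimized internals, which is why it stays fast even as the knowledge base grows.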

Step 4: Retrieving the Closest Match

When a user asks a question like “Where can I find food banks in Toronto?”, I embed their query the same way, and search the FAISS index for the most semantically similar chunk.

user_question = "Where can I find free legal help in Vancouver?"
query_embedding = embedding_model.encode([user_question])
_, retrieved_indices = index.search(np.array(query_embedding), k=1)
retrieved_context = chunk_id_to_text[retrieved_indices[0][0]]

Step 5: Grounding the Response Using Gemini

Now that I’ve retrieved the most relevant service info, I send it — along with the user’s question — to Gemini using a custom prompt. The model is instructed to only use the retrieved context when responding.

gemini_prompt = f"""
You are NaviCare, an AI assistant for newcomers to Canada.
Please answer the following user question using only the information below.
Be clear and helpful.

Relevant Information:
{retrieved_context}

Question:
{user_question}
"""

response = gemini_model.generate_content(gemini_prompt)
print(response.text)

Step 6: Collecting User Feedback

At the end of the conversation, I ask users whether the answer was helpful. In this MVP, it’s a simple text input — but this could evolve into a formal feedback system for improving the model over time.

feedback_score = input("Was this response helpful? (yes/no): ")
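As a sketch of where that feedback system could go, each rating could be appended to a log file for later analysis. The CSV path and columns below are my own choices, not part of the notebook:

```python
import csv
from datetime import datetime, timezone

def log_feedback(question, answer, score, path="navicare_feedback.csv"):
    # Append one row per interaction so trends and failure cases
    # can be reviewed later
    with open(path, "a", newline="", encoding="utf-8") as f:
        csv.writer(f).writerow([
            datetime.now(timezone.utc).isoformat(),
            question, answer, score,
        ])

log_feedback("Where can I find free legal help in Vancouver?",
             "MOSAIC offers free legal advocacy services.",
             "yes")
```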

Step 7: Adding Voice Support for French-Speaking Newcomers

As I continued refining NaviCare’s user experience, I realized that many newcomers might prefer hearing responses to reading them. This is particularly important for users with limited literacy or visual impairments.

To support this, I integrated a text-to-speech (TTS) system into the assistant. Now, each response from NaviCare is not only displayed but also spoken aloud using a natural-sounding voice.

Here’s a sample of how it works:

from gtts import gTTS
from IPython.display import Audio, display

def speak_response_fr(text):
    tts = gTTS(text=text, lang='fr')
    filename = 'navicare_fr_response.mp3'
    tts.save(filename)
    display(Audio(filename, autoplay=True))

# Example response
assistant_reply = "Bienvenue sur NaviCare. Je suis là pour vous aider à trouver des services dans votre région."
speak_response_fr(assistant_reply)
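Since gTTS simply takes an ISO 639-1 language code via its `lang` parameter, extending voice support is largely a matter of mapping a user's stated preference to a supported code. A small sketch, with an illustrative (not exhaustive) language list:

```python
# ISO 639-1 codes accepted by gTTS; a fuller app would support more
SUPPORTED_TTS_LANGS = {"english": "en", "french": "fr",
                       "spanish": "es", "arabic": "ar"}

def tts_lang_for(preference, default="en"):
    # Fall back to English when a preference isn't covered yet
    return SUPPORTED_TTS_LANGS.get(preference.strip().lower(), default)

print(tts_lang_for("French"))   # fr
print(tts_lang_for("Tagalog"))  # en (fallback)
```

The returned code would then be passed straight into `gTTS(text=..., lang=...)` in place of the hardcoded `'fr'` above.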

This feature allows NaviCare to go beyond written text, making it a multisensory, inclusive experience. While I’ve currently implemented it in French, the underlying system supports dozens of languages. This opens the door for future expansion into multilingual voice support, which I plan to pursue based on community feedback.

Voice interaction is more than a convenience—it’s a bridge to trust and understanding, especially in high-stress, high-uncertainty situations where users are seeking reliable help.

Why I Chose This Stack

  • sentence-transformers (MiniLM): Reliable, efficient embeddings for short service texts.
  • FAISS: Fast and lightweight similarity search — perfect for local prototypes.
  • Gemini API: Strong grounding ability, especially when constrained by a specific context.
  • Kaggle Notebooks: Free, reproducible, and easy to share — aligned with the Capstone’s open format.

Limitations and Future Work

  1. Multilingual Support: Text responses are currently English-only (French voice output is a first step). Future versions could use Google Translate for automatic query translation and response localization.
  2. Static Data: The prototype uses hardcoded text. I’d like to connect future versions to live data feeds from NGOs and city services.
  3. No Personalization: Each query is treated in isolation. In production, I’d explore privacy-preserving session memory or profile-based customization.
  4. Feedback Loop: Right now, feedback is printed but not logged. Future improvements could track trends and failure cases.
  5. No Front-End: Everything is code-based. A simple web app using Streamlit would make this usable by staff in community orgs or shelters.

In addition, I originally included more diverse, real-world inputs — like typos, mixed-language queries, and informal phrasing — to push the assistant’s limits. However, because Kaggle notebooks don’t handle interactive input() calls well during an unattended top-to-bottom run, I removed those examples to avoid runtime errors and ensure the notebook executed cleanly. In a real application, I’d absolutely bring them back to better reflect how newcomers actually ask for help.

Does It Work? NaviCare’s Results

To test NaviCare, I ran sample queries like “How do I apply for OHIP in Ontario?” and “Where can I find food banks in Toronto?” The AI’s responses—based on real Canadian resources—earned an average rating of 4.67 out of 5 for clarity and relevance across my test queries. For example, the direct link to Ontario’s OHIP application process scored 5/5, and the detailed Calgary housing options scored 4/5. I also visualized these scores in a bar chart to show NaviCare’s consistency. These early results suggest it’s more than a prototype—it’s a promising tool for newcomers.

Why NaviCare Matters

This project wasn’t just a technical exercise. It came from lived experience.

When I moved to Canada, I didn’t know where to start. I found services by accident — through people, signs, or trial and error. NaviCare is what I wish I had: a patient, multilingual guide who doesn’t need you to “know the system.”

All you need is a question. And now, you have somewhere to ask it.

That’s the kind of tech I believe in — not just intelligent, but empathetic.

Try NaviCare Today!

NaviCare blends AI innovation—like Retrieval-Augmented Generation—with real-world impact, helping newcomers find homes, jobs, and community in Canada. With a real-time rating and city-specific answers, it’s a step toward digital inclusion for all. Want to see it in action? Try NaviCare on Kaggle and share your feedback! In the future, I envision NaviCare partnering with nonprofits or 211 services to reach even more people. Join me in making newcomers feel at home.

