8/27/2024

Setting Up Ollama with Azure Cosmos DB

Getting started with Ollama and Azure Cosmos DB can seem daunting. However, with step-by-step guidance, you'll discover the flexibility & efficiency these tools bring to your AI applications. In this blog, we'll delve right into the nitty-gritty of integrating Ollama with Azure Cosmos DB—a thrilling adventure in the world of Conversational AI!

What is Ollama?

Ollama is a powerful framework that allows developers to create AI applications using local models. By utilizing its capabilities, you can harness the power of AI without the complexities associated with cloud services. Ollama strives to simplify AI integration into applications, thereby enabling developers to focus on building innovative solutions.

What is Azure Cosmos DB?

Azure Cosmos DB is a globally distributed, multi-model database service designed for any scale. Its world-class performance is ideal for high-throughput applications that need low-latency responses. Leveraging the power of NoSQL, it allows developers to seamlessly manage large volumes of data, making it a perfect companion for applications developed using Ollama.

Why Integrate Ollama with Azure Cosmos DB?

Integrating Ollama with Azure Cosmos DB enables you to:
  • Lower Costs: By using local models alongside a distributed database, you can maintain lower operational expenses without sacrificing performance. This is especially relevant if you're managing large datasets.
  • Enhanced Control: With data stored locally in Azure Cosmos DB, you have greater control over how your application functions.
  • Seamless Scalability: As your application evolves, both Ollama and Cosmos DB ensure you can scale without a hitch.

Prerequisites

Before we dive in, make sure you have the following installed:
  • Python 3.11.6 or higher
  • Node.js (version 20 or higher)
  • Git
  • A free Azure account for Cosmos DB

Step 1: Clone the Ollama Azure Cosmos DB Demo Repository

To kick things off, we need to clone the demo repository from GitHub - run-llama/azure-cosmos-db-demo. This will provide a baseline environment for your setup.
```
git clone git@github.com:run-llama/azure-cosmos-db-demo.git
cd azure-cosmos-db-demo
```

Step 2: Set Up Azure Cosmos DB MongoDB Cluster

  1. Log into your Azure account & navigate to the Azure portal.
  2. Select Create a resource & choose Azure Cosmos DB.
  3. Ensure you select Azure Cosmos DB for MongoDB API during the setup process.
  4. Create a free tier cluster, choosing the vCore option for vector searches.
  5. Once the cluster is provisioned, navigate to the Connection Strings section to retrieve your connection string. You'll need this for later!
[Screenshot: Azure Cosmos DB resources in the Azure portal]

Step 3: Create Environment Variables

Create a `.env` file in the root of your cloned repository to configure your connection with MongoDB. Populate it with the environment variables you need:

```
MONGODB_URI=mongodb+srv://<username>:<password>@your-cluster-url.mongodb.net/?tls=true&retryWrites=false
MONGODB_DATABASE=tiny_tweets_db
MONGODB_COLLECTION=tiny_tweets_collection
```

Don't forget to replace `<username>`, `<password>`, and `your-cluster-url` with your own values!
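If you're curious how these variables actually reach your code, here's a minimal sketch of a `.env` loader in Python. The demo itself most likely relies on a library such as python-dotenv; this hand-rolled version is purely illustrative:

```python
import os
import tempfile

def load_env(path):
    """Minimal .env loader: put KEY=VALUE lines into os.environ."""
    with open(path) as f:
        for line in f:
            line = line.strip()
            # skip blank lines, comments, and anything without '='
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            os.environ.setdefault(key.strip(), value.strip())

# quick check with a throwaway file standing in for the real .env
with tempfile.NamedTemporaryFile("w", suffix=".env", delete=False) as f:
    f.write("MONGODB_DATABASE=tiny_tweets_db\nMONGODB_COLLECTION=tiny_tweets_collection\n")
    path = f.name

load_env(path)
print(os.environ["MONGODB_DATABASE"])  # tiny_tweets_db
```

Once loaded, scripts in the repo can read `os.environ["MONGODB_URI"]` (or use python-dotenv's `load_dotenv()`) instead of hard-coding credentials.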

Step 4: Set Up Python Virtual Environment & Install Dependencies

To avoid package conflicts, it's best to work in a virtual environment. Set everything up by running the following commands:

```
python3 -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt
```

The `requirements.txt` file includes essential libraries such as the MongoDB drivers, among others.

Step 5: Import Data into MongoDB

Next, you need to import the sample data into your MongoDB. You can use the provided `tinytweets.json` file, which contains approximately 1,000 tweets.
To import:

```
python 1_import.py
```

This script will connect to your MongoDB instance & upload the tweets to the designated collection.
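The repository's import script isn't reproduced here, but a script like `1_import.py` typically boils down to loading the JSON file and inserting documents in batches. Below is a hedged, stdlib-only sketch in which a plain list stands in for the MongoDB collection; with pymongo you would call `collection.insert_many(batch)` where the list is extended:

```python
import json
import io

# Stand-in for the MongoDB collection; with pymongo this would be
# client[db_name][collection_name] and inserts would go over the wire.
fake_collection = []

def import_tweets(fp, batch_size=100):
    """Load a JSON array of tweets and insert it in batches."""
    tweets = json.load(fp)
    for i in range(0, len(tweets), batch_size):
        batch = tweets[i:i + batch_size]
        fake_collection.extend(batch)  # collection.insert_many(batch)
    return len(tweets)

# tiny two-tweet sample standing in for tinytweets.json
sample = io.StringIO('[{"id": 1, "text": "hello"}, {"id": 2, "text": "world"}]')
count = import_tweets(sample)
print(count)  # 2
```

Batching keeps memory use predictable and avoids one giant insert when the file grows beyond a thousand records.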

Step 6: Load & Index Your Data

Once the data is imported, it’s time to index that data using LlamaIndex, which will prepare it to be queried efficiently.
In your `.env` file, add your OpenAI API key:

```
OPENAI_API_KEY=sk-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
```

Now create an index using the command below:

```
python 2_load_and_index.py
```

This script will create embeddings of the tweet data, which are stored back into MongoDB.
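To make "embeddings" concrete: each tweet's text is converted into a fixed-length numeric vector and stored alongside the document, so similar texts end up with similar vectors. The real script uses an OpenAI embedding model via LlamaIndex; the hash-based `toy_embed` below is a toy stand-in for illustration only:

```python
import math

def toy_embed(text, dim=8):
    """Toy stand-in for a real embedding model: hash character
    bigrams into a fixed-length vector, then L2-normalise it."""
    vec = [0.0] * dim
    for a, b in zip(text, text[1:]):
        vec[hash(a + b) % dim] += 1.0
    norm = math.sqrt(sum(x * x for x in vec)) or 1.0
    return [x / norm for x in vec]

# each document gains an "embedding" field, just as the indexing
# step stores vectors back into the MongoDB collection
docs = [{"text": "hello world"}, {"text": "cosmos db"}]
for d in docs:
    d["embedding"] = toy_embed(d["text"])

print(len(docs[0]["embedding"]))  # 8
```

Real embedding models produce vectors with hundreds or thousands of dimensions, but the shape of the data stored in the collection is the same: document plus vector.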

Step 7: Run a Test Query

With your embeddings set, it’s time to verify everything worked by running a test query:

```
python 3_query.py
```

This runs a sample query against the indexed data & prints the result, confirming the whole pipeline is working smoothly.
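Under the hood, a vector query like this scores every stored embedding against the embedding of the query text, typically by cosine similarity, and returns the closest matches. A minimal sketch with made-up vectors:

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

# documents with hypothetical precomputed embedding vectors
docs = [
    ("tweet about pizza", [0.9, 0.1, 0.0]),
    ("tweet about databases", [0.1, 0.9, 0.2]),
]
query_vec = [0.0, 1.0, 0.1]  # embedding of a database-themed question

best = max(docs, key=lambda d: cosine(d[1], query_vec))
print(best[0])  # tweet about databases
```

In production, Cosmos DB's vCore vector search performs this scoring server-side over an index, so the client never scans every document itself.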

Deploying Flask API & Frontend

Next up, you will deploy a Flask API & the frontend application.
  1. Create a new public GitHub repository & push your cloned repo into it, making sure you don't include the `.git` folder.
  2. Go ahead & deploy the Flask app:

Run Flask API

Navigate to the `flask_app` folder & run:

```
flask run
```

Visit http://127.0.0.1:5000 to check if your API is accessible.

Deploy Frontend with Next.js

For the frontend, run:

```
cd next_app
npm install
npm run dev
```
Access your frontend at http://127.0.0.1:3000.

Step 8: Celebrate Your Success!

You've successfully integrated Ollama with Azure Cosmos DB!!! 🎉 To further enhance audience engagement for your project, consider using Arsturn.
Arsturn helps you effortlessly create custom chatbot experiences that can BOOST conversions & engagement on your website. Imagine having an interactive tool that connects meaningfully with your users to address their queries, gather feedback, & much more—all without needing technical expertise. So what're you waiting for?
Check out Arsturn to supercharge your Conversational AI today without needing a credit card!

Conclusion

Integrating Ollama with Azure Cosmos DB isn’t just a benefit; it’s a GAME-CHANGER! As we've seen, it allows for an efficient and cost-effective way to develop powerful AI solutions. Whether you're building a simple chatbot or enhancing a data-driven application, you now have the knowledge to make it happen. Happy coding! 🎈

Copyright © Arsturn 2024