Unlocking Local AI: Using Ollama with Agents
Written by David Jones-Gilardi
March 26, 2025
By now, there’s a good chance you’ve heard about generative AI or agentic flows. (If you’re not familiar with agents and how they work, watch this video to get up to speed.) There’s plenty of information out there about building agents with providers like OpenAI or Anthropic, but not everyone is comfortable exposing their data to public model providers. We get a steady stream of questions from folks wondering whether there’s a more secure and cheaper way to run agents. Ollama is the answer.
If you've ever wondered how to run AI models securely on your own machine without sharing your data with external providers, well, here you go!
If you’d rather watch this content, here’s a video covering the same topic.
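To make the “local” part concrete, here’s a minimal sketch of calling a model through Ollama’s HTTP API, which listens on localhost:11434 once Ollama is running. It assumes you’ve already pulled a model with `ollama pull llama3.2`; the model name and prompt are just placeholders for whatever you want to run.

```python
import requests

# Ollama serves a local HTTP API on port 11434 by default.
# Assumes the llama3.2 model was pulled beforehand: `ollama pull llama3.2`
response = requests.post(
    "http://localhost:11434/api/chat",
    json={
        "model": "llama3.2",
        "messages": [
            {"role": "user", "content": "In one sentence, why run an LLM locally?"}
        ],
        "stream": False,  # return a single JSON response instead of a stream
    },
    timeout=120,
)
response.raise_for_status()

# The reply text lives under message.content in the JSON body.
print(response.json()["message"]["content"])
```

Everything in that request stays on your machine: the prompt, the model weights, and the response never leave localhost, which is exactly the appeal for the use cases above.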