
# Local AI Research Agent (Ollama + LangChain)

Python · Ollama · LangChain · License: MIT

A fully local AI research agent powered by Ollama and LangChain. It can search the web, query Wikipedia, optionally save results, and return structured output via a Pydantic schema. The model runs entirely on your machine, with no API keys or cloud services required.


## Features

- Tool-using agent (Search, Wikipedia, Save)
- Structured output with Pydantic
- 100% local via Ollama
- No OpenAI keys or costs
- Autonomous tool selection
- DuckDuckGo Search integration
- Wikipedia integration

## Tech Stack

| Component   | Technology              |
| ----------- | ----------------------- |
| LLM Runtime | Ollama                  |
| Model       | Qwen 2.5 Coder 7B       |
| Framework   | LangChain               |
| Parsing     | Pydantic                |
| Tools       | DuckDuckGo + Wikipedia  |

## Setup

### Install Ollama

```sh
curl -fsSL https://ollama.com/install.sh | sh
```

### 🧩 Pull the Model

Download the LLM locally:

```sh
ollama pull qwen2.5-coder:7b
```

### Clone the Repository

```sh
git clone <your-repo-url>
cd <project-folder>
```

### Create a Virtual Environment

```sh
python3 -m venv venv
source venv/bin/activate
```

### Install Dependencies

```sh
pip install -r requirements.txt
```
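For reference, a `requirements.txt` for this stack would typically list packages like the following (the exact names and set of packages in the repository's own file may differ; treat this as a sketch, not the canonical list):

```text
langchain
langchain-community
langchain-ollama
pydantic
duckduckgo-search
wikipedia
```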

## 🎯 Usage

```sh
python agent.py
```

Example:

```
What can I help you research? who are sharks
```

## 📤 Example Output

```
Topic      : Sharks
Summary    : Sharks are cartilaginous fish...
Sources    : ['Wikipedia']
Tools used : ['wikipedia']
```
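The aligned layout above can be produced with a small formatting helper. This is a sketch of one possible approach (the helper itself is hypothetical and not part of the repository; only the field names mirror the actual schema):

```python
def format_response(topic, summary, sources, tools_used):
    """Render research results in the aligned layout shown above."""
    rows = [
        ("Topic", topic),
        ("Summary", summary),
        ("Sources", sources),
        ("Tools used", tools_used),
    ]
    # Left-justify labels to 10 characters so the colons line up.
    return "\n".join(f"{label:<10} : {value}" for label, value in rows)

print(format_response(
    "Sharks",
    "Sharks are cartilaginous fish...",
    ["Wikipedia"],
    ["wikipedia"],
))
```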

## 🧩 Pydantic Response Schema

```python
from pydantic import BaseModel

class ResearchResponse(BaseModel):
    topic: str
    summary: str
    sources: list[str]
    tools_used: list[str]
```
## 🧰 Tools Available

| Tool                | Purpose               |
| ------------------- | --------------------- |
| `search`            | DuckDuckGo web search |
| `wikipedia`         | Encyclopedia lookup   |
| `Save_text_to_file` | Optional saving       |

Example save usage:

```
research WW2 and save to file
```

## 🧪 Notes on Model Support

This project uses `qwen2.5-coder:7b` because it supports:

- tool calling
- reasoning
- structured responses

Note: other models may not support tool calling.

## 📌 Requirements

- Python 3.9+
- Ollama installed
- macOS / Linux / WSL supported

## 📜 License

This project is licensed under the MIT License.

## ⭐ Contributions

PRs, issues, and improvements are welcome.

If you like the project, consider giving it a ⭐ on GitHub!
