A fully local AI research agent powered by Ollama and LangChain. It can search the web, query Wikipedia, optionally save results, and return structured output via a Pydantic schema. Runs offline with no API keys or cloud services required.
- Tool-using agent (Search, Wikipedia, Save)
- Structured output with Pydantic
- 100% local via Ollama
- No OpenAI keys or costs
- Autonomous tool selection
- DuckDuckGo Search integration
- Wikipedia integration
| Component | Technology |
|---|---|
| LLM Runtime | Ollama |
| Model | Qwen 2.5 Coder 7B |
| Framework | LangChain |
| Parsing | Pydantic |
| Tools | DuckDuckGo + Wikipedia |
Install Ollama:

```bash
curl -fsSL https://ollama.com/install.sh | sh
```

Download the LLM locally:

```bash
ollama pull qwen2.5-coder:7b
```

Clone the repository and set up a virtual environment:

```bash
git clone <your-repo-url>
cd <project-folder>
python3 -m venv venv
source venv/bin/activate
pip install -r requirements.txt
```

Run the agent:

```bash
python agent.py
```

Example:
```
What can I help you research? who are sharks

Topic : Sharks
Summary : Sharks are cartilaginous fish...
Sources : ['Wikipedia']
Tools used : ['wikipedia']
```

The agent's answer is returned as a Pydantic model:

```python
class ResearchResponse(BaseModel):
    topic: str
    summary: str
    sources: list[str]
    tools_used: list[str]
```

| Tool | Purpose |
|---|---|
| search | DuckDuckGo web search |
| wikipedia | Encyclopedia lookup |
| save_text_to_file | Save results to a local file (optional) |
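The save tool can be as simple as a timestamped file writer; a minimal sketch (the actual function name, signature, and output format in `agent.py` may differ):

```python
from datetime import datetime


def save_text_to_file(data: str, filename: str = "research_output.txt") -> str:
    """Append research output to a local text file with a timestamp header."""
    timestamp = datetime.now().strftime("%Y-%m-%d %H:%M:%S")
    formatted = f"--- Research Output ---\nTimestamp: {timestamp}\n\n{data}\n\n"
    with open(filename, "a", encoding="utf-8") as f:
        f.write(formatted)
    return f"Data saved to {filename}"
```

In the project this function would be wrapped as a LangChain tool so the agent can decide on its own when to call it.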
Example prompt that uses the save tool:

```
research WW2 and save to file
```

This project uses `qwen2.5-coder:7b` because it supports:
- tool calling
- reasoning
- structured responses
Note: Other models may not support tool usage.
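Structured responses are what let the model's final answer be validated against the `ResearchResponse` schema shown above. A minimal sketch of that validation step using plain Pydantic (assuming Pydantic v2; LangChain's `PydanticOutputParser` performs an equivalent check, and the example JSON here is illustrative):

```python
from pydantic import BaseModel


class ResearchResponse(BaseModel):
    topic: str
    summary: str
    sources: list[str]
    tools_used: list[str]


# Raw JSON as a tool-calling model might emit it (illustrative payload)
raw = (
    '{"topic": "Sharks",'
    ' "summary": "Sharks are cartilaginous fish...",'
    ' "sources": ["Wikipedia"],'
    ' "tools_used": ["wikipedia"]}'
)

# Validation fails loudly if the model omits a field or returns the wrong type
response = ResearchResponse.model_validate_json(raw)
```

A model without reliable structured output tends to fail at exactly this step, which is why the note above warns against swapping in arbitrary models.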
- Python 3.9+
- Ollama installed
- macOS / Linux / WSL supported
This project is licensed under the MIT License.
PRs, issues, and improvements are welcome.
If you like the project, consider giving a ⭐ on GitHub!