{
"manifest_version": "0.3",
"name": "minutes",
"display_name": "Minutes — Meeting Memory for AI",
"description": "Conversation memory for AI: record, transcribe, search, and query every meeting and voice memo locally. Minutes 0.14.1 fixes v0.14.0 regressions in call-end auto-stop countdown handling and 1:1 source-aware diarization while keeping the 0.14 desktop context, video-review, live transcription, and self-healing MCP extension improvements.",
"long_description": "Minutes is an open-source meeting corpus you own. Record, transcribe, search, and query every meeting and voice memo on your own disk. No cloud, no vendor to outlive. Output is structured markdown in `~/meetings/` that Claude Desktop, Claude Code, Codex, Gemini CLI, Cursor, OpenCode, Pi, and any MCP-compatible client read from the same folder, without a proprietary SDK or API key. Ten years from now, `grep` still works on your corpus.\n\n29 tools • 7 resources • Interactive MCP App dashboard • Prompt templates\n\n**What's new in 0.14.0:**\n• **Desktop context is now a first-class MCP surface.** A new local `context.db` store captures meeting-adjacent activity (docs, terminals, browser tabs) around every recording and live session. Three new MCP tools query it: `activity_summary` for what happened in a window, `search_context` for keyword lookups, and `get_moment` for rewinding to a specific timestamp. Works without a transcript — useful even when you weren't recording.\n• **New `/minutes-video-review` workflow.** Drop a Loom, ScreenPal recording, bug repro clip, or local screen-walkthrough and Minutes produces a durable review artifact: transcript, sampled frames, metadata, and a briefable bundle any agent can read. Surfaced as a plugin skill and portable skill for Codex/Gemini/OpenCode/Pi.\n• **Apple Speech live backend is now its own config surface.** Standalone live transcription (`minutes live`) has its own backend picker, separate from the general transcription engine. Old overloaded configs migrate automatically. Current scope is explicitly standalone-live-only; see `docs/APPLE_SPEECH.md`.\n• **Cleaner fallback chain for live transcription.** Apple Speech falls back to Parakeet and then Whisper when unavailable. 
Parakeet itself now falls back to CPU when the GPU path is unavailable, so builds without Metal/CUDA still work end-to-end.\n• **Hybrid source-aware diarization for call capture.** When dual-source stems are available (mic + system audio), diarization now keeps your local voice ownership stable and refines remote speakers from the system stem. Chunks are aligned by index, not wall-clock, so long calls stay coherent.\n• **Device-name canonicalization.** Decorated picker labels persisted from older versions (`\"MacBook Pro Microphone (Built-in)\"`, etc.) now resolve to the real CoreAudio device so your saved recording device keeps working after upgrades.\n• **Calendar is safer and more honest.** `[calendar] enabled=false` is now respected throughout. The background calendar poller no longer auto-launches Calendar.app. The `calendar-events` helper is correctly bundled into release `.app` builds on macOS.\n• **Native call capture can bypass the loopback preflight** when the desktop path is available, so ScreenCaptureKit-based setups don't need loopback installed to record system audio.\n• **Windows and Linux desktop-context collectors.** Shared implementation lands alongside the existing macOS path. Platform-gated cleanly so cross-compile stays green.\n• **Self-healing Claude Desktop extension.** The hosted `.mcpb` no longer breaks when our CLI version advances. Same-major skew (e.g. MCP 0.13.3 against CLI 0.14.0) is silent-compatible, auto-install fetches the latest CLI via GitHub's `releases/latest/download/` redirect instead of a pinned tag, and a new `minutes capabilities --json` feature probe lets the MCP server hide tools whose backing CLI subcommand is missing rather than exposing them and surfacing runtime errors (issue #183 Phases 1 and 2).\n• **Build reliability.** Auto-recovers from stale `ort-sys` clang_rt paths after Xcode upgrades (the recurring `library not found` after macOS major updates). 
CLI and app feature flags stay in sync.\n\n**Agent surface polish:**\n• `llms.txt` enumerates the full skill catalog with CI enforcement.\n• Plugin tree now emits per-skill `scripts/` and `references/` assets for Claude host's `copy` asset policy.\n• Generated agent docs (`.agents/skills/`, `.opencode/skills/`, `.opencode/commands/`) stay in lockstep via a compiler check.\n• Surface audits guard against silent skill/hook/pack drift.\n\n**Setup:** `npx minutes-mcp` gets you search, browsing, and the dashboard right away. For recording and transcription, Minutes can auto-install the CLI on first use, or you can install it yourself with `brew install silverstein/tap/minutes` or `cargo install minutes-cli`, then run `minutes setup --model tiny`, `minutes setup --parakeet`, or `minutes setup --demo`. Want to preview the full experience without recording? Run `minutes setup --demo`.\n\n**What Minutes already gives you:**\n• Filesystem-as-API: meetings land as YAML-frontmatter markdown in `~/meetings/`. Every agent reads the same source of truth. No vendor lock-in.\n• Start/stop live meeting recordings\n• Process audio files (WAV, M4A, MP3, OGG), Looms, and ScreenPal recordings\n• Local transcription with Whisper or Parakeet. Nothing leaves your machine.\n• Speaker diarization, native, no Python required\n• Dictation mode for clipboard and daily notes\n• Live transcript reads for mid-meeting AI coaching\n• Desktop-context rewind and activity summaries around recordings and live sessions\n• Full-text search across every meeting and memo\n• Decisions, commitments, questions, and relationship context across meetings\n• An interactive MCP App dashboard inside Claude conversations\n\n**Privacy:** all audio processing happens locally via whisper.cpp or parakeet.cpp plus pyannote-rs. No audio or transcripts are sent anywhere.",
"version": "0.14.1",
"author": {
"name": "Mat Silverstein",
"url": "https://github.com/silverstein"
},
"repository": {
"type": "git",
"url": "https://github.com/silverstein/minutes"
},
"homepage": "https://useminutes.app",
"documentation": "https://useminutes.app/for-agents",
"support": "https://github.com/silverstein/minutes/discussions",
"license": "MIT",
"icon": "icon.png",
"keywords": [
"meetings",
"transcription",
"voice-memo",
"whisper",
"parakeet",
"memory",
"search",
"action-items",
"decisions",
"mcp-apps",
"interactive",
"speaker-diarization",
"privacy",
"local",
"offline",
"meeting-notes",
"voice",
"relationship-intelligence"
],
"tools": [
{
"name": "start_recording",
"description": "Start recording audio from the default input device"
},
{
"name": "stop_recording",
"description": "Stop the current recording and process it"
},
{
"name": "get_status",
"description": "Check if a recording is currently in progress"
},
{
"name": "list_processing_jobs",
"description": "List background processing jobs for recent recordings"
},
{
"name": "list_meetings",
"description": "List recent meetings and voice memos"
},
{
"name": "search_meetings",
"description": "Search meeting transcripts and voice memos"
},
{
"name": "get_meeting",
"description": "Get full transcript of a specific meeting"
},
{
"name": "activity_summary",
"description": "Summarize meeting-adjacent desktop context for a linked artifact, context session, or time window"
},
{
"name": "search_context",
"description": "Search desktop-context events across app focus and captured window titles, including opted-in browser titles"
},
{
"name": "get_moment",
"description": "Show the local desktop-context rewind around a linked artifact, session, or timestamp"
},
{
"name": "process_audio",
"description": "Process an audio file through the transcription pipeline"
},
{
"name": "add_note",
"description": "Add a timestamped note to the current recording or an existing meeting"
},
{
"name": "consistency_report",
"description": "Flag conflicting decisions and stale commitments"
},
{
"name": "get_person_profile",
"description": "Build a profile for a person across all meetings"
},
{
"name": "research_topic",
"description": "Research a topic across meetings, decisions, and follow-ups"
},
{
"name": "qmd_collection_status",
"description": "Check if the Minutes output directory is registered as a QMD collection"
},
{
"name": "register_qmd_collection",
"description": "Register the Minutes output directory as a QMD collection"
},
{
"name": "start_dictation",
"description": "Start dictation mode — speech to clipboard and daily notes"
},
{
"name": "stop_dictation",
"description": "Stop dictation mode"
},
{
"name": "track_commitments",
"description": "List open and stale commitments, optionally filtered by person"
},
{
"name": "relationship_map",
"description": "All contacts with relationship scores and losing-touch alerts"
},
{
"name": "list_voices",
"description": "List enrolled voice profiles for speaker identification"
},
{
"name": "confirm_speaker",
"description": "Confirm or correct speaker attribution in a meeting transcript"
},
{
"name": "get_meeting_insights",
"description": "Query structured meeting insights (decisions, commitments, questions) with confidence filtering"
},
{
"name": "start_live_transcript",
"description": "Start a live transcript session for real-time meeting transcription"
},
{
"name": "read_live_transcript",
"description": "Read utterances from the active live transcript with optional cursor or time window"
},
{
"name": "open_dashboard",
"description": "Open the Meeting Intelligence Dashboard in the browser — visual overview of conversation memory"
},
{
"name": "ingest_meeting",
"description": "Extract facts from a meeting and update the knowledge base (person profiles, log, index)"
},
{
"name": "knowledge_status",
"description": "Show the current state of the knowledge base — configuration, adapter, people count, log entries"
}
],
"prompts": [
{
"name": "meeting_prep",
"description": "Prepare for an upcoming meeting",
"arguments": [
"meeting_topic"
],
"text": "Search my past meetings about ${arguments.meeting_topic}. Find relevant decisions, open action items, and context I should review before the meeting."
},
{
"name": "weekly_review",
"description": "Review this week's meetings",
"text": "List my meetings from the past week and summarize key decisions, action items, and commitments. Flag anything overdue."
},
{
"name": "find_action_items",
"description": "Find action items assigned to someone",
"arguments": [
"person"
],
"text": "Search for all open action items assigned to ${arguments.person}. Show what was committed, when, and in which meeting."
},
{
"name": "person_briefing",
"description": "Get a briefing on a person before a meeting",
"arguments": [
"name"
],
"text": "Build a profile for ${arguments.name} across all my meetings. What topics do we discuss? What commitments are open? What decisions have we made together?"
},
{
"name": "topic_research",
"description": "Research a topic across all meetings",
"arguments": [
"topic"
],
"text": "Research ${arguments.topic} across all my meetings. Show related decisions, open follow-ups, and which meetings discussed it most."
},
{
"name": "start_meeting",
"description": "Start recording a meeting",
"arguments": [
"title"
],
"text": "Start recording a meeting titled '${arguments.title}'. I'll transcribe it locally when we're done."
}
],
"server": {
"type": "node",
"entry_point": "crates/mcp/dist/index.js",
"mcp_config": {
"command": "node",
"args": [
"${__dirname}/crates/mcp/dist/index.js"
],
"env": {
"MINUTES_HOME": "${user_config.minutes_home}",
"MEETINGS_DIR": "${user_config.meetings_dir}"
}
}
},
"user_config": {
"meetings_dir": {
"type": "directory",
"title": "Meetings Directory",
"description": "Where meeting transcripts and voice memos are saved",
"required": false,
"default": "${HOME}/meetings"
},
"minutes_home": {
"type": "directory",
"title": "Minutes Data Directory",
"description": "Where Minutes stores its configuration, logs, and PID files",
"required": false,
"default": "${HOME}/.minutes"
}
},
"compatibility": {
"platforms": [
"darwin",
"win32",
"linux"
],
"runtimes": {
"node": ">=18.0.0"
}
},
"privacy_policies": [
"https://useminutes.app/privacy"
]
}