Use a local LLM to search everything you've seen, said, or heard — fully private, no cloud needed.
You'll need:

- Screenpipe running locally (`http://localhost:3030/health` should respond)
- LM Studio running with a model loaded (e.g. `qwen2.5-coder-14b`)
- An LM Studio model that supports tool calling (qwen2.5, mistral-small, etc.)
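Before wiring anything together, it helps to confirm both services are reachable. A minimal sketch, assuming screenpipe on its default port 3030 and LM Studio at `192.168.9.201:1234` as in the examples below (adjust to your setup; LM Studio's OpenAI-compatible server exposes `/v1/models` for listing loaded models):

```powershell
# Quick sanity check: both requests should return without throwing.
$health = Invoke-RestMethod -Uri "http://localhost:3030/health"
Write-Host "screenpipe health: $($health | ConvertTo-Json -Compress)"

# List the models currently loaded in LM Studio (OpenAI-compatible endpoint).
$models = Invoke-RestMethod -Uri "http://192.168.9.201:1234/v1/models"
Write-Host "LM Studio models: $($models.data.id -join ', ')"
```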
```powershell
# Adjust the LM Studio URL/port and model name to match your setup
$response = Invoke-WebRequest -Uri "http://192.168.9.201:1234/v1/chat/completions" -Method Post `
  -ContentType "application/json" `
  -Body '{
    "model": "qwen2.5-coder-14b",
    "messages": [
      {
        "role": "system",
        "content": "You are a helpful assistant with access to the user screen and audio data via screenpipe. Use the search_screenpipe tool to find what the user saw, said, or heard on their computer. ALWAYS call the tool before answering — do not guess. Return concise answers based on the tool results. Include timestamps and app names when relevant."
      },
      {
        "role": "user",
        "content": "What was I working on in the last 2 hours?"
      }
    ],
    "tools": [
      {
        "type": "function",
        "function": {
          "name": "search_screenpipe",
          "description": "Search the user screen recordings (OCR text) and audio transcriptions captured by screenpipe. Returns matching text with timestamps and app names.",
          "parameters": {
            "type": "object",
            "properties": {
              "q": {
                "type": "string",
                "description": "Search query keyword or phrase. Leave empty string to get recent activity."
              },
              "content_type": {
                "type": "string",
                "enum": ["ocr", "audio", "all"],
                "description": "Type of content to search: ocr for screen text, audio for transcriptions, all for both."
              },
              "limit": {
                "type": "integer",
                "description": "Max results to return. Default 5, max 50."
              },
              "start_time": {
                "type": "string",
                "description": "ISO 8601 start time filter, e.g. 2026-02-11T09:00:00Z"
              },
              "end_time": {
                "type": "string",
                "description": "ISO 8601 end time filter, e.g. 2026-02-11T11:00:00Z"
              },
              "app_name": {
                "type": "string",
                "description": "Filter by application name, e.g. chrome, firefox, code, slack"
              }
            }
          }
        }
      }
    ],
    "tool_choice": "auto"
  }'

# Parse the response
$result = $response.Content | ConvertFrom-Json
$toolCall = $result.choices[0].message.tool_calls[0]
Write-Host "Model wants to call: $($toolCall.function.name)"
Write-Host "With arguments: $($toolCall.function.arguments)"
```
```powershell
# Parse the arguments the model chose ($toolArgs avoids clobbering PowerShell's automatic $args variable)
$toolArgs = $toolCall.function.arguments | ConvertFrom-Json

# Build the screenpipe search URL (escape free-text values so spaces and special characters survive)
$params = @()
if ($toolArgs.q) { $params += "q=$([uri]::EscapeDataString($toolArgs.q))" }
if ($toolArgs.content_type) { $params += "content_type=$($toolArgs.content_type)" }
if ($toolArgs.limit) { $params += "limit=$($toolArgs.limit)" }
if ($toolArgs.start_time) { $params += "start_time=$($toolArgs.start_time)" }
if ($toolArgs.end_time) { $params += "end_time=$($toolArgs.end_time)" }
if ($toolArgs.app_name) { $params += "app_name=$([uri]::EscapeDataString($toolArgs.app_name))" }
$url = "http://localhost:3030/search?" + ($params -join "&")
Write-Host "Calling screenpipe: $url"
$screenpipeResult = (Invoke-WebRequest -Uri $url).Content

# Preview the first 100 lines of the pretty-printed data
($screenpipeResult | ConvertFrom-Json | ConvertTo-Json -Depth 5) -split "`n" | Select-Object -First 100
```
```powershell
# Send the tool result back so the model can answer your question
$followUp = Invoke-WebRequest -Uri "http://192.168.9.201:1234/v1/chat/completions" -Method Post `
  -ContentType "application/json" `
  -Body (@{
    model = "qwen2.5-coder-14b"
    messages = @(
      @{ role = "system"; content = "You are a helpful assistant with access to the user screen and audio data via screenpipe. Summarize the tool results concisely. Include timestamps and app names." },
      @{ role = "user"; content = "What was I working on in the last 2 hours?" },
      @{ role = "assistant"; content = $null; tool_calls = @(@{ id = $toolCall.id; type = "function"; function = @{ name = $toolCall.function.name; arguments = $toolCall.function.arguments } }) },
      @{ role = "tool"; tool_call_id = $toolCall.id; content = $screenpipeResult }
    )
  } | ConvertTo-Json -Depth 10)
$answer = ($followUp.Content | ConvertFrom-Json).choices[0].message.content
Write-Host "`n--- Answer ---"
Write-Host $answer
```

Troubleshooting tips:

- Model not calling the tool? Try `"tool_choice": {"type": "function", "function": {"name": "search_screenpipe"}}` to force it (see the sketch after this list)
- Hallucinating tool calls? Switch to `qwen2.5:14b` (non-coder) or `mistral-small`; they handle tool calling better
- Screenpipe API docs: `http://localhost:3030/` has the full OpenAPI spec with all available search parameters
- More search params: `window_name`, `browser_url`, `min_length`, `max_length`, `speaker_ids`
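For the forced tool call in the first tip, the only real change to the first request is the `tool_choice` field. A minimal sketch, with the tool schema trimmed to a single `q` parameter for brevity (reuse the full schema from above in practice; `$forced` is just an illustrative variable name):

```powershell
# Same call as the first request, but tool_choice forces the model to call search_screenpipe.
$body = @{
    model    = "qwen2.5-coder-14b"
    messages = @(
        @{ role = "system"; content = "You are a helpful assistant with access to the user screen and audio data via screenpipe." },
        @{ role = "user"; content = "What was I working on in the last 2 hours?" }
    )
    tools = @(
        @{
            type     = "function"
            function = @{
                name        = "search_screenpipe"
                description = "Search screenpipe OCR text and audio transcriptions."
                parameters  = @{
                    type       = "object"
                    properties = @{
                        q = @{ type = "string"; description = "Search query keyword or phrase." }
                    }
                }
            }
        }
    )
    # Forces a call to search_screenpipe instead of letting the model answer directly.
    tool_choice = @{ type = "function"; function = @{ name = "search_screenpipe" } }
}
$forced = Invoke-WebRequest -Uri "http://192.168.9.201:1234/v1/chat/completions" -Method Post `
    -ContentType "application/json" -Body ($body | ConvertTo-Json -Depth 10)
```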
Example search request:

```
GET http://localhost:3030/search?q=meeting&content_type=all&limit=10&start_time=2026-02-11T00:00:00Z
```
| Parameter | Description | Example |
|---|---|---|
| `q` | Search keyword | `meeting` |
| `content_type` | `ocr`, `audio`, or `all` | `all` |
| `limit` | Max results (default 5) | `10` |
| `offset` | Pagination offset | `0` |
| `start_time` | ISO 8601 start | `2026-02-11T09:00:00Z` |
| `end_time` | ISO 8601 end | `2026-02-11T17:00:00Z` |
| `app_name` | Filter by app | `chrome` |
| `window_name` | Filter by window title | `slack` |
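A quick way to exercise these parameters from PowerShell, without assuming anything about the response shape beyond it being JSON:

```powershell
# Combine a few of the search parameters from the table above into one query.
$query = @{
    q            = "meeting"
    content_type = "all"
    limit        = 10
    start_time   = "2026-02-11T00:00:00Z"
    app_name     = "chrome"
}
$qs = ($query.GetEnumerator() | ForEach-Object {
    "$($_.Key)=$([uri]::EscapeDataString([string]$_.Value))"
}) -join "&"

# Dump the raw result; drill into specific fields once you see the structure.
$results = Invoke-RestMethod -Uri "http://localhost:3030/search?$qs"
$results | ConvertTo-Json -Depth 5
```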