AI Systems
Local-first
thesecretlab local platform
Qwen 3.6 for private local reasoning.
A browser console for the local Qwen runtime: fast coding support, private reasoning, stack-trace analysis, and implementation planning through an Ollama-compatible endpoint on your own machine.
Runtime
35B A3B local
Endpoint
localhost:11434
Mode
private by default
Prompt Workspace
Browser-to-local inference through your own machine.
Response
0 chars
Local model output will stream here.
Private by default
The browser calls your local endpoint directly; prompts from this page never route through a publicly hosted model.
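The browser-to-local call can be sketched as a direct fetch against Ollama's /api/generate endpoint. This is a minimal sketch, assuming the default localhost:11434 endpoint and the model tag shown on this page; the function names are illustrative, not part of the actual console.

```javascript
// Minimal sketch of the browser-to-local call. Assumes the default
// Ollama endpoint on localhost:11434 and this page's model tag.
const OLLAMA_URL = "http://localhost:11434";
const MODEL = "qwen3.6-35b-a3b-unsloth-q3ks:latest";

// Build the request for Ollama's /api/generate endpoint.
function buildGenerateRequest(model, prompt) {
  return {
    url: `${OLLAMA_URL}/api/generate`,
    body: JSON.stringify({ model, prompt, stream: true }),
  };
}

// Send a prompt and stream response chunks into a callback.
// Ollama streams newline-delimited JSON objects, each carrying a
// partial "response" field.
async function sendPrompt(prompt, onChunk) {
  const { url, body } = buildGenerateRequest(MODEL, prompt);
  const res = await fetch(url, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body,
  });
  const reader = res.body.getReader();
  const decoder = new TextDecoder();
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    for (const line of decoder.decode(value, { stream: true }).split("\n")) {
      if (line.trim()) onChunk(JSON.parse(line).response);
    }
  }
}
```

Because the fetch goes straight to localhost, the prompt text stays on the machine end to end.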
Qwen 3.6 local
Configured around your `qwen3.6-35b-a3b-unsloth-q3ks:latest` local runtime and stable Modelfile defaults.
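Stable Modelfile defaults of this kind are typically pinned in an Ollama Modelfile. The sketch below shows the general shape; the parameter values are illustrative assumptions, not the page's actual defaults.

```
# Hypothetical Modelfile sketch -- parameter values are
# illustrative assumptions, not the page's actual defaults.
FROM qwen3.6-35b-a3b-unsloth-q3ks:latest

# Conservative sampling defaults for local reasoning work
PARAMETER temperature 0.6
PARAMETER num_ctx 8192

SYSTEM """You are a private local reasoning assistant."""
```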
Operational note
For live use from thesecretlab.app, Ollama must allow this origin via the OLLAMA_ORIGINS environment variable and be reachable from the browser.
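The origin allowance can be sketched as follows, assuming Ollama is started from a shell where the variable is exported:

```shell
# Allow the thesecretlab.app origin to call the local Ollama server.
export OLLAMA_ORIGINS="https://thesecretlab.app"

# Then restart the server so the setting takes effect:
#   ollama serve
```

On installs where Ollama runs as a managed service, the same variable must instead be set in the service's environment before restarting it.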