
The Winning Platform for
Specialized Small Language Models
Discover, deploy, and integrate specialized AI models (0.1B-7B parameters) that deliver expert-level knowledge in specific domains. From medical diagnosis to embedded systems, find purpose-built models that run efficiently on edge devices.
Specialized over general
Expert SLMs tuned for real work—precise, predictable, and production‑ready.
Integrates anywhere
API and MCP for agents/IDEs—connect in minutes with no heavy lift.
Private by design
Run locally or fully on‑prem when data must stay inside the perimeter.
Featured Models
Why SLMs?
SLMs are specialized, private, and predictable—domain experts that deliver crisp, production‑ready answers. Run them at the edge to keep data local and move fast; bring in an LLM only when a question truly needs open‑ended exploration.
Explore Millions of Micro‑SLMs
Hover any star to preview an expert model — domain, size, latency, price, and tags.
For Creators
Publish expert SLMs to the global network—free or paid. Earn a share when paid models are used.
Ideal for manufacturers and service companies shipping experts trained on their products, manuals, and equipment.
For Users
Tap the entire network via API in your apps and LLM workflows, or integrate with agents like Claude Code using MCP.
Find the right expert instantly and drop it into your stack with minimal integration work.
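For illustration, a minimal sketch of what calling one network expert over HTTP could look like from Python; the base URL, route, auth header, and response field below are assumptions made for this example, not the published slm.wiki API.

```python
# Illustrative only: endpoint path, auth scheme, and field names are hypothetical.
import requests

API_BASE = "https://slm.wiki/api/v1"   # hypothetical base URL
API_KEY = "YOUR_API_KEY"               # assumed to be issued from your account

def ask_expert(model_id: str, prompt: str) -> str:
    """Send a prompt to one specialized SLM and return its answer."""
    resp = requests.post(
        f"{API_BASE}/models/{model_id}/query",           # hypothetical route
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={"prompt": prompt, "max_tokens": 256},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["answer"]                         # assumed response field

if __name__ == "__main__":
    # Route a domain question to a hypothetical embedded-systems expert.
    print(ask_expert("embedded-systems-expert-1b",
                     "How do I configure a watchdog timer on an STM32?"))
```

The same expert can also be surfaced to an agent such as Claude Code through an MCP server, so the model appears as just another tool in the agent's workflow.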
For Enterprises
Host slm.wiki fully on‑prem so teams can create private experts on sensitive corpora—available only on your internal network.
Roll your own AI network for ultra‑low cost, with control, security, and compliance built in.
Use Cases
Research
Deep dives on when small beats big, orchestration patterns, and the emerging SLM marketplace.
Where specialized SLMs outperform general LLMs—and how to route between them.
Read more →
Design patterns for ranking, matching, and escalating queries across experts.
Explore patterns →
How a network of niche experts changes cost, speed, and quality dynamics.
View insights →