
Local Providers

How to connect Ollama and LM Studio in Scribeist.

3 min read · Last updated Apr 2, 2026

Local providers in Scribeist

Scribeist supports these local BYOK providers:

  • Ollama
  • LM Studio

These providers are useful if you want to run models on your own machine instead of paying a hosted API provider.

Important

Local providers are available on BYOK plans only.


How local provider models work

  • Ollama shows the models available from your Ollama instance
  • LM Studio shows the models available from its local API server

This means Scribeist only shows models your local setup is actually serving.


Connect Ollama

1. Start Ollama

Make sure Ollama is installed and running on your machine.

The default local endpoint is usually:

http://127.0.0.1:11434
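
If Ollama is installed but the server is not running, you can usually start it from a terminal (or let the Ollama desktop app manage it):

ollama serve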

2. Pull a model

Example:

ollama pull qwen2.5:0.5b
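
To confirm the pull finished and see every model your Ollama server can offer:

ollama list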

3. Connect it in Scribeist

  1. Go to Account > Integrations
  2. Open the Ollama connection
  3. Enter the base URL, usually http://127.0.0.1:11434
  4. Save

After the connection succeeds, Scribeist lists the Ollama models available from that server.


Connect LM Studio

1. Download a model

In LM Studio, download the model you want to use.

Downloading a model is not enough by itself: the model must also be loaded into LM Studio and exposed through its local server.

2. Load the model

Open LM Studio and load the model you want to use into memory.
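
If you prefer the terminal, recent LM Studio versions also include an lms command-line tool. Assuming it is installed, you can list your downloaded models and load one into memory (the model key below is a placeholder; use one shown by lms ls):

lms ls
# "your-model-key" is a placeholder; substitute a key from lms ls
lms load your-model-key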

3. Start the local server

In LM Studio, start the local API server.

The default local endpoint is usually:

http://127.0.0.1:1234
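
With the same lms CLI, assuming it is installed, the server can also be started from a terminal instead of the UI:

lms server start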

4. Connect it in Scribeist

  1. Go to Account > Integrations
  2. Open the LM Studio connection
  3. Enter the base URL, usually http://127.0.0.1:1234
  4. Save

After the connection succeeds, Scribeist lists the models currently exposed by the LM Studio server.


Quick test

If you want to confirm a local server is working before connecting it in Scribeist:

Ollama

Open:

http://127.0.0.1:11434/api/tags
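
Or, from a terminal, the same check with curl:

curl http://127.0.0.1:11434/api/tags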

LM Studio

Open:

http://127.0.0.1:1234/v1/models
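
Or from a terminal:

curl http://127.0.0.1:1234/v1/models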

If you get a valid response listing models, Scribeist should be able to discover them too.


Troubleshooting local providers

I downloaded a model but Scribeist shows nothing

For local providers, downloading a model is not always enough.

Check:

  • the local server is running
  • the base URL is correct
  • the model is actually loaded or available from that server
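
For Ollama, you can compare what Scribeist shows with what the server itself reports:

ollama list

For LM Studio, hit the Quick test endpoint above; an empty model list usually means nothing is currently loaded and exposed.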

The connection fails to save

Check:

  • the local app is running
  • the endpoint URL is correct
  • no firewall or local network rule is blocking the request
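
A basic reachability check from the same machine, assuming the default ports:

# Ollama normally answers its root endpoint with a short "Ollama is running" message
curl http://127.0.0.1:11434

# LM Studio should answer with JSON once its local server is started
curl http://127.0.0.1:1234/v1/models

If these requests fail, Scribeist will not be able to reach the endpoint either.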

I only use local providers. Will Scribeist still work?

Yes. On a BYOK plan, Scribeist can use Ollama or LM Studio without needing a hosted provider connection, as long as the local provider is reachable.
