Touhou Little Maid


[Bug] Local LLMs are incompatible?

mappleaf opened this issue · 3 comments

commented

Minecraft Version

  • 1.12.2 (End of support)
  • 1.16.5 (End of support)
  • 1.18.2 (End of support)
  • 1.19.2 (End of support)
  • 1.20/1.20.1
  • 1.21/1.21.1

What happened?

I have installed Open WebUI with Ollama and tried to connect TLM to it instead of the regular services, like this (in config/touhou_little_maid/sites/llm.json):

"local-webui": {
    "id": "local",
    "api_type": "openai",
    "enabled": true,
    "icon": "touhou_little_maid:textures/gui/ai_chat/openai.png",
    "url": "http://127.0.0.1:8080/api/chat/completions",
    "secret_key": "<redacted for privacy>",
    "headers": {},
    "models": [
      "qwen3:8b"
    ]
  }

So far I've tried almost every endpoint (especially /api/chat/completions, /ollama/api/chat/, /ollama/v1/chat/completions and /openai/chat/completions), and all of them returned error code 400 (sometimes 405, 402 and even 422 during my tests). Probably TLM just doesn't support this natively for some reason?
I'm not that deep into AI things, so I'm asking other users or the dev to check it out and confirm or refute the issue (it's very likely I'm just being dumb and have set something up wrong).
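For reference, this is a minimal standalone check of the same endpoint outside of Minecraft (a sketch, not part of the mod; it assumes Python with the `requests` library and reuses the URL, key, and model from the config above, since Open WebUI's /api/chat/completions takes an OpenAI-style payload with a Bearer key):

# Standalone check of the same Open WebUI endpoint that TLM is configured
# to call. Assumes Python with the `requests` library; URL, key, and model
# are the ones from llm.json above.
import requests

URL = "http://127.0.0.1:8080/api/chat/completions"
API_KEY = "<redacted for privacy>"  # same secret_key as in llm.json

resp = requests.post(
    URL,
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={
        "model": "qwen3:8b",
        "messages": [{"role": "user", "content": "Hello?"}],
    },
    timeout=60,
)

print(resp.status_code)
print(resp.text)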

Relevant log output

[ForkJoinPool.commonPool-worker-23/ERROR] [touhou_little_maid/]: LLM request failed: http://127.0.0.1:8080/api/chat/completions POST, error is HTTP Error Code: 400, Response (POST http://127.0.0.1:8080/api/chat/completions) 400
[Render thread/INFO] [minecraft/ChatComponent]: [System] [CHAT] Error in LLM Request Received: HTTP Error Code: 400, Response (POST http://127.0.0.1:8080/api/chat/completions) 400

Contact Details

No response

commented

This is not a bug. Your config is wrong; you need to install Ollama or LM Studio.
Use my config file:
https://gist.github.com/AkarinLiu/2f88a35d9f05f207546635f31cb1e67c

commented

Your config is wrong

What is wrong exactly? I don't see any difference in the JSON fields, and I should be able to use any chat-completions endpoint as long as it supports the OpenAI-compatible protocol, I think.


you need to install Ollama or LM Studio. Use my config file.

I'm using Open WebUI, as I said; it's just an API provider and manager, so I can connect through it using its endpoints. Ollama is installed, as I also said, otherwise I wouldn't be running local LLMs at all, and it works just fine from the WebUI as well as through the API (both tested and confirmed working).
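For example, a direct request to Ollama's own OpenAI-compatible endpoint (something along these lines; assuming the default port 11434 and Python with the `requests` library) answers normally:

# Quick check against Ollama's built-in OpenAI-compatible endpoint,
# bypassing Open WebUI entirely. Assumes Ollama runs on its default
# port 11434 and that the qwen3:8b model has been pulled; no API key
# is needed by default.
import requests

resp = requests.post(
    "http://127.0.0.1:11434/v1/chat/completions",
    json={
        "model": "qwen3:8b",
        "messages": [{"role": "user", "content": "ping"}],
    },
    timeout=60,
)

print(resp.status_code)  # expect 200
print(resp.text)         # OpenAI-style JSON with choices[0].message.content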