MCA Reborn [Fabric/Forge]

7M Downloads

Unable to build?

blenderman94 opened this issue · 11 comments

commented

Greetings, I'm trying to tinker with this mod on my PC to make some changes / be able to use my own large language models with this mod for some more "private" content. I tried to build the mod with gradlew.bat; it didn't throw any compile errors, but the end result of the compilation is seemingly nowhere to be found. Before it starts to compile, it complains about git, but git is installed. If for some reason it's not possible to give that control to users, could you just make it work with koboldcpp? It's already OpenAI-compatible, so it would be no big deal, I guess.

commented

Problem seemingly solved by reinstalling Windows 11.

commented

Koboldcpp has an OpenAI endpoint and thus works out of the box; I assume it is http://localhost:5001/v1/chat/completions by default.

https://github.com/Luke100000/minecraft-comes-alive/wiki/GPT3-based-conversations#custom-endpoints
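To check that the endpoint responds independently of the mod, a minimal OpenAI-style chat request can be sent from Python. This is a sketch only: the port 5001 default and the payload fields are assumptions based on the endpoint mentioned above, and the `model` name is a placeholder (koboldcpp typically ignores it).

```python
import json
import urllib.request

# Assumed default koboldcpp OpenAI-compatible endpoint.
ENDPOINT = "http://localhost:5001/v1/chat/completions"

def build_chat_request(message, model="koboldcpp"):
    """Build a minimal OpenAI-style chat completion payload."""
    return {
        "model": model,  # placeholder; local servers often ignore this
        "messages": [{"role": "user", "content": message}],
        "max_tokens": 64,
    }

def send_chat(message, endpoint=ENDPOINT):
    """POST the payload and return the assistant's reply text."""
    req = urllib.request.Request(
        endpoint,
        data=json.dumps(build_chat_request(message)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=10) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]

if __name__ == "__main__":
    # Requires a running koboldcpp instance on the assumed port.
    print(send_chat("Hello, villager!"))
```

If this script gets a reply but the mod does not, the problem is likely on the mod/game side rather than the server.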

commented

It works halfway, but it gives back an error (translated from German):

> Meanings
> Note: If you want to achieve the integration of GitHub Actions with the notification in Slack, you first have to request a Slack app API token for your project and create an account in your Slack account that authorizes you to send to your Slack channel.
>
>   1. Requesting a Slack app-
commented

The message goes in for completion, but the game shows an error in-game as well; in-game it says "unknown error, check log". I somehow made it work on the server side; it generates a response, but it is not coming back to the game.

commented

bug

commented

Hmm, just tested it, works for me:
/mca chatAI model "http://localhost:5001/v1/chat/completions"

Tested with koboldcpp 1.74, default settings, llama 8B model.

Maybe share the latest.log

commented

I see, I'm on 1.77. I guess I'll downgrade to 1.74 and give it a shot.

commented

I don't get it. I just did what you said, same version, same model, and the message is still not coming back in-game. Did you have to change any firewall settings or whatever to get this to work? Actually, are you using the Python version of kobold or the cpp one? If cpp, then which build? I'm using the old CPU version. Maybe I'll try the oobabooga webui, that might save me.

commented

I suspect something stupid is going on. No matter which program I use, it produces the same problem. The firewall is set to allow kobold and Java through. I'm on 1.20.1, maybe that's the problem?
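One way to rule out firewall or connectivity problems is a plain TCP check against the server's port, independent of any HTTP client. A minimal sketch, assuming koboldcpp listens on port 5001:

```python
import socket

def port_open(host="localhost", port=5001, timeout=2.0):
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    # Assumed koboldcpp default port; change if yours differs.
    print("reachable" if port_open() else "blocked or not listening")
```

If this returns False while the server is running, something between Java and the server (firewall, wrong port, server bound to a different interface) is the likely culprit.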

commented

The latest log would tell us more

commented

very odd, but glad it works :)