Now that an AI connection is built into Rapid PHP 2025, when can we expect to be able to connect it to our own local LLM models?
Preferably via koboldcpp or oobabooga.
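(For context, both koboldcpp and oobabooga's text-generation-webui can expose an OpenAI-compatible HTTP endpoint on the local machine, so the kind of connection I mean would look roughly like the plain-PHP sketch below. The port 5001 and the model name are only assumptions for illustration, not Rapid PHP's actual integration.)

<?php
// Minimal sketch: calling a local OpenAI-compatible chat endpoint such as
// the ones koboldcpp or oobabooga can serve. Assumes the curl extension.
// The URL/port (5001) and model name are placeholders; adjust to your setup.
$payload = json_encode([
    'model'      => 'local-model', // local servers often ignore this field
    'messages'   => [
        ['role' => 'user', 'content' => 'Explain PHP traits in one sentence.'],
    ],
    'max_tokens' => 200,
]);

$ch = curl_init('http://localhost:5001/v1/chat/completions');
curl_setopt_array($ch, [
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_POST           => true,
    CURLOPT_HTTPHEADER     => ['Content-Type: application/json'],
    CURLOPT_POSTFIELDS     => $payload,
]);

$response = curl_exec($ch);
if ($response === false) {
    die('Request failed: ' . curl_error($ch));
}
curl_close($ch);

$data = json_decode($response, true);
echo $data['choices'][0]['message']['content'] ?? 'No response';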
Statistics: Posted by Hans Meiser — Fri May 03, 2024 4:22 pm