Elia + LocalAI + Kubernetes for chatting to a self-hosted LLM #26
richiejp started this conversation in Show and tell
Hey, thanks for this. Sorry I didn't reply earlier; it slipped by me. I just recently released 1.0, which greatly changes how things work. I'm not sure whether that will make things harder or easier for you. Elia does now support local LLMs, but it offers that support via …
Hi @darrenburns, great project! I have been looking for ways to quickly test LLMs I have deployed on Kubernetes, and I was told about Elia on the LocalAI Discord. I have created an Elia container and a K8s config which automatically configures Elia to connect to a model deployed on LocalAI (OpenAI API compatible). This is a bit hacky because Elia doesn't allow arbitrary models to be specified, but it works.
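For anyone who wants to smoke-test the OpenAI-compatible endpoint that LocalAI exposes before wiring Elia up to it, here is a minimal sketch using only the Python standard library. The base URL, port, and model name are placeholders for whatever your LocalAI deployment actually serves:

```python
import json
import urllib.request


def build_chat_request(base_url: str, model: str, prompt: str) -> urllib.request.Request:
    """Build a chat-completion request for an OpenAI-compatible API
    such as the one LocalAI exposes."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        url=f"{base_url}/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )


# Hypothetical in-cluster service name and model; adjust to your deployment.
req = build_chat_request("http://localai:8080", "gpt-3.5-turbo", "Hello")
# Send it with: urllib.request.urlopen(req)
```

If the response comes back with a `choices` array, the endpoint speaks the OpenAI chat-completion dialect and a client like Elia should be able to talk to it.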

https://github.com/premAI-io/prem-operator/blob/main/docs/guides/elia.md
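For anyone who wants the general shape of the setup without reading the full guide, here is a minimal sketch of a Deployment that points an OpenAI-based client like Elia at an in-cluster LocalAI Service via environment variables. The service name, image, and the assumption that the client honours `OPENAI_API_BASE` are all placeholders; the manifests in the linked guide are the real configuration.

```yaml
# Hypothetical sketch: wire Elia to a LocalAI Service named "localai"
# that exposes the OpenAI-compatible API on port 8080.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: elia
spec:
  replicas: 1
  selector:
    matchLabels:
      app: elia
  template:
    metadata:
      labels:
        app: elia
    spec:
      containers:
        - name: elia
          image: ghcr.io/example/elia:latest  # placeholder image
          env:
            - name: OPENAI_API_BASE  # assumption: the OpenAI client honours this override
              value: http://localai:8080/v1
            - name: OPENAI_API_KEY   # typically ignored by LocalAI unless auth is configured
              value: sk-dummy
```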