LiteLLM Proxy #637
Currently I'm using Aider with LiteLLM as a proxy, configured as described in the documentation: https://docs.litellm.ai/docs/providers/litellm_proxy
AiderDesk has direct support for LiteLLM - you should be able to set it up via the Model Library. However, I don't use this provider myself, so I'm not sure whether that setup is actually supported. The integration was submitted by @jutaz, so perhaps he can answer your question.


Hm, I see, so this appears to be a provider distinct from LiteLLM itself. Can you try setting it up as an OpenAI Compatible provider to see if that works?
Change the Base URL and set the API key, if needed.
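For reference, here is a minimal sketch of the values involved, assuming a LiteLLM proxy running locally on its default port 4000 with a proxy key (both the address and the key are assumptions; substitute your own deployment's values). The same Base URL / API key pair would go into the OpenAI Compatible provider fields:

```shell
# Hypothetical values for a local LiteLLM proxy; adjust to your deployment.
# The LiteLLM proxy exposes an OpenAI-compatible API, so any OpenAI-compatible
# client can point at it via these two settings.
export OPENAI_API_BASE=http://localhost:4000   # Base URL of the proxy
export OPENAI_API_KEY=sk-1234                  # key the proxy was started with, if any
```

If the proxy was started without authentication, the API key can typically be any non-empty placeholder string.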