The `openai/` prefix tells LiteLLM to use the OpenAI-compatible code path. The model name after the prefix (`orchid01`) is forwarded to the Orchid API unchanged.
LiteLLM Proxy config
If you're running a LiteLLM proxy server, add Orchid as an entry in your proxy config, with `model: openai/orchid01` in the model's `litellm_params`.
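A minimal sketch of such a `config.yaml` entry, assuming the standard LiteLLM proxy `model_list` layout; the `api_base` URL and the `ORCHID_API_KEY` environment variable name are placeholders, so substitute Orchid's actual values:

```yaml
model_list:
  - model_name: orchid01            # name clients send to the proxy
    litellm_params:
      model: openai/orchid01        # openai/ prefix selects the OpenAI-compatible path
      api_base: https://api.orchid.example/v1   # hypothetical Orchid endpoint
      api_key: os.environ/ORCHID_API_KEY        # hypothetical env var; os.environ/ reads it at startup
```

With this in place, clients call the proxy with `model: orchid01` and LiteLLM routes the request to the Orchid endpoint.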