add managed vllm server

This commit is contained in:
Dakota 2025-11-07 13:06:49 -06:00
parent 578175a709
commit e6ac3abdcb
9 changed files with 597 additions and 15 deletions

@@ -181,6 +181,7 @@ These class-level variables in `BaseEnv` can be overridden in your subclass to c
* **`server_cls: Type[APIServer]`**:
* Default: `APIServer`
* Purpose: Specifies the class used to manage interactions with API servers (e.g., inference endpoints). This is intended mainly for developing additional API interfaces, but if you need a nonstandard way of connecting to an existing API, you can override it to slot in whatever modifications you need.
* **Note:** In most cases, you should use the `server_type` field in your `APIServerConfig` instead of overriding this. Set `server_type` to `"openai"` (default), `"vllm"`, `"sglang"`, or `"trl"` to automatically use the appropriate server class with enhanced features like native API access and full token/logprob tracking.
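The `server_type`-based selection described above can be sketched as follows. This is a simplified illustration, not the library's actual implementation: the concrete server classes (`OpenAIServer`, `VLLMServer`, etc.) and the lookup table are hypothetical stand-ins; only `APIServer`, `APIServerConfig`, and the `server_type` values come from the documentation.

```python
from dataclasses import dataclass

# Stand-in server classes for illustration; real class names may differ.
class APIServer: ...
class OpenAIServer(APIServer): ...
class VLLMServer(APIServer): ...
class SGLangServer(APIServer): ...
class TRLServer(APIServer): ...

@dataclass
class APIServerConfig:
    # server_type selects the server class; "openai" is the default.
    server_type: str = "openai"

# Hypothetical mapping from server_type values to server classes.
SERVER_CLASSES: dict[str, type[APIServer]] = {
    "openai": OpenAIServer,
    "vllm": VLLMServer,
    "sglang": SGLangServer,
    "trl": TRLServer,
}

def resolve_server_cls(config: APIServerConfig) -> type[APIServer]:
    """Pick the server class from config.server_type."""
    return SERVER_CLASSES[config.server_type]
```

In this scheme, setting `server_type="vllm"` in your `APIServerConfig` selects the vLLM-backed server without any need to override `server_cls` on the environment.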
## Provided Functionality