atropos/atroposlib
teknium (commit e1ece3e64e): Add reasoning configuration support across server implementations
- Updated server classes (OpenAIServer, SGLangServer, TrlVllmServer, VLLMServer) to accept a ReasoningConfig parameter during initialization.
- Enhanced ReasoningConfig to allow flexible max_tokens without strict validation, accommodating varying provider limits.
- Implemented reasoning configuration injection in APIServer methods for chat and completion handling.
- Updated tests to reflect changes in max_tokens validation logic.

This commit integrates reasoning capabilities into the server handling architecture, improving compatibility with diverse reasoning models.
2026-01-05 23:20:01 +00:00
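The commit above describes wiring a `ReasoningConfig` through the server classes and injecting it into outgoing requests. A minimal sketch of that pattern is below; the field names, the `to_request_params` helper, and the simplified `OpenAIServer` signature are all assumptions for illustration, not atroposlib's actual API.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class ReasoningConfig:
    # Hypothetical fields; real atroposlib config may differ.
    effort: Optional[str] = None      # e.g. "low" / "medium" / "high"
    max_tokens: Optional[int] = None  # deliberately unvalidated: provider limits vary

    def to_request_params(self) -> dict:
        """Collect reasoning fields to inject into a chat/completion request."""
        params = {}
        if self.effort is not None:
            params["reasoning_effort"] = self.effort
        if self.max_tokens is not None:
            params["max_reasoning_tokens"] = self.max_tokens
        return params


class OpenAIServer:
    """Sketch of a server accepting a ReasoningConfig at initialization."""

    def __init__(self, reasoning_config: Optional[ReasoningConfig] = None):
        self.reasoning_config = reasoning_config

    def build_request(self, **kwargs) -> dict:
        # Inject reasoning parameters, if configured, into the request payload.
        if self.reasoning_config is not None:
            kwargs.update(self.reasoning_config.to_request_params())
        return kwargs
```

Leaving `max_tokens` unvalidated (rather than enforcing a fixed cap) matches the commit's note that different providers impose different limits.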
| Name | Last commit | Date |
| --- | --- | --- |
| api | [pre-commit.ci] auto fixes from pre-commit.com hooks | 2025-11-07 18:10:40 +00:00 |
| cli | fix error in function inference_node_wandb_watcher.py | 2025-06-27 22:13:37 +02:00 |
| envs | Add reasoning configuration support across server implementations | 2026-01-05 23:20:01 +00:00 |
| frontend | --slurm and --testing in outer namespace | 2025-05-02 03:46:34 -07:00 |
| tests | Add reasoning configuration support across server implementations | 2026-01-05 23:20:01 +00:00 |
| utils | Update metrics.py | 2025-10-23 10:26:21 +02:00 |
| `__init__.py` | first commit | 2025-04-29 12:10:10 -07:00 |
| `FAQ.md` | linting | 2025-05-16 20:40:15 -07:00 |
| `type_definitions.py` | Update type_definitions.py | 2025-10-23 10:26:51 +02:00 |