fix imports and style issues

This commit is contained in:
Allan Niemerg 2025-05-27 11:00:35 -05:00
parent 7a653044a4
commit 013090579d
4 changed files with 176 additions and 146 deletions


@@ -78,7 +78,8 @@ python -m environments.smolagents_integration.smolagents_env process \
 --env.max_concurrent_processes 8 \
 --env.use_chat_completion true \
 --openai.model_name "gpt-4o" \
---openai.base_url "https://api.openai.com/v1"
+--openai.base_url "https://api.openai.com/v1" \
+--openai.api_key x
 ```
 ```bash
@@ -91,7 +92,8 @@ python -m environments.smolagents_integration.smolagents_env process \
 --env.max_concurrent_processes 8 \
 --env.use_chat_completion true \
 --openai.model_name "gpt-4o" \
---openai.base_url "https://api.openai.com/v1"
+--openai.base_url "https://api.openai.com/v1" \
+--openai.api_key x
 ```
 Note: The command syntax uses dots (`.`) to separate namespaces. Also, the OpenAI API key should be set in your environment variables as `OPENAI_API_KEY` or in a `.env` file in the project root.
@@ -118,8 +120,8 @@ python -m environments.smolagents_integration.smolagents_env serve \
 --env.use_chat_completion true \
 --env.max_concurrent_processes 5 \
 --env.group_size 8 \
---openai.model_name "gpt-4o" \
---openai.base_url "https://api.openai.com/v1"
+--openai.model_name "your-model-name" \
+--openai.base_url "http://localhost:8000/v1"
 ```
 ## How It Works
@@ -204,4 +206,4 @@ Each task includes:
 - **Web tool errors**: If Tavily tools aren't working, make sure you have set the `TAVILY_API_KEY` environment variable and have installed the `tavily-python` package.
 - **Tool import errors**: If you see errors about missing tool modules, ensure your working directory allows proper imports of the tools folder.
 - **Permission errors with file tools**: Ensure your process has the correct permissions to read/write files in the directories being accessed.
--**Memory issues**: If you encounter memory usage problems, try lowering the `max_concurrent_processes` parameter.
+-**Memory issues**: If you encounter memory usage problems, try lowering the `max_concurrent_processes` parameter.
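
Taken together, the hunks above update the documented commands: the two `process` examples gain an explicit `--openai.api_key x` flag (the `base_url` line now ends in `\` because another flag follows it), and the `serve` example switches from `gpt-4o` on the OpenAI API to a placeholder model on a local endpoint. A minimal sketch of the updated `process` invocation, assuming the real key is supplied via `OPENAI_API_KEY` as the README's note describes (the literal `x` passed to `--openai.api_key` is the placeholder used in the diff, not a working key):

```shell
# Real credentials go in the environment (or a .env file in the project root).
export OPENAI_API_KEY="<your-key>"

# Updated invocation from the diff; note the dot-separated namespaces
# (--env.*, --openai.*) and the trailing backslash on every continued line.
python -m environments.smolagents_integration.smolagents_env process \
  --env.max_concurrent_processes 8 \
  --env.use_chat_completion true \
  --openai.model_name "gpt-4o" \
  --openai.base_url "https://api.openai.com/v1" \
  --openai.api_key x
```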