Mirror of https://github.com/NousResearch/atropos.git (synced 2026-04-19 12:57:58 +00:00)
# Physical Environment

This project is a physical environment for training LLMs to generate STL files, the same file format used in physical CAD designs.
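For context on the target format, an ASCII STL file is simply a list of triangular facets, each with a normal and three vertices. The sketch below builds a one-triangle STL string; it is illustrative only and not part of the repository's code:

```python
# Illustrative sketch: a minimal ASCII STL with a single triangle.
# Not part of the repository; real models come from the dataset scripts.

def triangle_stl(name="tri"):
    # An ASCII STL is a "solid" block of facets; each facet has a
    # normal vector and an outer loop of exactly three vertices.
    return "\n".join([
        f"solid {name}",
        "  facet normal 0 0 1",
        "    outer loop",
        "      vertex 0 0 0",
        "      vertex 1 0 0",
        "      vertex 0 1 0",
        "    endloop",
        "  endfacet",
        f"endsolid {name}",
    ])

print(triangle_stl())
```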
## Setup

Install the Python dependencies:

```bash
$ pip install pyrender trimesh pyglet matplotlib torch transformers pydantic vllm numpy requests tenacity wandb
```

Install the shared libraries required for GL rendering on Ubuntu:

```bash
$ sudo apt-get install libglfw3-dev libgles2-mesa-dev libnvidia-gl-570-server
```
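On a headless server, pyrender's offscreen renderer needs a GL backend selected before pyrender (and PyOpenGL) is imported. A common approach, assuming EGL is provided by the libraries above, is to set the `PYOPENGL_PLATFORM` environment variable first:

```python
import os

# Select the EGL backend for offscreen GL rendering. This must happen
# before pyrender/PyOpenGL is imported; "osmesa" is a software-only
# fallback if no GPU EGL driver is available.
os.environ["PYOPENGL_PLATFORM"] = "egl"

# import pyrender  # safe to import once the platform is selected
```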
## Training Data Generation

- Use `dataset_scr.py` to create the following directory structure:

  ```
  dataset/
  ├── stls/
  │   ├── model_0001.stl
  │   ├── model_0002.stl
  │   └── ...
  ├── images/
  │   ├── model_0001.png
  │   ├── model_0002.png
  │   └── ...
  └── labels.json
  ```

- Use `render_stl.py` to generate images from the STL files.
- Use `llm_label.py` to label the STL and image files.
- Use `prepare_push_hf_dataset.py` to push the dataset to Hugging Face.
- Generated run: https://wandb.ai/csxl/atropos-environments_hack0/runs/dlexyg5r
- Training run (ran out of memory): https://wandb.ai/csxl/grpo-physical-trainer/runs/t61am7gu