EmbodiedAgents 🤖

EmbodiedAgents is a production-grade framework, built on top of ROS2, for deploying Physical AI on real-world robots. It enables you to create interactive physical agents that do not just chat, but understand, move, manipulate, and adapt to their environment.

  • Production-Ready Physical Agents: Designed for autonomous robot systems that operate in dynamic real-world environments. EmbodiedAgents makes it simple to build systems that use Physical AI, providing an orchestration layer for Adaptive Intelligence.

  • Self-Referential and Event-Driven: An agent created with EmbodiedAgents can start, stop, or reconfigure its own components in response to internal and external events. For example, an agent can switch its planning model based on its location on the map or on input from a vision model. EmbodiedAgents makes it simple to create agents that are self-referential Gödel machines.

  • Semantic Memory: Integrates vector databases, semantic routing, and other supporting components to quickly build arbitrarily complex graphs for agentic information flow. No need to run bloated “GenAI” frameworks on your robot.

  • Pure Python, Native ROS2: Define complex asynchronous graphs in standard Python without touching XML launch files. Yet underneath it is pure ROS2, compatible with the entire ecosystem of hardware drivers, simulation tools, and visualization suites.
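The event-driven reconfiguration described above can be sketched in plain Python. This is an illustrative toy, not the EmbodiedAgents API: the `Agent` class, `on_event`, and `emit` names, along with the model names, are all made up for this example. It only shows the pattern of an agent swapping its own planning model when a location event fires.

```python
from dataclasses import dataclass, field
from typing import Callable

# Illustrative sketch only: Agent, on_event, and emit are hypothetical names,
# not part of the EmbodiedAgents API.

@dataclass
class Agent:
    """A minimal event-driven agent that swaps its planning model at runtime."""
    planning_model: str = "general-planner"
    _handlers: dict = field(default_factory=dict)

    def on_event(self, name: str, handler: Callable[["Agent", dict], None]) -> None:
        # Register a callback to fire when the named event is emitted.
        self._handlers.setdefault(name, []).append(handler)

    def emit(self, name: str, payload: dict) -> None:
        # Dispatch an internal or external event to all registered handlers.
        for handler in self._handlers.get(name, []):
            handler(self, payload)


def switch_planner_by_zone(agent: Agent, payload: dict) -> None:
    # The agent reconfigures its own planning model based on map location.
    zone = payload.get("zone")
    agent.planning_model = "indoor-planner" if zone == "warehouse" else "general-planner"


agent = Agent()
agent.on_event("location_changed", switch_planner_by_zone)
agent.emit("location_changed", {"zone": "warehouse"})
print(agent.planning_model)  # -> indoor-planner
```

In the real framework the events would come from ROS2 topics and the "model" would be an ML component, but the control flow is the same: handlers observe events and mutate the agent's own configuration.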
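Semantic routing, mentioned in the Semantic Memory bullet, can likewise be sketched without any framework: route an embedded query to whichever component's prototype vector is most similar. The 3-d vectors and route names below are invented for illustration; a real deployment would use a vector database and learned embeddings.

```python
import math

# Illustrative toy router: prototype embeddings per component (made up).
ROUTES = {
    "navigation": [0.9, 0.1, 0.0],    # e.g. "go to", "move", "drive"
    "manipulation": [0.1, 0.9, 0.0],  # e.g. "pick up", "grasp"
    "chat": [0.0, 0.1, 0.9],          # small talk, general questions
}

def cosine(a, b):
    # Cosine similarity between two vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def route(query_embedding):
    # Send the query to the component whose prototype is closest.
    return max(ROUTES, key=lambda name: cosine(query_embedding, ROUTES[name]))

print(route([0.8, 0.2, 0.1]))  # -> navigation
```

Swapping the dictionary for a vector-database lookup turns this into the kind of information-flow graph node the framework composes.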

Check out the Installation Instructions 🛠️

Get started with the Quickstart Guide 🚀

Get familiar with Basic Concepts 📚

Dive right in with the Example Recipes

Contributions

EmbodiedAgents has been developed in collaboration between Automatika Robotics and Inria. Contributions from the community are most welcome.

Table of Contents

Complete Reference