Language Comparison
| Language | Typical Latency | Key Libraries | Best For | Limitation |
|---|---|---|---|---|
| Python | 1-10 ms (non-deterministic) | NumPy, OpenCV, PyTorch, rclpy | Rapid prototyping, ML/AI, data processing, research | GIL prevents true parallelism; not real-time safe |
| C++ | <1 ms (deterministic with RT kernel) | Eigen, rclcpp, MoveIt2, Drake | Real-time control loops, drivers, production systems | Slow iteration; complex build systems; memory management |
| Rust | <1 ms (deterministic) | ros2_rust, nalgebra | Safety-critical systems, memory-safe drivers | Small robotics ecosystem; steep learning curve |
| MATLAB / Simulink | N/A (offline / codegen) | Robotics System Toolbox, ROS Toolbox | Control system design, simulation, academic research | Expensive license ($2K+/yr); rarely used in production deployments |
| Proprietary (KRL, RAPID, Karel) | <1 ms (built into controller) | Vendor-specific | Industrial robot programming on OEM controllers | No portability; vendor lock-in; limited ecosystem |
ROS2 Architecture for Robot Control
ROS2 (Robot Operating System 2) is not a language -- it is a middleware framework that provides inter-process communication (via DDS), package management, and a standard architecture for building robot software. Key concepts:
- Nodes: Independent processes (one per sensor, controller, planner). Communicate via topics (pub/sub), services (request/response), and actions (long-running tasks).
- Topics: Asynchronous message passing. /joint_states (sensor_msgs/JointState) publishes joint positions at 100-1000 Hz; /camera/image_raw (sensor_msgs/Image) publishes frames at 30-60 Hz.
- Control loop hierarchy: Servo-level (1-10 kHz, runs in motor controller firmware) → Joint-level (100-1000 Hz, ros2_control) → Task-level (10-30 Hz, MoveIt2/policy inference)
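To make the topic message anatomy concrete, here is a plain-Python sketch of the fields a sensor_msgs/JointState message carries. This is a stand-in dataclass for illustration, not the actual rclpy-generated message class (which also carries a std_msgs/Header with a timestamp):

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class JointState:
    """Plain-Python stand-in for sensor_msgs/JointState (illustration only)."""
    name: List[str] = field(default_factory=list)        # joint names
    position: List[float] = field(default_factory=list)  # radians
    velocity: List[float] = field(default_factory=list)  # rad/s
    effort: List[float] = field(default_factory=list)    # N·m

# A driver publishing at 100 Hz would fill one of these per cycle:
msg = JointState(
    name=['joint1', 'joint2'],
    position=[0.0, -1.57],
    velocity=[0.0, 0.0],
    effort=[0.1, 2.3],
)
```

The parallel lists are index-aligned: position[i], velocity[i], and effort[i] all describe the joint named name[i].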
The practical recommendation: write Python nodes for perception, planning, and high-level logic. Write C++ nodes for control loops that must meet timing guarantees. This is the pattern used by all major ROS2 robot drivers.
Real-Time Control Requirements
Different robot components have different timing requirements:
- Servo control loop: <1 ms (1-10 kHz). Runs in motor controller firmware (not in your code). Handles PID position/velocity/torque control.
- Joint control loop: 1-10 ms (100-1000 Hz). Sends target positions/velocities/torques to motors. Must be deterministic. Use ros2_control with a real-time kernel (PREEMPT_RT patch for Linux).
- Motion planning: 10-100 ms per plan. MoveIt2 generates collision-free trajectories. Not real-time but should be fast enough for reactive replanning.
- Policy inference: 10-50 ms per step (20-100 Hz). Neural network policies (imitation learning, RL) running on GPU. Python is fine here -- the GPU handles the heavy compute.
- Vision processing: 16-33 ms per frame (30-60 Hz). Detection, segmentation, pose estimation. GPU-accelerated, Python with PyTorch/ONNX.
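The "not real-time safe" caveat for Python is easy to observe directly: even a plain sleep-based loop shows millisecond-scale timing jitter on a stock (non-RT) kernel. A minimal, ROS-free sketch of how you might measure it:

```python
import time

def measure_jitter(period_s=0.01, cycles=100):
    """Run a fixed-period loop and return the worst-case deviation (s)
    between each intended deadline and when the loop actually woke up."""
    deviations = []
    next_deadline = time.perf_counter() + period_s
    for _ in range(cycles):
        # Sleep until the next deadline (a real control loop would compute here)
        remaining = next_deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)
        deviations.append(abs(time.perf_counter() - next_deadline))
        next_deadline += period_s  # absolute deadlines avoid drift accumulation
    return max(deviations)

worst = measure_jitter()
print(f"worst-case jitter: {worst * 1e3:.3f} ms")
```

Actual numbers depend heavily on kernel, load, and scheduler settings, but on a typical desktop this wakes up late by anywhere from tenths of a millisecond to several milliseconds under load -- exactly the unbounded tail that disqualifies this pattern for the joint control loop, and why the table recommends C++ on a PREEMPT_RT kernel there.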
Sample Code: Basic Arm Control with ROS2 Python
```python
import rclpy
from rclpy.node import Node
from trajectory_msgs.msg import JointTrajectory, JointTrajectoryPoint
from builtin_interfaces.msg import Duration


class ArmController(Node):
    def __init__(self):
        super().__init__('arm_controller')
        self.publisher = self.create_publisher(
            JointTrajectory,
            '/joint_trajectory_controller/joint_trajectory',
            10  # QoS history depth
        )
        self.joint_names = [
            'joint1', 'joint2', 'joint3',
            'joint4', 'joint5', 'joint6'
        ]

    def move_to_pose(self, positions, duration_sec=2.0):
        """Send arm to target joint positions."""
        msg = JointTrajectory()
        msg.joint_names = self.joint_names
        point = JointTrajectoryPoint()
        point.positions = positions  # radians
        point.time_from_start = Duration(
            sec=int(duration_sec),
            nanosec=int((duration_sec % 1) * 1e9)
        )
        msg.points = [point]
        self.publisher.publish(msg)
        self.get_logger().info(f'Moving to: {positions}')


def main():
    rclpy.init()
    controller = ArmController()
    # Home position (radians). In practice, wait for the controller's
    # subscriber to connect before publishing, or the message may be dropped.
    controller.move_to_pose([0.0, -1.57, 1.57, 0.0, 1.57, 0.0])
    rclpy.spin(controller)
    rclpy.shutdown()


if __name__ == '__main__':
    main()
```
This example works with any ROS2-compatible arm (OpenArm 101, UR, Kinova) that exposes a JointTrajectoryController. For the OpenArm 101, the joint names are documented in the URDF and MoveIt2 config provided with the arm.
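One detail from the example worth calling out: builtin_interfaces/Duration stores time as separate integer sec and nanosec fields, so a float duration must be decomposed the way move_to_pose does. The arithmetic in isolation (assuming non-negative durations):

```python
def to_sec_nanosec(duration_sec: float):
    """Split a float duration (s) into the (sec, nanosec) integer pair
    that builtin_interfaces/Duration expects."""
    sec = int(duration_sec)                      # whole seconds
    nanosec = int((duration_sec % 1) * 1e9)      # fractional part in ns
    return sec, nanosec

print(to_sec_nanosec(2.0))   # (2, 0)
print(to_sec_nanosec(2.5))   # (2, 500000000)
```

Note that floating-point rounding can shave a nanosecond off some fractional values; for trajectory timing at these scales that is harmless.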
Key Libraries and Frameworks
- MoveIt2: Motion planning framework. Collision-aware trajectory generation, inverse kinematics, grasp planning. Python and C++ APIs. The standard for arm manipulation in ROS2.
- Nav2: Navigation stack for mobile robots. Path planning, obstacle avoidance, localization (AMCL). Used with TurtleBot 4, Unitree Go2.
- Drake: MIT's robotics library for multibody dynamics, optimization-based control, and simulation. C++ with Python bindings. Excellent for contact-rich manipulation research.
- PyTorch / JAX: ML frameworks for training manipulation policies. PyTorch dominates in robotics research (used by RT-2, Octo, ACT). JAX is gaining traction for RL (brax, mujoco-mjx).
- OpenCV: Computer vision. Camera calibration, image processing, ArUco detection. Available in Python and C++.
- LeRobot (Hugging Face): Open-source framework for robot learning. Dataset management, policy training (ACT, Diffusion Policy), and deployment. Python-based.
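Several of these libraries share common underlying math. For instance, OpenCV's camera calibration routines are built around the pinhole projection model, which reduces to a matrix multiply and a perspective divide. A NumPy sketch with illustrative (not calibrated) intrinsics:

```python
import numpy as np

# Pinhole camera intrinsics: focal lengths fx, fy and principal point cx, cy.
# These are made-up illustrative values, not from a real calibration.
K = np.array([[600.0,   0.0, 320.0],
              [  0.0, 600.0, 240.0],
              [  0.0,   0.0,   1.0]])

def project(point_3d):
    """Project a 3D point (m, camera frame) to pixel coordinates (u, v)."""
    p = K @ point_3d
    return p[:2] / p[2]  # perspective divide by depth

print(project(np.array([0.1, 0.0, 1.0])))  # → [380. 240.]
```

A point 10 cm right of the optical axis at 1 m depth lands 60 px right of the principal point; the same math (plus lens distortion terms) is what cv2.calibrateCamera fits from checkerboard images.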
When to Use Each Language
- Use Python when: Prototyping new behaviors, writing ML training pipelines, processing sensor data, scripting experiments, building web dashboards. 90% of your robotics code should be Python.
- Use C++ when: Writing motor drivers, implementing control loops that must meet hard timing deadlines (<1 ms jitter), building production systems where performance matters, contributing to MoveIt2 or Nav2.
- Use Rust when: Building safety-critical components where memory safety is paramount, writing new motor drivers, or when your team has Rust expertise.
- Use proprietary languages when: You must program an industrial robot (KUKA, ABB, FANUC) on its native controller. Avoid if possible -- prefer ROS2 bridges.