Neural Scaling Laws in Robotics

Sebastian Sartor, Neil Thompson

May 22, 2024
Neural scaling laws have driven much of the recent progress in machine learning, yet their role in robotics remains underexplored. Through a meta-analysis of 327 papers, we measure how data, model size, and compute affect performance for robot foundation models (RFMs) and for LLMs applied to robotic tasks. Performance improves with resources following a power law, scaling faster than in language tasks, and new robot capabilities emerge as scale increases, suggesting notable gains from further increases in data and compute.
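
For intuition, the power-law relationship referenced above can be estimated with a straight-line fit in log-log space. The following is a minimal sketch with hypothetical placeholder numbers; it is not the paper's code or data, and the variable names (compute, success) are illustrative assumptions.

```python
# Illustrative sketch (not the paper's method): fit a power law
#   success ≈ a * compute^b
# by linear regression in log-log space.
import numpy as np

compute = np.array([1e18, 1e19, 1e20, 1e21, 1e22])  # training FLOPs (hypothetical)
success = np.array([0.12, 0.21, 0.34, 0.52, 0.71])  # task success rate (hypothetical)

# log(success) = log(a) + b * log(compute)  =>  a line in log-log coordinates
b, log_a = np.polyfit(np.log(compute), np.log(success), deg=1)
a = np.exp(log_a)

print(f"fitted power law: success ≈ {a:.3g} * compute^{b:.3f}")
```

The fitted exponent b plays the role of the scaling exponent; a steeper slope in log-log space corresponds to faster improvement per unit of additional compute.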