AI/ML News

Stay updated with the latest news and articles on artificial intelligence and machine learning

When machines start building their own minds

Computer scientist Peter Burke has demonstrated that artificial intelligence can autonomously generate the control systems – “brains” – of other robots, completing complex coding tasks significantly faster than traditional human teams. The project leverages advanced generative AI models, including ChatGPT, Gemini, and Claude, to create a fully functional drone control system that operates entirely onboard the drone.

Burke, a professor of electrical engineering and computer science at the University of California, Irvine, structured the project around two types of “robots.” The first is the AI software running on laptops and in the cloud, responsible for writing code. The second is the drone itself, which uses a Raspberry Pi Zero 2 W to host and run the AI-generated software in real time.

Traditional drone systems rely on ground control software such as Mission Planner or QGroundControl to manage flight. Burke’s approach replaces the ground-based control station with a web-hosted system called WebGCS (web ground control station), which runs directly on the drone. This allows pilots to access a live control dashboard via a standard web browser, providing real-time telemetry, mission planning, and autonomous navigation.
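The core idea, hosting the ground control station on the drone itself and exposing it over HTTP, can be sketched in a few lines. The snippet below is a minimal illustration, not Burke's actual WebGCS code: the `read_telemetry` function is a hypothetical stand-in for a real flight-controller link (WebGCS would read live data from the autopilot), and the endpoint name is invented for this example.

```python
# Minimal sketch of an onboard web GCS: a companion computer (a Raspberry Pi
# Zero 2 W in Burke's setup) serves telemetry that any browser on the network
# can poll. The flight-controller link is mocked out with fixed values.
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen


def read_telemetry():
    # Hypothetical placeholder for a real autopilot read; returns
    # fixed illustrative values instead of live flight data.
    return {"lat": 33.6405, "lon": -117.8443, "alt_m": 42.0, "battery_pct": 87}


class TelemetryHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/telemetry":
            body = json.dumps(read_telemetry()).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):
        # Silence per-request console logging for this demo.
        pass


def serve_once(port=0):
    """Start the server, fetch /telemetry once, then shut down."""
    server = HTTPServer(("127.0.0.1", port), TelemetryHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    url = f"http://127.0.0.1:{server.server_port}/telemetry"
    with urlopen(url) as resp:
        data = json.loads(resp.read())
    server.shutdown()
    return data


if __name__ == "__main__":
    print(serve_once())
```

In the real system, a browser-side dashboard would poll an endpoint like this for live telemetry and post mission commands back, removing the need for separate ground-station software.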

The development process was organized into four intensive sprints. The first sprint used Claude in a browser to generate the initial codebase, but the model's memory limits stalled the project before the code could be completed. Subsequent attempts with Gemini 2.5 and the Cursor IDE improved functionality but ran into errors, including Bash shell scripting mistakes and context limitations when working across multiple files.

The fourth and final sprint, using Windsurf IDE, allowed the AI to successfully produce the WebGCS system. Over 2.5 weeks and approximately 100 hours of human labor, the AI generated 10,000 lines of code, including Python, HTML, JavaScript, and Bash scripts. This is roughly 20 times faster than Burke’s previous human-led project, Cloudstation, which required four years of cumulative work by a team of students.

The project highlighted current limitations in AI coding: while models can effectively handle codebases up to around 10,000 lines, performance degrades sharply for larger systems. Studies of long-context code generation likewise find that accuracy in both generation and debugging drops once a task exceeds a model's token limits.
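The 10,000-line ceiling can be made concrete with back-of-the-envelope arithmetic. The tokens-per-line rate and the window sizes below are rough assumptions for illustration, not measured values from the project.

```python
# Rough heuristic (an assumption, not a measured rate): ~10 tokens per line
# of mixed Python/HTML/JS/Bash code.
def fits_in_context(lines_of_code, context_window_tokens, tokens_per_line=10):
    """Estimate whether an entire codebase fits in a model's context window."""
    return lines_of_code * tokens_per_line <= context_window_tokens


# By this heuristic, a ~10,000-line codebase (~100k tokens) just fits in an
# illustrative 128k-token window but far exceeds a 32k-token one, which is
# consistent with models struggling as projects grow past that scale.
print(fits_in_context(10_000, 128_000))  # True
print(fits_in_context(10_000, 32_000))   # False
```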

The implications of this work extend beyond drones. By demonstrating that AI can autonomously create complex, multi-language software systems, Burke’s project points toward a future where machines can design and manage other machines. While the current system remains limited to single drones, the research suggests the potential for AI-controlled swarms, autonomous spatial intelligence applications, and large-scale automated control systems.

Technologies like these could radically transform the field of aerial robotics, making autonomous navigation, planning, and decision-making more accessible. However, questions about reliability, testing in unpredictable environments, and safety oversight remain central challenges for the future of AI-driven robotics.