News
CES 2026: AI Robots Use “Visual Taste” to Cook Perfectly
Your chef is now a camera
At CES 2026 in Las Vegas, the spotlight shifted from simple automated arms to “intelligent culinary agents” capable of tasting with their eyes. Multiple manufacturers, including Nosh Robotics, Starbot Inc., and Antamix, demonstrated a new generation of cooking bots that use real-time computer vision to adjust cooking parameters autonomously. Nosh Robotics unveiled a system that monitors the visual progress of food—such as the browning of onions—to dynamically control heat and timing, rather than following a fixed timer.
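None of the vendors have published their control stacks, but the core idea is simple to sketch. The hypothetical Python snippet below scores how “browned” a camera frame looks and converts that score into a burner duty cycle. The HSV color band, camera index, and proportional rule are illustrative assumptions, not any manufacturer’s actual API.

```python
# Minimal sketch of vision-driven heat control (illustrative only).
# Assumptions: a webcam at index 0, a hand-tuned HSV band for "golden
# brown," and a simple proportional rule standing in for a real controller.
import cv2
import numpy as np

def browning_score(frame_bgr: np.ndarray) -> float:
    """Fraction of pixels falling in an assumed golden-brown HSV band."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, (10, 80, 60), (25, 255, 220))  # assumed calibration
    return float(mask.mean()) / 255.0

def burner_duty(score: float, target: float = 0.55) -> float:
    """Proportional rule: ease off the heat as browning nears the target."""
    error = target - score
    return max(0.0, min(1.0, 0.5 + 1.5 * error))

cap = cv2.VideoCapture(0)  # placeholder camera source
ok, frame = cap.read()
if ok:
    print(f"burner duty cycle: {burner_duty(browning_score(frame)):.2f}")
cap.release()
```

The point of the loop is that the stopping condition is a visual state, not a clock: the burner setting is recomputed from each frame, which is what lets the system absorb variation in ingredient size or starting temperature.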
The technology marks a crossover between consumer and commercial capability. Nosh’s unit features a camera-based “eye” that functions like a human chef’s, managing over 500 recipes with precise ingredient dispensing and stirring. Meanwhile, Antamix displayed a multifunctional unit designed for both B2C and B2B markets, integrating viscosity sensors and a 1,000-watt motor to handle heavy doughs and complex prep tasks like chopping and steaming in a single footprint. These systems enable a “design once, sell to both” strategy, reducing R&D costs while bringing industrial-grade consistency to smaller kitchens.
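As a back-of-the-envelope illustration of how a viscosity sensor and a fixed power budget interact, the hypothetical function below caps mixer speed so that estimated drag power stays within 1,000 W. The drag coefficient and the laminar-mixing model are placeholder assumptions, not published Antamix specifications.

```python
# Illustrative sketch (not vendor firmware): limit mixer speed as measured
# viscosity climbs, so a fixed-power motor is not stalled by stiff dough.
# Assumes laminar mixing, where drag torque ~ drag_coeff * viscosity * omega.
import math

def mixer_speed_rpm(viscosity_pa_s: float,
                    motor_watts: float = 1000.0,
                    drag_coeff: float = 0.8,   # assumed geometry constant
                    max_rpm: float = 300.0) -> float:
    """Highest speed whose drag power (torque * omega) fits the motor budget."""
    # power = (k * mu * omega) * omega  =>  omega_max = sqrt(P / (k * mu))
    omega = math.sqrt(motor_watts / (drag_coeff * max(viscosity_pa_s, 1e-6)))
    return min(omega * 60.0 / (2.0 * math.pi), max_rpm)

print(mixer_speed_rpm(0.05))    # thin batter: capped at max_rpm
print(mixer_speed_rpm(150.0))   # stiff bread dough: slowed to ~27 rpm
```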
Why it matters: This is “Level 4” of kitchen automation: autonomy. Previous robots required strictly controlled inputs; these systems adapt to variables such as ingredient size or pan temperature. For HoReCa operators, the technology promises to ease the consistency problems caused by high staff turnover. It lets ghost kitchens and QSRs deploy “adaptive” standard operating procedures (SOPs) that self-correct, so the 100th burger tastes like the first without constant human supervision.
