Exhibition
February 25, 2025
From January 22 to 24, Tokyo hosted five industry exhibitions, including Factory Innovation Week (FIW) 2025, the 17th AUTOMOTIVE WORLD, and the 11th WEARABLE EXPO, drawing a total of 85,430 visitors. FIW 2025 featured an area dedicated to robotics, where artificial intelligence (AI) technologies and advanced sensing solutions that enhance robot perception were key highlights.
Factory Innovation Week (FIW) held at Tokyo Big Sight South Hall
FIW 2025, held at Tokyo Big Sight, consisted of four future factory technology exhibitions, including the GREEN FACTORY Expo, which focused on green factory technologies/solutions, and the Manufacturing Industry’s HR Expo, which addressed the industry’s labor shortage. Among them, the RoboDEX zone showcased industrial robotics technologies and products.
Polyscope X enables easy creation of operator-friendly interfaces
One of the most prominent exhibitors in the RoboDEX zone was Universal Robots (UR), a Denmark-based manufacturer of collaborative robots with a Japanese branch in Minato, Tokyo, headed by Tsuyoshi Yamane, General Manager. At the exhibition, UR announced a complete overhaul of its proprietary “Polyscope” software, which is used to program robot movements and settings via a teaching pendant.
The newly developed “Polyscope X” enables the creation of simplified operator interfaces. Previously, system integrators had to pre-program various motion patterns, limiting operators to simply selecting and performing predefined actions. With Polyscope X, operators can now easily adjust workpiece size, weight and placement settings directly on a user-friendly interface.
In addition, a new “Smart Skill” feature has been introduced that allows frequently used actions to be performed with a single button press. For example, a command such as “Move the robot arm straight until it touches the workpiece” typically requires complex coordination with force sensors and extensive configuration. But Smart Skill simplifies this by allowing users to select a pre-configured command.
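To give a sense of what such a command hides, the sketch below outlines the kind of force-guarded motion loop that a "move until contact" skill typically wraps. It is a minimal illustration under stated assumptions, not UR's implementation: the `robot` object and its `speed_linear()`, `get_tcp_force()`, and `stop()` methods are hypothetical placeholders for a vendor control interface.

```python
import time

CONTACT_FORCE_N = 5.0      # force threshold (newtons) treated as "touching"
APPROACH_SPEED_MPS = 0.01  # slow linear approach speed along the tool Z axis


def move_until_contact(robot, timeout_s=10.0):
    """Drive the tool straight ahead until the force reading indicates contact.

    `robot` is a hypothetical control interface; real systems expose
    equivalent calls for streaming a Cartesian speed and reading the
    force/torque acting on the tool flange.
    """
    deadline = time.monotonic() + timeout_s
    robot.speed_linear([0.0, 0.0, -APPROACH_SPEED_MPS])  # move along the tool's -Z axis
    try:
        while time.monotonic() < deadline:
            fx, fy, fz = robot.get_tcp_force()[:3]
            if abs(fz) > CONTACT_FORCE_N:        # workpiece reached
                return True
            time.sleep(0.008)                    # ~125 Hz monitoring loop
        return False                             # no contact before timeout
    finally:
        robot.stop()                             # always halt the motion
```

Writing and tuning this kind of loop is the "extensive configuration" that a single Smart Skill button is meant to replace.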
Another key feature of Polyscope X is its AI compatibility. It integrates seamlessly with ROS 2 (Robot Operating System 2), an open-source framework widely used to implement AI functionality in robotics. UR also introduced the “AI Accelerator”, an expansion kit designed to add AI capabilities to UR’s collaborative robots. The kit includes NVIDIA processing units and a 3D vision sensor to enable AI-based object recognition, autonomous path generation, and visual inspection.
Previously, using ROS 2 for control meant that Polyscope features could not be used. With Polyscope X, however, users can rely on ROS 2 for AI-driven functions such as autonomous path generation while continuing to use Polyscope for standard operations.
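As a rough illustration of what the ROS 2 side of such a split can look like, the sketch below publishes target poses produced by an AI planner as standard geometry messages. The topic name, frame, and placeholder pose values are assumptions made for illustration, not part of UR's interface; an actual UR/ROS 2 setup would use the driver's own trajectory interfaces.

```python
import rclpy
from rclpy.node import Node
from geometry_msgs.msg import PoseStamped


class PathPublisher(Node):
    """Publishes AI-generated target poses for a robot driver to execute.

    The '/target_pose' topic and the hard-coded pose are illustrative
    placeholders, not a real UR API.
    """

    def __init__(self):
        super().__init__('ai_path_publisher')
        self.pub = self.create_publisher(PoseStamped, '/target_pose', 10)
        self.timer = self.create_timer(0.5, self.publish_next)

    def publish_next(self):
        msg = PoseStamped()
        msg.header.stamp = self.get_clock().now().to_msg()
        msg.header.frame_id = 'base_link'
        # Placeholder pose; a real planner would fill this from vision/AI output.
        msg.pose.position.x = 0.4
        msg.pose.position.z = 0.2
        msg.pose.orientation.w = 1.0
        self.pub.publish(msg)


def main():
    rclpy.init()
    node = PathPublisher()
    rclpy.spin(node)
    rclpy.shutdown()


if __name__ == '__main__':
    main()
```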
At the exhibition, UR demonstrated real-world applications of AI-driven object recognition, path generation and visual inspection. “The response was overwhelmingly positive. Many visitors asked specific questions about how these solutions could be applied to their operations, demonstrating strong industry interest,” said Takaaki Yoshioka, Senior Marketing Manager.
Bridgestone Soft Robotics Ventures, an in-house venture of Bridgestone, exhibited a logistics-oriented demonstration system featuring the “TETOTE and” robotic hand.
This hybrid hand integrates suction functionality with “TETOTE”, a soft robotic hand that uses artificial rubber muscles to grasp workpieces. The added suction enables the system to grasp and extract workpieces from difficult positions, improving efficiency in high-speed transport and in the handling of delicate or fragile items.
At the exhibition, the system was paired with a 3D vision sensor from Eureka Robotics, a Singapore-based robotics company with a Japanese branch in Chuo, Tokyo, headed by Prof. Pham Quang Cuong. The system picked up workpieces without prior registration, demonstrating its adaptability.
Tactile sensors in the robotic fingertips allow delicate handling of an origami crane without crushing it
XELA Robotics, founded by Dr. Alexander Schmitz, Associate Professor at Waseda University, displayed its latest tactile sensors for robot hands.
“Our sensors can detect forces not only in the vertical direction, but also in the horizontal direction with high accuracy. This allows robots to sense the weight of objects based on the forces acting on the surface of their fingers, detect slippage, and perform more dexterous and precise grasping,” Dr. Schmitz explained.
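A common way to use such normal- and shear-force readings is to watch the shear-to-normal ratio against a friction limit and tighten the grip as slip approaches. The snippet below sketches only that general idea; the friction values and the `sensor` and `gripper` interfaces are hypothetical placeholders, not XELA's API.

```python
FRICTION_COEFF = 0.6   # assumed static friction between fingertip and object
SAFETY_MARGIN = 0.8    # act before the true friction limit is reached
GRIP_STEP_N = 0.5      # extra normal force to apply per adjustment


def adjust_grip(sensor, gripper):
    """Increase grip force when the contact approaches its friction limit.

    `sensor.read()` is assumed to return (normal_force, shear_x, shear_y)
    in newtons; `gripper` exposes current_force and set_force(). Both are
    illustrative placeholders, not a real vendor interface.
    """
    normal, shear_x, shear_y = sensor.read()
    shear = (shear_x ** 2 + shear_y ** 2) ** 0.5   # magnitude of tangential load
    if normal <= 0.0:
        return  # no contact; nothing to hold
    # The object begins to slip roughly when shear exceeds mu * normal.
    if shear > FRICTION_COEFF * normal * SAFETY_MARGIN:
        gripper.set_force(gripper.current_force + GRIP_STEP_N)
```

When the hand holds an object still, the tangential load on the fingertips corresponds to the object's weight, which is how shear sensing doubles as weight sensing.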
One of the most attention-grabbing exhibits was a demonstration in which a robotic hand gently grasped an origami crane, showing its delicate force control.
Having primarily served research institutions, XELA Robotics plans to expand its sales efforts to industrial customers starting this year.
Nikon’s 3D vision sensor enables robots to track the movement of workpieces on a conveyor
Nikon showcased its latest 3D vision sensor, which was launched in the autumn of 2024. The sensor demonstrated high-speed bin picking for small components, as well as conveyor tracking, in which a robot follows workpieces moving along a conveyor in real time. Unlike conventional systems, Nikon’s vision sensor enables seamless tracking using only visual data, eliminating the need for direct integration between the conveyor and the robot.
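Conceptually, vision-only conveyor tracking estimates the belt motion from successive detections and extrapolates where the part will be when the robot arrives. The short sketch below illustrates that idea under a constant-velocity assumption; it is not Nikon's algorithm, and the numbers are made up for the example.

```python
def estimate_velocity(p0, t0, p1, t1):
    """Estimate conveyor velocity (m/s per axis) from two timed detections."""
    dt = t1 - t0
    return tuple((b - a) / dt for a, b in zip(p0, p1))


def predict_pickup_pose(p_latest, t_latest, velocity, robot_delay_s):
    """Extrapolate where the workpiece will be after the robot's reaction delay,
    assuming the belt keeps moving at a constant speed."""
    return tuple(p + v * robot_delay_s for p, v in zip(p_latest, velocity))


# Example: two detections 0.2 s apart, belt moving along x at about 0.15 m/s.
v = estimate_velocity((0.10, 0.30, 0.02), 0.0, (0.13, 0.30, 0.02), 0.2)
target = predict_pickup_pose((0.13, 0.30, 0.02), 0.2, v, robot_delay_s=0.5)
print(v)       # approximately (0.15, 0.0, 0.0)
print(target)  # approximately 0.205 m along x
```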
In addition, Nikon introduced a prototype function that allows robots to move in sync with the shape of objects on a moving conveyor, demonstrating the evolving capabilities of vision-based automation.