Basics
December 13, 2024
You’ve probably heard the term “3D vision sensor” thrown around a lot in discussions about industrial robots. But when asked, “What does that actually mean?”, it can be hard to give a straight answer. In our “Master Key Terms in the World of Robotics” column, we explore essential and emerging terms in the industrial robotics industry. This time, we focus on the “eyes” of robots: 3D vision sensors.
If robotic arms and mobile robots are considered the hands and feet of an automation system, vision sensors are its eyes. A vision sensor captures images of objects and converts them into data. This data is then analyzed, sometimes with the help of artificial intelligence (AI), to identify the object’s shape, orientation, and more.
While 2D vision sensors capture flat, two-dimensional images, 3D vision sensors add depth to the equation, perceiving objects in three dimensions. This capability is achieved through various methods, such as:
・Stereo vision: Mimicking the human eye, two cameras capture images from slightly different angles and calculate depth from the difference (disparity) between them. Stereo vision provides highly accurate sensing but requires regular calibration.
・Time of Flight (ToF): Measures the time it takes for light emitted by the sensor, typically from a laser, to bounce back after hitting an object. ToF sensors excel at wide-area sensing and can operate in low-light conditions, making them well suited to applications such as robotic picking; however, they are less effective outdoors in sunlight and can introduce small errors in height measurements. The code sketch after this list illustrates the basic depth calculation behind each method.
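To make the two measurement principles above more concrete, here is a minimal Python sketch of the depth formulas behind them: stereo vision triangulates depth from the pixel disparity between the two camera views (depth = focal length × baseline / disparity), while ToF converts the measured round-trip time of the emitted light into distance (distance = speed of light × time / 2). The function names and example values are illustrative and not tied to any particular sensor or vendor library.

```python
# Minimal sketch of the two depth-calculation principles described above.
# Focal length, baseline, disparity, and timing values are illustrative only.

SPEED_OF_LIGHT_M_PER_S = 299_792_458.0


def stereo_depth(focal_length_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth from stereo vision: Z = f * B / d.

    The same point appears shifted (the disparity) between the left and
    right images; a larger shift means the point is closer to the cameras.
    """
    if disparity_px <= 0:
        raise ValueError("Disparity must be positive to triangulate depth.")
    return focal_length_px * baseline_m / disparity_px


def tof_depth(round_trip_time_s: float) -> float:
    """Depth from Time of Flight: d = c * t / 2.

    The emitted light travels to the object and back, so the one-way
    distance is half of the measured round-trip path.
    """
    return SPEED_OF_LIGHT_M_PER_S * round_trip_time_s / 2.0


if __name__ == "__main__":
    # Stereo example: 700 px focal length, 6 cm baseline, 35 px disparity -> 1.2 m
    print(f"Stereo depth: {stereo_depth(700.0, 0.06, 35.0):.2f} m")
    # ToF example: a round trip of about 6.7 nanoseconds corresponds to roughly 1 m
    print(f"ToF depth:    {tof_depth(6.67e-9):.2f} m")
```

In practice, commercial 3D vision sensors perform these calculations on board and typically output a depth map or point cloud rather than raw disparities or timing measurements.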
Both methods generally struggle to detect objects with certain surface properties, such as transparent, highly reflective, or matte black finishes. However, advances in the technology are beginning to overcome these challenges.
The ability to perceive in three dimensions enables robots to accurately identify the position, shape, and size of irregularly shaped objects. This accuracy facilitates the development of more flexible robotic systems.
However, the cost of implementing 3D vision sensors, including the sensor itself and the associated data processing systems, can quickly become prohibitive. Selecting the right product for the application is therefore critical to achieving cost-effective automation.