By manipulating different attributes of light, methods such as time of flight, interferometry, and confocal displacement can capture 3D data in industrial applications. The geometry-based measurement techniques covered in the previous video are becoming standard across industry; however, niche cases that demand extreme resolution rely on one of these light-based methods.
Capturing 3D data with Light-based Measurement Techniques
These measurement techniques rely on analyzing one of the following aspects of light in order to obtain an object’s 3D data:
- Speed of light
- Wave nature of light
- Specific wavelengths of light
Understanding the different methods used in machine vision will help you determine what is best for your automation project. Human talent and tacit knowledge are the most valuable assets a manufacturer owns. That is why at Hermary, we engineer reliable 3D vision scanners and use point cloud data to help manufacturers digitize this wealth of knowledge.
If you would like us to cover any specific subjects or have any questions or feedback, leave us a comment below or contact us.
Hi, welcome to Machine Vision for Industry Professionals, an educational video series for engineers to learn about machine vision and how to work with it.
Methods of Capturing 3D Data
In this video, we will explore the light-based methods used to capture 3D machine vision data introduced in the prior video of this series. By the end of this video, you should have a basic understanding of the different light-based methods and techniques used to capture 3D information in industrial environments.
The three broad techniques we will discuss take advantage of either the speed and distance over which light travels, the wave nature of light, or the ability to separate and isolate specific wavelengths of light optically.
My name is Josh Harrington, and I am a Product Applications Engineer at Hermary, a leading machine vision hardware manufacturer. I’m here to answer common questions regarding machine vision and share my experience in the field.
Time of Flight
Time-of-flight measurement involves determining the distance to an object by analyzing how light projected from a source arrives at a detector. Time of flight can be broadly split into two categories: direct and indirect.
Direct time-of-flight devices emit a pulse of light and measure how long the light takes to hit an object and return to the detector. These measurements are particularly applicable to applications that require a very long range, such as range finders, or LIDAR scanners mounted on cars or mobile robots. This type of measurement typically requires some sort of motion to capture measurements over an area.
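The direct time-of-flight principle boils down to one relationship: distance is the speed of light multiplied by the round-trip time, halved because the pulse travels out and back. A minimal sketch (the 100 ns pulse time is a hypothetical example, not from the video):

```python
C = 299_792_458.0  # speed of light in a vacuum, m/s

def distance_from_round_trip(t_seconds: float) -> float:
    """Distance to the target given the pulse's round-trip time.

    Halved because the light travels to the object and back.
    """
    return C * t_seconds / 2.0

# A pulse returning after 100 nanoseconds corresponds to roughly 15 m.
d = distance_from_round_trip(100e-9)
```

Note how short the timescales are: resolving millimetres requires timing the pulse to within a few picoseconds, which is why direct time of flight favors long-range, lower-resolution applications.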
Indirect time-of-flight devices emit a continuously modulated light. A specialized detector then captures this light reflected off an object and measures the modulation phase. By knowing how far light propagates over one period of modulation, the phase of the returned light can be correlated to the distance the light traveled. Most indirect time-of-flight devices use a detector with an array of phase detectors, enabling many simultaneous depth measurements.
Typically, these devices are limited to working within specific ranges, as the measurement is not absolute: the phase repeats every modulation period, so distances beyond one period are ambiguous. Other drawbacks include measurement errors due to the light bouncing off multiple surfaces.
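The phase-to-distance relationship described above, and the range limit it implies, can be sketched as follows (the 20 MHz modulation frequency is a hypothetical example for illustration):

```python
import math

C = 299_792_458.0  # speed of light in a vacuum, m/s

def unambiguous_range(f_mod_hz: float) -> float:
    """Maximum distance measurable without phase wrap-around.

    Light travels c / f_mod per modulation period; halved for the round trip.
    """
    return C / (2.0 * f_mod_hz)

def distance_from_phase(phase_rad: float, f_mod_hz: float) -> float:
    """Distance implied by the measured phase shift, modulo the range."""
    return (phase_rad / (2.0 * math.pi)) * unambiguous_range(f_mod_hz)

# At 20 MHz modulation the unambiguous range is about 7.5 m;
# a measured phase shift of pi places the object halfway through that range.
r = unambiguous_range(20e6)
d = distance_from_phase(math.pi, 20e6)
```

This is why such sensors quote a working range: an object at 8 m returns the same phase as one at roughly 0.5 m under these numbers, unless the device disambiguates with multiple modulation frequencies.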
3D measurement devices that incorporate time-of-flight methods are growing in popularity, as they typically cost less than highly calibrated geometry-based measurement devices. More manufacturers are also developing data processing techniques to mitigate the noisy data and range issues associated with this technique.
Interferometry
The second light-based measurement technique we will discuss is interferometry. There are many types of interferometry-based measurements and variants, but at its core, an interferometric measurement relies on the comparison of a reference beam and measurement beam of light. The comparison is most often viewed as a pattern of interference fringes due to the constructive or destructive interference of the two beams.
A very basic example of this technique is a Michelson interferometer, where a coherent beam of light is split, then reflected off a target and a reference surface. The beams of light are then recombined to interfere with one another on a detector such as a camera.
Differences in either position or structure of the target surface will cause the reflected light to interfere with the reference beam of light. The destructive and constructive interference between these two light sources will create an interference pattern that correlates to the differences.
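The fringe pattern described above encodes displacement directly: each full fringe cycle corresponds to half a wavelength of target movement, because moving the target by d changes the reflected path by 2d. A minimal sketch (the fringe count and the 633 nm HeNe wavelength are hypothetical illustration values):

```python
def displacement_from_fringes(fringe_count: float, wavelength_m: float) -> float:
    """Target displacement from the number of fringe cycles counted.

    Each fringe = wavelength / 2 of target travel, since the reflected
    path length changes by twice the target's movement.
    """
    return fringe_count * wavelength_m / 2.0

# Counting 100 fringes with a 633 nm helium-neon laser implies
# the target moved about 31.65 micrometres.
d = displacement_from_fringes(100, 633e-9)
```

The half-wavelength-per-fringe scale is what gives interferometry its nanometer sensitivity, and also why a vibration of only a few hundred nanometres is enough to smear the pattern.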
Interferometric measurements can resolve features down to the nanometer but are extremely sensitive to noise, and are thus typically used in highly controlled environments. Interferometry is very popular in the semiconductor and optics industries for measuring very small deviations from a target where nanometer-level accuracy is required.
Confocal Displacement Measurement
The final light-based measurement that we will discuss is a newer and less common technique, but an interesting one: it is capable of measuring highly specular surfaces, something that many other 3D measurement techniques struggle with. This method is called confocal displacement measurement.
This technique uses a beam of white light that is optically dispersed so that different wavelengths focus at different depths, combined with a confocal imaging setup that captures only the light that is in focus. The captured light can then be analyzed for wavelength and correlated to a specific depth.
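In practice the wavelength-to-depth relationship comes from a per-sensor calibration table, and the detected peak wavelength is looked up against it. A minimal sketch assuming a hypothetical three-point calibration with blue focusing nearest and red farthest (all values invented for illustration):

```python
# Hypothetical factory calibration: (in-focus wavelength in m, depth in m)
CALIBRATION = [
    (450e-9, 0.0e-3),  # blue focuses nearest the sensor
    (550e-9, 1.0e-3),
    (650e-9, 2.0e-3),  # red focuses farthest
]

def depth_from_wavelength(wl: float) -> float:
    """Linearly interpolate depth from the detected peak wavelength."""
    pts = sorted(CALIBRATION)
    if wl <= pts[0][0]:
        return pts[0][1]  # clamp below the calibrated band
    if wl >= pts[-1][0]:
        return pts[-1][1]  # clamp above the calibrated band
    for (w0, d0), (w1, d1) in zip(pts, pts[1:]):
        if w0 <= wl <= w1:
            return d0 + (d1 - d0) * (wl - w0) / (w1 - w0)

# A detected peak at 500 nm maps to a depth of 0.5 mm under this calibration.
depth = depth_from_wavelength(500e-9)
```

Because the depth is inferred from color rather than from the reflection's geometry, a mirror-like surface that scatters little light off-axis is still measurable, which is the technique's key advantage.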
There are devices on the market that can make these measurements as a single point or a profile. These devices are well suited to many inspection tasks, as they easily cope with specular reflections and the sensor can be made relatively small. The main drawback is that they only work over a relatively short range, limiting them to applications that need high accuracy over small distances.
There are many other light-based 3D measurement techniques that are either hybrids, tweaks, or extensions of the methods discussed here. By cleverly manipulating light, 3D information can be captured and used to enhance our ability to automate and optimize tasks and processes in many industrial automation applications.
In the next video in this series, we will explore what 3D measurement data looks like. If you have any comments or questions, please drop a line in the comment section below. Thanks, it’s been great having you here.