As industries continue to evolve and become more automated, industrial system integrators are constantly seeking new ways to improve efficiency and reliability.
One such area gaining attention is the application of the GigE Vision standard to industrial automation imaging systems. Since 2008, Hermary has been a member of the GigE Vision Technical Committee, tasked with designing “an open transport platform based on gigabit Ethernet for high-performance vision applications.”
This article is based on an interview with Hermary’s R&D Director, Reza Sahraei. As a GigE Vision Technical Committee member, Reza’s tasks include reviewing and providing feedback on each version’s proposed changes. On behalf of Hermary, he also voted on the latest revision (2.2) of the GigE Vision Specification.
Before diving into the interview, it helps to have an overview of the following key machine vision standards.
Machine Vision Standards: USB3 Vision®, CoaXPress, Camera Link®, and Camera Link HS®
For detailed specifications of each vision technology, please visit Vision Standards by A3.
In addition to GigE Vision, USB3 Vision®, CoaXPress®, Camera Link®, and Camera Link HS® are standardized transport-layer and communication protocols that allow data transfer from industrial cameras to computers over their respective hardware interfaces. A3 (Association for Advancing Automation) currently owns and manages all five standards. Each standard has its specific advantages and limitations.
USB3 Vision uses USB 3.0 connections to transfer data between cameras and computers. USB3 Vision offers high bandwidth and easy integration with existing computer systems, making it a popular choice for machine vision and other industrial applications. However, it is limited by a maximum cable length of three meters.
CoaXPress is a standard that uses coaxial cables to transmit data from cameras to computers or other devices. It provides high-speed data transfer, long cable lengths, and low power consumption. It is well-suited for applications that require high-resolution imaging over long distances, such as vehicle traffic monitoring or video surveillance.
The first high-speed camera interface, Camera Link, was created in 2000. It transmits data between cameras and computers over specialized cables, offering minimal latency and high data transmission speeds. Camera Link HS (High Speed), released in 2012, uses a different physical layer and protocol to deliver even more bandwidth and faster transmission rates.
What is the GigE Vision® Standard?
The GigE Vision standard is a set of technical specifications for vision products (an application, a device, or both) to communicate effectively and efficiently over Ethernet networks. GigE Vision-compliant products are guaranteed to communicate with each other and work together, ensuring interoperability amongst devices.
Vision software standards: GenICam
While the standardized hardware interfaces define the transport and communication protocols, GenICam, by the European Machine Vision Association (EMVA), builds on these protocols and codifies how software applications communicate with imaging devices. Generic Interface for Cameras, or GenICam, ensures frequently used features such as image size, pixel format, and exposure time can be accessed and controlled in a consistent way regardless of device model and brand. Such standards enable system designers to develop software compatible with a wide range of imaging devices rather than writing custom code for each vision component.
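As a concrete illustration, here is a rough Python sketch using the open-source Harvester library maintained alongside GenICam. The producer file path and device index are placeholders, and method names can differ between Harvester versions, so treat it as an outline of the node-map idea rather than a drop-in script; the feature names themselves (Width, PixelFormat, ExposureTime) come from the GenICam Standard Features Naming Convention, which is what lets the same code address compliant devices from different vendors.

```python
# Rough outline of GenICam-style feature access via the open-source Harvester
# library. The .cti producer path and device index are placeholders, and method
# names may differ between Harvester versions.
from harvesters.core import Harvester

h = Harvester()
h.add_file("/path/to/vendor_gentl_producer.cti")  # GenTL producer supplied by any vendor
h.update()

ia = h.create_image_acquirer(0)  # first discovered compliant device

# The same SFNC feature names work regardless of device model or brand.
nm = ia.remote_device.node_map
nm.Width.value = 2048
nm.PixelFormat.value = "Mono8"
nm.ExposureTime.value = 5000.0   # microseconds

ia.start_acquisition()
with ia.fetch_buffer() as buffer:
    component = buffer.payload.components[0]
    print(component.width, component.height)
ia.stop_acquisition()

ia.destroy()
h.reset()
```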
Why GigE Vision?
One of Hermary’s earliest successes was the LPS-2016, a co-planar profile scanner. We built it on Ethernet (10 Mbit/s at the time), so Hermary has a long history with the technology. Ethernet is a widely used networking technology that offers several advantages over the other four interface technologies, the most significant being its ability to handle high-speed data transfer over long distances.
Our scanners typically operate in environments high in vibration, EMI, or even ESD. Naturally, many manufacturers locate their computer rooms away from these machine centers. Ethernet cable runs can reach 100 meters, making it the ideal physical layer for transferring data from scanners to a PC in a factory environment.
Another reason Ethernet is well-suited to machine vision is its reliability in data transfer, thanks to its built-in error-checking capabilities. Ethernet has also made many technologies possible, for example, PTP (Precision Time Protocol). Whether it’s a meat processing plant or a sawmill, downstream processes rely on the timeliness and accuracy of our scanners’ data to make decisions. We must take all precautions to ensure our data arrives in the computer room on time and intact. Ethernet cable is also ubiquitous, making it easy to source and cost-effective for our system integrators.
Will we use USB 3.0 in the future? It is entirely possible, but so far, Ethernet works best for our customers’ applications.
How do machine vision standards, i.e., GigE Vision, help with system integration?
One of the committee’s core values is ensuring interoperability between all certified vision devices and software applications. Whether to identify defects or tell robots where to go, system integrators tackle complex problems with automation on a daily basis. Before 3D machine vision became more widely available, 2D-based solutions often required incorporating vision components from multiple vendors. And this is where the machine vision standards come into play.
The GigE Vision Communication and Streaming Protocols
A GigE Vision-certified primary application can communicate with compliant devices by following the protocol in the specification document. GigE Vision Control Protocol (GVCP) sends control messages between a primary application and devices. It enables an application to configure a device and establish stream channels from the device. The device also follows the protocol to alert the primary application when particular events occur.
Another important protocol is the GigE Vision Stream Protocol (GVSP), which allows a device to send image data, data type, or image status to the primary application or elsewhere. So when a primary application taps into the device, it knows what kind of data to expect. Both GVCP and GVSP use UDP because it has less overhead than TCP, and the standard defines a packet-resend mechanism to ensure reliable transmission between a device and an application. Both protocols are necessary for effective communication in GigE Vision.
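To make the control side more tangible, below is a rough Python sketch of a GVCP device-discovery broadcast over UDP. The header layout and constant values reflect one reading of the GigE Vision specification and should be verified against the official document before use.

```python
# Rough sketch of a GVCP device-discovery broadcast over UDP. Header fields and
# constants follow one reading of the GigE Vision specification; verify against
# the official document before relying on them.
import socket
import struct

GVCP_PORT = 3956          # UDP port GigE Vision devices listen on for control
DISCOVERY_CMD = 0x0002    # GVCP command code for device discovery

# 8-byte GVCP command header: key byte 0x42, flags (ack required, broadcast
# acknowledge allowed), command code, payload length (0), request id.
packet = struct.pack(">BBHHH", 0x42, 0x11, DISCOVERY_CMD, 0, 1)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
sock.settimeout(1.0)
sock.sendto(packet, ("255.255.255.255", GVCP_PORT))

# Each compliant device answers with a discovery acknowledge carrying its identity.
try:
    while True:
        data, addr = sock.recvfrom(1024)
        print(f"GigE Vision device responded from {addr[0]} ({len(data)} bytes)")
except socket.timeout:
    pass
finally:
    sock.close()
```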
What is the process of getting the GigE Vision certification?
Certification is a critical step to ensure compatibility with the GigE Vision Standard and interoperability with other compliant applications.
Prepare the device for testing: Before a device is issued the certification, we always test it ourselves to ensure the scanner’s software, hardware, and firmware are all compliant with the latest GigE Vision Specification. The GigE Vision Validation Framework is a handy tool for testing the interoperability of devices built on GigE Vision.
Bring the device to a testing center: The next step is to bring the scanner to a testing center. The EMVA’s International Vision Standards Meeting in Vienna this year (2023) also gives vendors a chance to certify their devices at the event venue.
The device is plugged into at least three primary applications from different companies. These applications run through a checklist to make sure the device transmits, executes, and responds the way the GigE Vision specification intends. Once this process is done, a software validation framework runs different scenarios on the device. The certification is awarded only after the device passes both processes.
GigE Vision: Working to solve a problem together
The objective of establishing vision standards is to ensure interoperability between certified vision devices and applications. The committee members all work very closely with each other so that the specification benefits every company.
“This is a collaborative effort between 50-plus machine vision vendors because we understand that manufacturers have challenges that system integrators want to provide solutions to. One vision company may not have all the answers, but if we work to coexist and communicate, system integrators can develop a richer solution for the end users.”
How have the standards evolved to reflect changing technologies?
The GigE Vision standard started with 2D cameras and peripheral devices. 3D machine vision was added to the specification about five years ago, but the committee is open to any machine vision component. It could be X-ray, lighting, or even technologies we haven’t heard of yet. If it can benefit the industry, then the protocol should support it. We want to help system integrators retrieve valuable vision data from 2D, 3D, or N-dimensional devices in the future.
When the GigE Vision committee was first created, Ethernet speeds were much slower than they are now. The most typical system architecture at the time was to have the camera take images of a scene, clock out the data, and send it to a computer room for image processing over Camera Link. The device itself did not perform any image processing.
By contrast, devices using Ethernet or USB had to process the data before transmitting it because sending everything would take too long. These devices were made smart out of necessity so they could process data at the source. Now we refer to this paradigm as Edge computing, and we see the industry moving toward smart edge devices, such as IoT and IIoT, even though Ethernet is now 1,000 times faster than when we first started.
Can Cloud computing benefit machine vision devices?
Like everything else, Cloud computing has its pros and cons.
In Edge computing, a device has finite space to house the hardware, limiting its computing capabilities at the edge. However, we have powerful processors and fast Ethernet speeds now. It takes microseconds or even nanoseconds for the device to process the data and then send it to a PC, where software algorithms determine what to do next. Edge computing is fast, so most industrial applications use this method.
In Cloud computing, the tools you can put your data through become somewhat unbounded. However, it opens a gateway for potential meddling, so cybersecurity is the number one concern with Cloud computing in industrial automation. The second issue is delay. Even with the most powerful processor on the market and fast transmission speeds, it can take milliseconds (almost 1,000 times longer than Edge computing) for a downstream process to receive commands. For now, Cloud computing presents a level of latency unacceptable to many industrial applications. But this may change one day.
How does artificial intelligence come into play?
In practice, solution seekers benefit most when Edge computing and Cloud computing work in tandem. Artificial intelligence is a good example.
I see more and more machine vision systems using this architecture, where scanners at the Edge collect as much data as possible and upload it to the Cloud for simulations. You choose the right AI model and tell it what you are identifying. In most of our use cases, we use a convolutional neural network (CNN). You train the CNN model by feeding it labeled scanner data, then evaluate and validate the results using a separate dataset to see how well it generalizes to new, unseen images. Further adjustments may be required at this stage. But once you are satisfied with the model, it becomes a configuration you can load onto the device.
Think of it as building a pasta maker, which takes a long time. But once you build it, making pasta is easier. Training the model can sometimes take weeks, depending on the architecture and dataset size, but patience and thoughtfulness can pay great dividends in the future.
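For readers who want to see what that workflow looks like in code, here is a minimal training-and-validation sketch in PyTorch. The random tensors are only stand-ins for labeled scanner data, and the tiny network, hyperparameters, and file name are illustrative choices rather than Hermary’s actual pipeline.

```python
# Minimal sketch of the train/validate/deploy workflow described above.
# Random tensors stand in for labeled scanner data; a real project would load
# profile or intensity images and their defect labels instead.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

# Stand-in dataset: 1-channel 64x64 "scans" with binary labels (defect / no defect).
images = torch.randn(512, 1, 64, 64)
labels = torch.randint(0, 2, (512,))
train_set = TensorDataset(images[:400], labels[:400])   # training split
val_set = TensorDataset(images[400:], labels[400:])     # held-out validation split

model = nn.Sequential(                                  # small CNN classifier
    nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(32 * 16 * 16, 2),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(5):
    model.train()                                       # fit the model to labeled data
    for x, y in DataLoader(train_set, batch_size=32, shuffle=True):
        optimizer.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        optimizer.step()

    model.eval()                                        # check generalization to unseen scans
    correct = 0
    with torch.no_grad():
        for x, y in DataLoader(val_set, batch_size=32):
            correct += (model(x).argmax(dim=1) == y).sum().item()
    print(f"epoch {epoch}: validation accuracy {correct / len(val_set):.2%}")

# The trained weights become the "configuration" loaded onto the device.
torch.save(model.state_dict(), "model.pt")
```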
What is the outlook, in your opinion, for machine vision?
A protocol called OPC UA (Open Platform Communications Unified Architecture) is analogous to what we do at the GigE Vision Technical Committee but at a grander scale. It aims to achieve communication and interoperability between industrial devices that operate in the Field, the Edge, and the Cloud. The difference is that OPC UA’s membership comes from the general automation industry, not just machine vision companies. The goal of OPC UA aligns with the vision of Industry 4.0, where industrial processes become more intelligent, connected, and automated. The protocol is still evolving but will provide a common language for different devices and software systems on separate layers to share information and automate processes more efficiently and effectively.
What is new on the horizon for you?
We had a few hiccups sourcing the right chipsets in the past few years, but now that that’s resolved, I am happy to say the new Amadeus platform is one step closer to launch. The name is a reference to Wolfgang Amadeus Mozart, considered one of the greatest composers in history. We named the platform Amadeus because it, too, is built upon the GigE Vision standard and offers greater compositional flexibility to its users. The platform resembles the structure of an orchestra, which follows the conductor’s guidance to perform music for the audience. Like the GigE Vision Technical Committee’s role in advancing machine vision, our role is to provide our channel partners with the best tools to do their job.