Frequently asked questions about machine vision components

Do you have a specific machine vision question? Click on the questions below (in English) to see our answers.

Is your question not listed?

Please contact one of our advisors. They will be happy to help you via our contact form, or call us at 076-5440588.

A vision system encompasses much more than "just" a camera. This video provides a short introduction to the individual components.

When it comes to industrial image processing, there are many tasks, from quality control to sorting food, that require more than just a single camera. Tasks like these call for a more comprehensive image processing system, known as a vision system. The camera is the centerpiece of the vision system, but that is not to say that the other components in the system are unimportant. Using a yummy cookie as our sample product, we'll explain in more detail just how a vision system works.

Enjoy a surprisingly delicious lesson.

Basler's expert Thies Moeller details the most important components of this kind of system and explains how it is structured and what advantages it offers.

How and when were digital industrial cameras developed, how do they work, and what are they used for?

Learn more and watch Basler's Vision Campus video!

Its numerous advantages make USB 3.0 a highly attractive interface for many applications:
High data rates of up to 350 MB/s
Low CPU load: image preprocessing via FPGA, as in the Basler ace USB 3.0, saves resources for the software and the actual image processing
Plug & play: the software easily detects the camera, allowing flexible use with varying host PCs
Low latency and jitter times
Standardized and tested accessories, from host adapters to cables to hubs
Standardized, protocol- and event-based communication between camera and software
Standardized camera control via GenICam 2.0

 

Switching from one camera interface to another always involves some integration effort within the system. This effort will pay off if the following criteria are met:
The hardware of the camera interface in use is gradually becoming more expensive and harder to obtain, as is currently the case with FireWire.
The bandwidth can no longer meet the requirements of modern vision systems (higher frame rates, higher resolution, or a different image format). This, for example, applies to FireWire (IEEE 1394b: max. 64 MB/s) and USB 2.0 (max. 40 MB/s).
Savings due to reduced manufacturing costs usually justify this one-time integration effort. This might apply when changing from FireWire or from frame grabber-based interfaces such as “analog” or “Camera Link base”.

The market offers a wide selection of industrial cameras with a USB 3.0 interface. What makes Basler's ace USB 3.0 so special is its extremely small footprint (29.3 mm x 29 mm x 29 mm) and its robustness. It is fully USB3 Vision compliant and, thanks to its 56 MB frame buffer, provides additional advantages such as stability and flexibility. A wide range of firmware features on the proprietary FPGA, for example image preprocessing through debayering, color improvement, or automatic gain, is an additional bonus. Useful features such as sequencing or short-term image buffering complement the list of benefits. Furthermore, the camera offers four input/output ports. What truly distinguishes the ace from its competitors is its extremely attractive price-performance ratio. The camera is based on our tried and tested ace platform, which has been on the market with GigE and Camera Link models since 2009 and ranks among the most successful cameras of its class worldwide.

While USB 3.0 describes the data interface in general, USB3 Vision is the uniform standard jointly defined by the leading companies in the machine vision industry. USB3 Vision-certified hardware and software are guaranteed to be fully compatible with each other and provide increased stability, and therefore long-term investment security, for customers. While USB 3.0 only covers the interface's basic requirements, USB3 Vision does a lot more: it also defines the mechanical requirements for accessories, camera control by the software via GenICam, and the data transfer method via bulk transfers. Generally speaking, USB3 Vision makes USB 3.0 fit for industrial use.

Application notes are available that provide a detailed description of bandwidth limitations we have seen when testing some of the USB 3.0 hardware available.

Click here to download the application notes in PDF format.

If you connect a Basler USB 3.0 camera to a USB 3.0 port on your PC for the first time, it may happen that it takes quite some time (e.g. 5 minutes) for your camera to be ready for use.

This will always be the case if your PC has internet access.

The following article explains why this happens and suggests a measure to prevent it.

This article assumes that you have already installed the Basler pylon 4.0 Beta software package (or newer) that is appropriate for the operating system of your PC (in our case Windows 7 x86; download from www.baslerweb.com), and that you are using a Basler USB 3.0 camera, a USB 3.0 host adapter with a Renesas xHCI chipset, and an appropriate USB 3.0 cable specified by Basler.

When you connect a Basler USB 3.0 camera to a USB 3.0 port on a PC for the first time, the Windows “Plug & Play Manager” will always check the Windows Update database for a better driver for your device. This check is done regardless of whether a pylon camera driver for your camera is already present on the PC.

If your PC has no internet access, this check is skipped relatively quickly.
If your PC does have internet access, however, checking Windows Update for a better driver may take up to 5 minutes or longer.

To prevent Windows from checking for driver updates for devices that already have an appropriate driver installed, you may consider the solution suggested below.

Go to “Start” -> “Control Panel” -> “System and Security” -> “System” (or simply press the “Windows” key and the “Pause” key simultaneously):

Now click on “Advanced system settings” and click on the “Hardware” tab:

Now click on “Device Installation Settings”:

Now select “No, let me choose what to do” and then select “Install driver software from Windows Update if it is not found on my computer”.

You could also select the third option in this list, but it will affect driver updates for all other components and devices on your system. Hence, we do not recommend doing so.

After selecting the second option in the list, click “Save Changes” and “OK” to complete the configuration.

In pylon4linux 2.3.3 you may get the following error message when trying to run the pylon viewer (or the SpeedOMeter), even though the environment variables PYLON_ROOT, GENICAM_ROOT_V2_1, and LD_LIBRARY_PATH were correctly exported:

"PylonViewerApp: symbol lookup error: /usr/lib/libQtNetwork.so.4:
undefined symbol: _ZN14QObjectPrivate15checkWindowRoleEv"

In this case, some of the Qt libraries, e.g. libQtNetwork, were loaded from "/usr/lib" while other Qt libraries were still loaded from the local pylon folder "pylon/bin".
Due to the mismatch between the different Qt versions, the pylon viewer fails to start.

To fix the problem, you have to remove all local Qt libraries and their corresponding links that the pylon viewer uses, which are located in "pylon/bin" (e.g. move them to a backup folder on the desktop or delete them permanently).
The libraries and links (16 in total) that should be removed are:

libQtCore*
libQtGui*
libQtNetwork*
libQtXml*

You will very likely get the following error messages when trying to compile the pylon SDK samples under Ubuntu 11.xx, Fedora 13, or any other Linux distribution based on the same platform revision:

/usr/bin/ld: AcquireSingleFrame.o: undefined reference to symbol
'GenICam::GenericException::what() const'
/usr/bin/ld: note: 'GenICam::GenericException::what() const' is defined
in DSO
/home/ringdahl/test_camera/pylon/genicam/bin/Linux64_x64/libGCBase_gcc40_v2_1.so
so try adding it to the linker command line
/home/ringdahl/test_camera/pylon/genicam/bin/Linux64_x64/libGCBase_gcc40_v2_1.so:
could not read symbols: Invalid operation

This problem is caused by a change in linker behavior in newer Linux distributions, e.g. Ubuntu 11.04 and 11.10, Fedora 13, etc.

Up to the above-mentioned revisions, the linker automatically searched for dependent libraries (indirect linking of shared library symbols). In the case of pylon, this dependent library is libGCBase_gcc40_v2_1.

From Ubuntu 11.04 onward, however, all shared libraries must be explicitly added to the linker flags on the command line in order to be found during linking.

Hence, in this given case, use the following command-line switch:

-lGCBase_gcc40_v2_1

More information on that can be found under:
https://wiki.ubuntu.com/NattyNarwhal/ToolchainTransition

If you want to access a GigE camera with a known IP address that is configured in a different subnet and connected to a router, you will normally fail, because such cameras are "hidden" from pylon and its broadcast UDP device discovery messages: routers "swallow" all broadcast messages.

Hence, a GigE camera connected behind a router normally cannot be discovered.

For this purpose, the "Add Remote GigE Camera" feature (press F9 in the pylon viewer) was implemented, so that users can discover a GigE camera configured in a different subnet and connected to a router if they know its IP address in advance. In this case, pylon sends a unicast UDP discovery message, discovers the camera, and enables you to configure it and grab images.
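If you already know the camera's IP address, the same can be done from the pylon C++ API by creating the device directly from a device info object instead of relying on broadcast discovery. The following is only a sketch: the class CBaslerGigEDeviceInfo and its SetIpAddress() method are assumed here to be provided by the pylon GigE transport layer, so please verify the exact names against the API documentation of your pylon version.

C++ sample code (sketch):
#include <pylon/PylonIncludes.h>
using namespace Pylon;
PylonInitialize();
CBaslerGigEDeviceInfo info;            // GigE-specific device info (assumed class name)
info.SetIpAddress( "192.168.1.100" );  // known IP address of the "remote" camera
// CreateDevice() addresses exactly this camera via unicast instead of broadcast discovery:
IPylonDevice* pDevice = CTlFactory::GetInstance().CreateDevice( info );
pDevice->Open();                       // configure the camera and grab images as usual
// ...
pDevice->Close();
CTlFactory::GetInstance().DestroyDevice( pDevice );
PylonTerminate();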

To test this feature (if you do not have a router at hand), you can take two PCs and a switch and configure them as follows:

PC 1:
• pylon application runs on PC1
• IP address of network adapter is 192.168.0.2 (255.255.255.0)
• gateway for that adapter is configured as 192.168.0.1
• network adapter is connected to a switch

PC 2:
• Windows XP, configured as router
(i.e. IP forwarding is enabled: Microsoft TechNet )
• adapter 1: 192.168.0.1, connected to the same switch as PC1
• adapter 2: 192.168.1.1
• camera: 192.168.1.100, camera is connected to adapter 2

If you are trying to run a GigE camera on a Linux system and get the following error message while trying to grab images, you may want to try the solution described below:

"Failed to allocate resources
Failed to set stream grabber property. SocketBufferSize not valid!
../../../../pylonsrc/Pylon/PylonTL_GigE/PylonGigE/GxStream.cpp : 1178"

At the start of image acquisition, pylon sets the SocketBufferSize to the maximum value supported by the given OS.
This value can, of course, also be changed at any time in the StreamGrabber node map.

During initialization, pylon checks the maximum supported value for the SocketBufferSize as reported by the system.
This value can also be retrieved with the command:
sysctl net.core.rmem_max

This value is then used as the maximum value for the SocketBufferSize in the Feature Properties in the pylon viewer (see the attached screenshot).
Note, however, that "sysctl" returns values in Bytes, whereas the node map works in KBytes.
Hence, the maximum value reported in the pylon viewer should be:
"sysctl net.core.rmem_max" / 1024.

If pylon fails to read out the maximum value for the SocketBufferSize for any reason, it falls back to a default value of 2048 KBytes.
Under some circumstances, however, this value may be too big and hence not supported by the given system. This in turn results in the error message above.

Hence, if you run into this error message, first check the current and maximum values for the SocketBufferSize reported in the Feature Properties in the pylon viewer (see the attached screenshot).

Then open a console and execute "sysctl net.core.rmem_max". Divide the reported value by 1024 and compare it with the maximum value reported in the pylon viewer.
If these two values differ, pylon has failed to read out the maximum system value, i.e. "net.core.rmem_max".

In this case, manually set the SocketBufferSize to a smaller, safe value, e.g. 64, and then try to grab images.
Keep in mind, though, that the SocketBufferSize should be as large as possible in order not to degrade performance.
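If pylon did not pick up the correct system maximum, the SocketBufferSize can also be set explicitly from your own application before image acquisition is started. The sketch below follows the GetStreamGrabberParams() access pattern used in the statistics samples further down; the member name SocketBufferSize mirrors the node name mentioned above and should be verified against your pylon version.

C++ sample code (sketch):
// "sysctl net.core.rmem_max" reports the OS limit in Bytes; divide by 1024 to get KBytes.
// Example: 212992 Bytes / 1024 = 208 KBytes usable maximum.
// Set a smaller, safe value (in KBytes) before starting the grab:
camera.GetStreamGrabberParams().SocketBufferSize.SetValue( 64 );  // assumed parameter name, value in KBytes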

When using the pylon GEV Filter Driver, you can retrieve various statistic parameters from the StreamGrabber. These parameters are very helpful for judging whether your setup (including the camera configuration) is correct and whether your hardware components are appropriate, appropriately configured, and performing well.
In other words, these parameters show you whether you are losing images, so that you can take measures to prevent it.

You can access these parameters either in the pylon viewer (pylon viewer -> Stream Parameters -> Statistic) or from the pylon API.
Note that in the pylon viewer, the statistic parameters are only available while continuously grabbing images. In the pylon API, these parameters are available after "StreamGrabber.PrepareGrab()" has been called and before "StreamGrabber.StopGrab()" is called.

In the Statistic section of the pylon viewer, the statistic parameters appear as listed below; this is their meaning:

- Total Buffer Count
Counts the number of received frames
Node Name: Statistic_Total_Buffer_Count

The Total Buffer Count counts all buffers with "status == succeeded" and "status == failed", i.e. all successfully grabbed and all incompletely grabbed (error code: 0xE1000014) buffers. This means that the Failed Buffer Count is also included in this number.

C++ sample code:
int64_t i = camera.GetStreamGrabberParams().Statistic_Total_Buffer_Count.GetValue();

- Failed Buffer Count
Counts the number of buffers with at least one failed packet
Node Name: Statistic_Failed_Buffer_Count

The Failed Buffer Count counts only buffers that were received with "status == failed", i.e. buffers that were incompletely grabbed (error code: 0xE1000014).

C++ sample code:
int64_t i = camera.GetStreamGrabberParams().Statistic_Failed_Buffer_Count.GetValue();

- Buffer Underrun Count
Counts the number of frames lost because there were no buffers queued to the driver.
Node Name: Statistic_Buffer_Underrun_Count

The Buffer Underrun Count is increased when an image was received but there were no free buffers queued in the driver's input queue, which causes the frame to be lost.

C++ sample code:
int64_t i = camera.GetStreamGrabberParams().Statistic_Buffer_Underrun_Count.GetValue();

- Total Packet Count
Counts the number of received packets
Node Name: Statistic_Total_Packet_Count

The Total Packet Count counts all packets ever seen, i.e. all successfully and unsuccessfully received packets.

C++ sample code:
int64_t i = camera.GetStreamGrabberParams().Statistic_Total_Packet_Count.GetValue();

- Failed Packet Count
Counts the number of failed packets (status != success)
Node Name: Statistic_Failed_Packet_Count

The Failed Packet Count counts packets that were received by the driver but reported as "failed" by the camera. The most common reason is that the data is the result of a packet-resend request that could not be satisfied by the camera because the requested data had already been overwritten by new image data in the camera's memory.

However, when a packet is simply missed by the PC, i.e. when both the original packet and the packet requested by a resend request are missed, it is not counted as a "failed packet". In such situations, the Failed Buffer Count is incremented instead.
This is why it is possible to have a Failed Buffer Count > 0 but a Failed Packet Count of zero.

C++ sample code:
int64_t i = camera.GetStreamGrabberParams().Statistic_Failed_Packet_Count.GetValue();

- Resend Request Count
Counts the number of emitted PACKETRESEND commands
Node Name: Statistic_Resend_Request_Count

The Resend Request Count counts the number of resend requests sent.
Note that the pylon GEV Filter Driver sends only one resend request upon detecting a missing packet. If the packet still cannot be retrieved, only the Failed Buffer Count is increased.
Also, if consecutive missing packets are detected (e.g. packets number 5, 6, 7, and 8 were missed), only one resend request is sent for all of the consecutive missing packets.
That is why the Resend Packet Count can be higher than the Resend Request Count.

C++ sample code:
int64_t i = camera.GetStreamGrabberParams().Statistic_Resend_Request_Count.GetValue();

- Resend Packet Count
Counts the number of packets requested by PACKETRESEND commands
Node Name: Statistic_Resend_Packet_Count

The Resend Packet Count counts the number of packets requested by resend requests.

C++ sample code:
int64_t i = camera.GetStreamGrabberParams().Statistic_Resend_Packet_Count.GetValue();

Note that if the GEV Filter Driver does not receive the "leader" (the packet announcing the beginning of a frame/image) of a given frame, it disregards the complete frame; no resend requests are sent and no statistic parameters are increased.
That is, if the leader is lost, the complete frame is lost and the user is not notified about it. This is why it is advisable to check the frame counter upon receiving images in order to detect lost frames.
Note that the GEV Performance Driver behaves differently in this regard.
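To evaluate these counters from your own application, you can read them while the grab is active, i.e. after PrepareGrab() has been called and before StopGrab(), as mentioned above. The following sketch simply combines the access pattern of the samples above; it assumes a camera object of the same type used there.

C++ sample code (sketch):
// Read the statistic nodes after grabbing a number of images:
int64_t totalBuffers   = camera.GetStreamGrabberParams().Statistic_Total_Buffer_Count.GetValue();
int64_t failedBuffers  = camera.GetStreamGrabberParams().Statistic_Failed_Buffer_Count.GetValue();
int64_t bufferUnderrun = camera.GetStreamGrabberParams().Statistic_Buffer_Underrun_Count.GetValue();
if ( failedBuffers > 0 || bufferUnderrun > 0 )
{
    // Frames were lost or incomplete: check the packet size, the inter-packet delay,
    // the NIC configuration, and make sure enough buffers are queued to the driver.
}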

Application notes are available that provide a detailed description of how to interface Basler GigE cameras with VisionPro 7.2 software from Cognex.
Click here to download the application notes in PDF format.

For some cameras, running the camera at a line rate near zero is not a problem. For example:
L304k, L304kc, L400k, and L800k cameras have no minimum required line rate when an external trigger (ExSync) signal is used to trigger line acquisition.
Keep in mind that for proper operation, the exposure time should be at least 10% of the line period. If these cameras are used in free run, there is a 10 Hz minimum line rate.
Runner cameras have no minimum line rate when an external line start trigger (ExLineStTrig) signal is used to trigger line acquisition.
However, if an external line start trigger is not used, there is a minimum line rate of 100 Hz.

On some cameras, there is an absolute minimum line rate. For example:
L100k, L301k, L301kc, and sprint cameras all have a minimum line rate of 1 kHz.

So why do some cameras have an absolute minimum line rate?

Basic Camera Principles
The principle of how a camera works is that during line exposure, photons from a light source strike the pixels in the camera's sensor and generate electrons. At the end of each line exposure, the electrons collected by each pixel are transported to an analog-to-digital converter. For each pixel, the converter provides a digital output signal that is proportional to the number of electrons collected by the pixel.
Below Minimum Line rates
If a camera is triggered at a rate below the specified minimum, it is much easier to fall into an over exposure situation. This happens due to an effect called "shutter inefficiency". The electronic shutter on digital cameras is not 100% efficient, and the pixels in the camera will collect some photons even when the shutter is closed. At very low line rates, you have long periods of time between exposures when the shutter is closed but the pixels are still collecting some photons and generating electrons. When the electrons collected with the shutter closed are added to the electrons collected during an exposure, the electrons can flood the electronics around the pixel.
After an Over Exposure
After an over exposure or with a trigger rate below 1 kHz, it takes several readout cycles to remove all the electrons from the pixels and the electronics. For this reason, gray values will be abnormally high during the first several readouts after an over exposure.

Solutions
Use a camera that can operate at line rates near zero such as the L304k, L304kc, L400k, and L800k.
If you use a camera with a higher specified minimum line rate:
Don't operate the camera below its minimum specified rate.
Design an application which accepts a few lines that are brighter than normal.
Run the camera in free-run mode and collect only the lines that you need.
Send dummy trigger signals to the camera and ignore the lines generated by the dummy triggers.

Application notes are available that provide a detailed description of how to interface Basler GigE cameras with Vision 8.2.1 Acquisition Software from National Instruments.
Click here to download the application notes in PDF format.

Application notes are available that provide a detailed description of how to interface Basler GigE cameras with Matrox MIL 8.0 Software.
Click here to download the application notes in PDF format.

Application notes are available that describe using pylon's DirectShow filter along with some common open source software to capture video from Basler cameras.
Click here to download the application notes in PDF format.
