
This guide gives a step-by-step introduction to industrial hyperspectral imaging.



This document guides the reader through a complete hyperspectral imaging workflow, beginning with the setup of the components and ending with the inspection of a color stream that discriminates taught plastics.

To make this tour practical, the differentiation of three plastic plates by means of their molecular fingerprint is shown. In this example, hyperspectral line-scan technology (VNIR range) is used to differentiate plastic plates of white color.

Step 1 - Setup a hyperspectral system

Before you start, check that all components are available:

  • CCI-compliant hyperspectral camera
  • Illumination suitable for hyperspectral measurements
  • Calibration target
  • Perception Studio program
  • If real-time processing is needed: a CCI-compliant streaming adapter (e.g. an industrial PC running the Perception Core program)
  • A black marker (needed to set up the sharpness)
  • A well-absorbing background material is often helpful (e.g. black foam)
  • Reference material for first measurements (e.g. plastic parts)

Setup hardware and software

This section describes the setup of the hardware and software components for industrial hyperspectral imaging.

Setup the hardware components

Mount the camera and the lighting so that the camera's field of view is illuminated. It is good practice to use a black background material (e.g. black foam).

Hint: Make sure the illumination does not light the line of inspection from one side only. Shadows degrade the measurement quality. It is best to use diffuse illumination or to illuminate from two sides.
Be aware of the influence of alternating current on the measurement results. The alternation can cause a ripple in the measured intensity over time. Use an illumination powered by direct current to avoid this disturbance in the data.

Keep the minimum measurement distance of your camera in mind - if in doubt, ask your camera provider about this figure.
Hint: A distance smaller than this will result in unsharp camera images.

If streaming of molecular information is intended:
Connect the camera to a CCI-compliant streaming adapter (with Perception Core running on it). Connect your PC to the streaming adapter.

If streaming is not needed:
Connect the camera directly to your PC.


Measurement setup
Specim FX10 hyperspectral camera, 2-sided halogen illumination and a white tile in the camera's field of view.

Setup the software components

Make sure the camera is connected properly (see previous section).

→ Install the Perception Studio program on your PC.

  • If streaming of molecular information is intended:
    Download and install Perception Studio Inline on your PC.
  • If streaming is not needed:
    Download and install Perception Studio Offline on your PC.

→ Make sure the Perception Core program is running on the streaming adapter (e.g. on the industrial PC).

→ Start the Perception Studio program and make sure the hardware was detected properly.

Hint: If you are connected to the streaming adapter, the Perception Core icon is shown in the device section of the user interface. In the case of a direct connection to the camera, the camera vendor's icon is shown instead.


Connection to a device
Connected to FX10 33417: The camera FX10 with serial number 33417 is connected.
Not connected: No device found.

Setup the acquisition

Set up the optics and the acquisition parameters.

Setup the camera's acquisition parameters

→ Open Perception Studio's Setup perspective and start the live visualization of camera data.

→ Select the data dimension "spatial" and inspect the camera live data in the view.

→ Adjust the camera's acquisition parameters to find a proper saturation of the sensor.

Inspect the sensor's saturation for the "white" situation (light is reflected from the calibration target) vs. the "dark" situation (e.g. the cap is on the lens and blocks the optical path).
Make sure the signal dynamic is reasonably high. Avoid under- or over-saturated sensor regions.
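As a rough illustration, this saturation check can be sketched with NumPy. The array shape, the 12-bit depth and the 2 % "dark" threshold are assumptions for this sketch, not Perception Studio behaviour:

```python
import numpy as np

def saturation_report(frame, bit_depth=12):
    """Report under-/over-saturation statistics of a raw sensor frame.

    frame: 2-D array (spatial x spectral) of raw counts.
    bit_depth: assumed sensor bit depth (12 bit here).
    """
    full_scale = 2 ** bit_depth - 1
    over = float(np.mean(frame >= full_scale))          # clipped pixels
    under = float(np.mean(frame <= 0.02 * full_scale))  # nearly dark pixels
    dynamic = float((frame.max() - frame.min()) / full_scale)
    return {"over": over, "under": under, "dynamic": dynamic}

# A well-exposed synthetic frame: counts between 20 % and 80 % of full scale
rng = np.random.default_rng(0)
frame = rng.uniform(0.2, 0.8, size=(640, 224)) * (2 ** 12 - 1)
report = saturation_report(frame)
```

A frame with `over > 0` indicates clipped sensor regions; reduce the exposure until `over` is zero while keeping `dynamic` reasonably high.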


White and dark situation
White situation: The camera's shutter is open.
Dark situation: The camera's shutter is closed.

Setup the camera's image sharpness

→ Open Perception Studio's Setup perspective and start the live visualization of camera data.

→ Select the data dimension "spatial" and inspect the camera live data in the view.

→ Put a black marker into the camera's field of view (e.g. onto the calibration target) and inspect the obtained live data.

→ Adjust the optical setup (e.g. the lens) to achieve the maximum spatial sharpness.

Try to image the marker with the best possible contrast, so that the marker appears with the highest sharpness in the camera's image.


Sharpness ok vs. not ok
A black marker was put onto the calibration target (line of inspection).
Sharpness not ok: The fore lens needs to be adjusted.
Sharpness ok: The fore lens was set properly.

Setup the measurement

The interaction of light with objects is, in general, diverse. One part is reflected from the object, another part is absorbed by the object, and in the case of a thin object, one part is transmitted through the object.
In industry, molecular properties of objects are very often measured by studying the reflected light per wavelength relative to a known target.

Set up the system to measure relative to a target.

Setup the balancing

→ Open Perception Studio's Setup perspective and start the live visualization of camera data.

→ Put the calibration target into the camera's field of view and inspect the obtained camera data.

→ Perform dark balancing

→ Perform white balancing

When inspecting the "white" image, make sure the camera's sensor is saturated properly. Under- or over-saturated pixel regions will cause measurement errors.

Block the optical path of the camera (e.g. put the cap onto the fore lens) and click on Record Dark Image button.

Make sure the camera sensor is properly saturated, then click on the Record White Image button.

When done, check the "balanced" option at the top of the graph and inspect the balanced live data of the camera.
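Conceptually, white/dark balancing normalizes each raw measurement to relative reflectance: R = (raw − dark) / (white − dark), per pixel and wavelength. A minimal NumPy sketch of this idea; the array shapes and count values are illustrative, not the Perception Studio implementation:

```python
import numpy as np

def balance(raw, white, dark, eps=1e-9):
    """Convert raw counts to relative reflectance per pixel and wavelength.

    raw, white, dark: arrays of shape (spatial, spectral);
    white: calibration-target frame, dark: frame with blocked optics.
    """
    return (raw - dark) / (white - dark + eps)

spatial, bands = 640, 224
dark = np.full((spatial, bands), 100.0)    # sensor offset (dark reference)
white = np.full((spatial, bands), 3500.0)  # calibration target (white reference)
raw = dark + 0.5 * (white - dark)          # an object reflecting 50 %
reflectance = balance(raw, white, dark)
```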


Not balanced vs. balanced camera data
The calibration target was put on the line of inspection. The shutter of the FX10 camera was used to darken the optical path.
Not balanced: live data without balancing.
Balanced: balanced live data.

Influence of temperature
Balanced data captured directly after balancing is compared with data captured after the illumination cooled down.
Note: The illumination temperature has a great influence on the measurement quality.
Directly after balancing: Balanced camera live data captured directly after balancing.
Cooled down illumination (5 min.): Balanced camera live data captured after a 5-minute cooling-down period of the illumination.

Step 2 - Acquire hyperspectral data

This chapter guides you through the acquisition process of hyperspectral data.

Be sure your hyperspectral system was set up properly (e.g. see the previous chapter).

→ Open Perception Studio's Acquire perspective

→ Set the length of the planned acquisition process (number of frames to be captured over time).

→ Click on the Record button and wait until the system is ready for acquisition.

→ Confirm the start of the acquisition and move the measurement objects through the line of inspection.

→ Crop the record to keep the data size reasonable.

→ Add documentation to this measurement.

→ Save the record.


The acquisition process: Acquisition of three plastic plates
An acquired hyperspectral data scene of a hand holding three plastic plates.
The bounds of the scene were restricted to cover the object of interest.
On the left, a description was attached to the data set.

Three plastics (VNIR).hsd

Step 3 - Explore hyperspectral data

Explore the acquired data to get an idea of its quality and, e.g., of possible measurement errors.

→ Open Perception Studio's Explore perspective

→ Select the data on which the exploratory investigation should be performed.

→ Inspect the scene

→ Select objects of interest and inspect their spectral information

→ Learn about the influence of different preprocessing methods


The selection process: Selection of objects
The plastic plates as well as the hand and the background were selected by means of the Select Spectra tool.

The influence of preprocessing
Without preprocessing: Reflectance spectra of the scanned objects.
1st derivative of data: Preprocessing: 1st derivative. The spectral slope per wavelength of the scanned objects is shown.
Note: Bias information in the spectra is gone; instead, changes in the spectra (slope information) are visualized.
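The effect of the 1st-derivative preprocessing can be reproduced with a simple finite difference along the wavelength axis (production tools often use smoothed derivatives, e.g. Savitzky-Golay). This sketch only illustrates why bias information disappears; the synthetic spectra are assumptions:

```python
import numpy as np

def first_derivative(spectra, axis=-1):
    """Spectral slope per wavelength via finite differences.

    Removes constant bias (offsets) from spectra; slope information remains.
    """
    return np.diff(spectra, axis=axis)

wavelengths = np.linspace(400.0, 1000.0, 224)  # VNIR band centers (nm)
spectrum = 0.3 + 0.001 * wavelengths           # offset + linear slope
shifted = spectrum + 0.2                       # same shape, different offset
d1 = first_derivative(spectrum)
d1_shifted = first_derivative(shifted)
```

`d1` and `d1_shifted` are (numerically) identical: the offset difference between the two spectra disappears, only the slope information remains.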

Step 4 - Model the application-relevant information

→ Open Perception Studio's Model perspective

→ Select the data for the modelling process.

→ Choose one of the modelling methods available in the ribbon.

→ Develop a model and save it for later usage.

Depending on the application, different approaches to modelling information are available:

Model a colored perception (CCI)

Applying a CCI method results in a chemical color image which expresses chemical information of the observed objects as color information.


CCI-Preview method
The CCI-Preview method provides an unsupervised approach to information extraction from hyperspectral imaging data.
The 1st-derivative preprocessing was applied to the data beforehand. The spectra of the plastic plates are distinct from each other (different colors).

CCI-Extract method
The CCI-Extract method provides an unsupervised approach based on unscrambling principal spectral components from selected spectra.
The 1st-derivative preprocessing was applied to the data beforehand. The components PC1, PC2 and PC3 distinguish the spectra of the plastics well.
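The idea of mapping principal spectral components (PC1..PC3) to color channels can be approximated with a plain principal component analysis; the sketch below is an illustrative stand-in, not the actual CCI-Extract algorithm:

```python
import numpy as np

def pca_scores(spectra, n_components=3):
    """Project spectra onto their first principal components (PC1..PC3).

    spectra: (n_pixels, n_bands). Returns (n_pixels, n_components) scores
    that could be scaled into the RGB channels of a chemical color image.
    """
    centered = spectra - spectra.mean(axis=0)
    # SVD of the mean-centered data yields the principal axes in vt
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return centered @ vt[:n_components].T

rng = np.random.default_rng(0)
spectra = rng.normal(size=(500, 50))  # 500 pixel spectra, 50 bands
scores = pca_scores(spectra)
```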

CCI-Correlate method
The CCI-Correlate method correlates selected spectra to the scene and shows the correlation result per spectra set.
The 1st-derivative preprocessing was applied to the data beforehand. The correlation with the selected spectra discriminates the plastics well.
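The behaviour of a correlation-based method can be approximated by a normalized (Pearson) correlation of each pixel spectrum against taught reference spectra; the function and test signals below are an illustrative sketch, not the product implementation:

```python
import numpy as np

def correlate_to_references(spectra, references):
    """Pearson correlation of each pixel spectrum with each reference.

    spectra: (n_pixels, n_bands); references: (n_refs, n_bands).
    Returns (n_pixels, n_refs) scores in [-1, 1].
    """
    s = spectra - spectra.mean(axis=1, keepdims=True)
    r = references - references.mean(axis=1, keepdims=True)
    s = s / np.linalg.norm(s, axis=1, keepdims=True)
    r = r / np.linalg.norm(r, axis=1, keepdims=True)
    return s @ r.T

x = np.linspace(0.0, 1.0, 50)
ref_a = np.sin(2 * np.pi * x)  # taught spectra set A
ref_b = np.cos(2 * np.pi * x)  # taught spectra set B
pixel = ref_a + 0.01           # matches A up to a constant offset
scores = correlate_to_references(pixel[None, :], np.stack([ref_a, ref_b]))
```

Because the correlation removes mean offsets, the pixel still scores near 1.0 against reference A despite the constant shift.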

CCI-Constrain method
The CCI-Constrain method lets the user constrain colors according to their expectations about the chemical scene.
The 1st-derivative preprocessing was applied to the data beforehand. The result matches the user's expectations.

Model a classification

Applying a classification method results in a classification image which expresses a classification ID as color information.


Case Discrimination method
The Case Discrimination method allows a discrimination based on selected spectra sets. The 1st-derivative preprocessing was applied to the data beforehand.
In contrast to the CCI methods, the Case Discrimination method results in a class ID per taught material (e.g. per spectra set).
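A minimal stand-in for the class-ID idea: assign each pixel the ID of the taught spectra set with the smallest spectral distance to its mean spectrum. The actual Case Discrimination method may work differently; this sketch only illustrates the concept:

```python
import numpy as np

def classify(spectra, taught_sets):
    """Assign each pixel the ID of the nearest taught mean spectrum.

    spectra: (n_pixels, n_bands);
    taught_sets: dict class_id -> (n_samples, n_bands) spectra set.
    """
    ids = sorted(taught_sets)
    means = np.stack([taught_sets[i].mean(axis=0) for i in ids])
    # Euclidean distance of every pixel spectrum to every class mean
    dists = np.linalg.norm(spectra[:, None, :] - means[None, :, :], axis=2)
    return np.array(ids)[np.argmin(dists, axis=1)]

rng = np.random.default_rng(1)
taught = {
    1: rng.normal(0.2, 0.01, size=(20, 50)),  # e.g. "plastic A" spectra
    2: rng.normal(0.8, 0.01, size=(20, 50)),  # e.g. "plastic B" spectra
}
pixels = np.vstack([np.full((3, 50), 0.21), np.full((3, 50), 0.79)])
labels = classify(pixels, taught)
```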

Model a statistical feature

Applying one of these methods results in a gray-value image which expresses a statistical property by its pixel values.


Statistical feature method - dynamic
The statistical feature methods allow an extraction of information based on statistical methodology. The 1st-derivative preprocessing was applied to the data beforehand.
The spectral dynamic is extracted (the difference of the maximum and minimum value in the spectrum).
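The "dynamic" feature itself is simple to state: per pixel, the difference of the maximum and minimum value over the wavelength axis. A small NumPy sketch with synthetic spectra:

```python
import numpy as np

def spectral_dynamic(spectra, axis=-1):
    """Per-pixel spectral dynamic: max minus min over the wavelength axis."""
    return spectra.max(axis=axis) - spectra.min(axis=axis)

flat = np.full((4, 50), 0.5)                 # featureless spectra
sloped = np.linspace(0.1, 0.9, 50)[None, :]  # strong spectral variation
dyn_flat = spectral_dynamic(flat)
dyn_sloped = spectral_dynamic(sloped)
```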

Step 5 - Set up for live streaming

In this step, a CCI-compliant hardware adapter is set up to perform a modelled feature extraction in real time.

→ Open Perception Studio's Setup perspective and select the Perception Core to be the target device.

→ Make sure the hyperspectral system is set up properly.

→ Make sure the spectral range explained by your model fits the configured spectral region of interest.

→ Make sure the measurement system is calibrated, e.g. set up the balancing.

→ Scroll down the parameter list to the system's configuration panel and click on the Add button.

→ Give the new configuration a meaningful name, select the models to be applied and configure streaming options.

→ Activate the new configuration by clicking on the Active button.


Add a configuration
The name was set to "Three plastics" and "Correlate_1stDerivative" was selected as the model to be applied in real time.
The streaming is configured to be GigEVision compliant. The image is configured to be upsampled to a height of 1000 prior to transmission to the client application.

Systems configuration panel
The configuration "Three plastics" is available and activated.


Step 6 - View live data

In this step, the output of a CCI-compliant streaming adapter is viewed.

→ Open Perception Studio's View perspective and select the Perception Core to be the target device.

→ Activate the configuration of interest.

→ Click on the Play button and inspect the live stream.


Live view of a feature stream
The configuration "Three plastics" is activated. The model "Correlate_1stDerivative" is applied in real time and results in a color stream shown in the view.
Left: the three plastic plates were scanned in a crossed configuration (the plates partly overlap) - this results in a mixture of colors.
Right: the three plastic plates were scanned without overlap.

Step 7 - Connect your machine vision application

This step summarizes the communication of your machine vision application with the streaming adapter (Perception Core).

→ Set up the Perception Core for live streaming (see Step 5).

  • Streaming via UDP:
    Make sure to select the stream type "UDP" when adding a new configuration to the Perception Core.
    Specify the IP and port your application is listening on.
  • Streaming via GigEVision:
    Make sure to select the stream type "GEV" when adding a new configuration to the Perception Core.
    Specify the NIC (streaming adapter) which should receive the streamed data.
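On the receiving side, a UDP feature stream can be picked up with standard sockets. The host, port, and payload in this loopback sketch are placeholders for whatever is configured on the Perception Core:

```python
import socket

def open_listener(host="0.0.0.0", port=5000, timeout=2.0):
    """Bind a UDP socket on the IP/port configured on the Perception Core."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind((host, port))
    sock.settimeout(timeout)
    return sock

# Loopback demonstration: bind first, then push one datagram through.
listener = open_listener(host="127.0.0.1", port=50555)
sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sender.sendto(b"feature-line", ("127.0.0.1", 50555))
payload, addr = listener.recvfrom(65535)  # one datagram of streamed data
sender.close()
listener.close()
```

Parsing the payload (pixel format, line layout) is application specific and depends on the streaming options configured in Step 5.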

→ Prepare your application to receive live data

→ Optionally, prepare your application to control the Perception Core programmatically


GenICamBrowser program
The Perception Core is configured with the run job "Three plastics" (stream type = GigEVision).
Stemmer Imaging's GenICamBrowser program is used as the client.

On start of the GenICamBrowser, the device "Perception System" was selected. After pressing play, molecular information is extracted live from the camera raw data and shown in the program.


© 2019 by Perception Park GmbH
The content of this page and any attached files are confidential and intended solely for the addressee(s). Any publication, transmission or other use of the information by a person or entity other than the intended addressee is prohibited. If you receive this in error please contact Perception Park and delete the copied material. Perception Park GmbH, Wartingergasse 42, A-8010 Graz, Austria; FN 400381x