TULIPP

TULIPP will develop a reference platform that defines implementation rules and interfaces to tackle power-consumption issues while delivering high, energy-efficient, and guaranteed computing performance for image processing applications.

Using this reference platform will enable designers to develop an elementary board at reduced cost that meets typical embedded systems requirements for Size, Weight and Power (SWaP). Moreover, for less constrained systems whose performance requirements cannot be fulfilled by a single instance of the platform, the reference platform will also be scalable, so that the resulting boards can be chained for higher processing power.

Underwater Time Of Flight Image Acquisition – UTOFIA

A new compact, cost-efficient concept for an underwater range-gated imaging system.

UTOFIA, an H2020 project (633098) that started in February 2015, will offer a compact and cost-effective underwater imaging system for turbid environments. Using range-gated imaging (Figure 1), the system will extend the imaging range by a factor of 2 to 3 over conventional video systems. At the same time, the system will provide video-rate 3D information.
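The gating principle can be illustrated with a small calculation: the camera's gate is opened only after the round-trip travel time of the laser pulse to the target range, which suppresses backscatter from the intervening water. A minimal sketch, not UTOFIA's actual implementation; the refractive index used for sea water is an approximation:

```python
# Illustrative sketch of gate-delay timing for range-gated underwater imaging.
# Constants and function name are assumptions for this example only.

C_VACUUM = 299_792_458.0   # speed of light in vacuum, m/s
N_WATER = 1.33             # approximate refractive index of sea water

def gate_delay_ns(target_range_m: float) -> float:
    """Round-trip time (in nanoseconds) for light to reach a target and return."""
    v_water = C_VACUUM / N_WATER          # light speed in water, m/s
    return 2.0 * target_range_m / v_water * 1e9

# A target 10 m away corresponds to a gate delay of roughly 89 ns.
delay = gate_delay_ns(10.0)
```

Opening the gate only around this delay means photons scattered by nearby water never reach the sensor, which is what extends the usable imaging range.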

CISTERN

CISTERN aims to research technologies for CMOS image sensors, time-of-flight sensors, zoom optics, multispectral imaging and real-time image processing algorithms that are needed for the next generation of products in several application domains.

Application domains that will be covered are:

  • Digital Lifestyle: Broadcast image systems for the first generation of Ultra High Definition Television and 3D entertainment systems.
  • High-end Security, Ultra High Definition (UHD) Surveillance Systems.
  • Multi-spectral imaging for specific applications like sorting in the food industry.

The SEERS Project

The major goal of SEERS is to develop a snapshot spectral imager in the IR range based on low-cost uncooled focal plane arrays (FPAs), with embedded processing capabilities. The targeted range includes near infrared (NIR), short wavelength infrared (SWIR), mid wavelength infrared (MWIR), and long wavelength infrared (LWIR).

A multisensor approach in a beamsplitting arrangement is to be adopted, combined with multi-aperture (MA) imaging and embedded processing for cognitive image fusion and image-based measurement. Video-rate performance, multispectral video analytics software (VAS) for the IR, and demonstration within a real video management software (VMS) framework under real operational conditions are important goals in SEERS.
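As a toy illustration of combining co-registered frames from several IR bands, a per-pixel weighted average is about the simplest possible fusion rule. This is a drastically simplified stand-in for the cognitive image fusion SEERS targets; the function and the band weights are invented for this example:

```python
# Minimal per-pixel weighted fusion of co-registered single-channel images.
# Images are lists of rows; weights are per-band scalars (illustrative only).

def fuse_bands(bands, weights):
    """Fuse same-sized images by a normalized weighted average, pixel by pixel."""
    total = sum(weights)
    height, width = len(bands[0]), len(bands[0][0])
    fused = [[0.0] * width for _ in range(height)]
    for image, weight in zip(bands, weights):
        for y in range(height):
            for x in range(width):
                fused[y][x] += weight * image[y][x]
    return [[value / total for value in row] for row in fused]

# Two 1x2 "images" from different bands, the second weighted twice as heavily:
fused = fuse_bands([[[0.0, 1.0]], [[1.0, 0.0]]], [1.0, 2.0])
```

A real fusion stage would adapt the weights to scene content rather than fixing them per band, but the data flow (align, weight, combine per pixel) is the same.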

CHEQUERS

In a world where explosive, toxic or otherwise lethal substances are, sadly, no longer restricted to theatres of war but are increasingly encountered in civilian areas (whether by misfortune or misadventure), the ability to detect and identify hazardous chemicals and compounds quickly, easily and at significant range is highly attractive. Even after a terrorist attack has occurred, significant danger remains from the threat of further concealed devices, severely impeding the rendering of aid while the scene is being declared safe.

I-ALLOW

The objective of the I-ALLOW project is to study, design, develop and test an innovative imaging solution that improves the safety of the population and of infrastructure in day-time, night-time and all-weather conditions.

Eyes of Things

“Eyes of Things” (EoT) is an international project focused on building the best embedded computer vision platform ever. EoT is an Innovation Action funded under the European Union’s H2020 Framework Programme that will run for three years, starting in January 2015.

SLOPE Project

The SLOPE project will integrate information from remote sensing and on-field surveying systems to support analyses that characterize forest resources. Spatial information will be integrated with multi-sensor data in a model for Sustainable Forest Management and for the optimization of logistics during forest operations.

PickNPack

The project will develop three types of modules, working closely together, that can cope with the typical variability of food products and with the sector’s requirements regarding hygiene, economics and adaptability:

  • A sensing module that assesses the quality of individual products or small batches before or after packaging.
  • A vision-controlled robotic handling module that picks up and separates the product from a harvest bin or transport system and places it in the right position in a package.
  • An adaptive packaging module that can accommodate various types of packaging with flexibility in terms of package shape, size, product environment, sealing and printing.


Clever Robots for Crops

CROPS was a large-scale integrating FP7 EU project in the theme Nanotechnologies, Materials and new Production Technologies, within the call Automation and robotics for sustainable crop and forestry management.

The project started in October 2010 and was completed in September 2014.

CROPS has developed scientific know-how for a highly configurable, modular and clever carrier platform that includes modular parallel manipulators and intelligent tools (sensors, algorithms, sprayers, grippers) that can be easily installed onto the carrier and are capable of adapting to new tasks and conditions. Several technological demonstrators have been developed for high-value crops such as greenhouse vegetables, fruits in orchards, and grapes for premium wines.

The CROPS robotic platform is capable of site-specific spraying (directing spray only towards foliage and selected targets) and selective harvesting of fruit (it detects the fruit, determines its ripeness, moves towards the fruit, grasps it and softly detaches it).

Another objective of CROPS was to develop techniques for reliable detection and classification of obstacles and other objects to enable successful autonomous navigation and operation in plantations and forests. The agricultural and forestry applications share many research areas, primarily regarding sensing and learning capabilities.
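The selective-harvesting decision flow (detect, assess ripeness, then act) can be sketched as a small planning step. The function name, data fields, ripeness threshold and ordering heuristic below are illustrative assumptions, not actual CROPS software:

```python
# Hypothetical sketch of a selective-harvesting planner: keep only fruit
# above a ripeness threshold, then order picks by distance to reduce
# manipulator travel. All names and values are invented for illustration.

def plan_harvest(detections, ripeness_threshold=0.8):
    """Return the ordered list of fruit ids the robot should pick."""
    ripe = [d for d in detections if d["ripeness"] >= ripeness_threshold]
    ripe.sort(key=lambda d: d["distance_m"])  # nearest ripe fruit first
    return [d["id"] for d in ripe]

plan = plan_harvest([
    {"id": "a", "ripeness": 0.90, "distance_m": 1.2},
    {"id": "b", "ripeness": 0.50, "distance_m": 0.4},  # unripe: skipped
    {"id": "c", "ripeness": 0.85, "distance_m": 0.7},
])
# plan is ["c", "a"]: both ripe fruits, nearest first
```

In the real system each planned pick would then drive the perception-guided approach, grasp and detach sequence described above.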