New dev kit supports neuromorphic sensor

September 24, 2020 // By Nick Flaherty
Kit provides algorithms for neuromorphic image sensor
The development kit for Prophesee's neuromorphic sensor provides algorithms and code for event-driven vision processing, with both C++ and Python APIs as well as extensive documentation and a wide range of samples organized by increasing difficulty to incrementally introduce the fundamental concepts of event-based machine vision.
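The core concept those samples introduce is that an event camera emits a sparse stream of per-pixel brightness changes rather than full frames. A minimal NumPy sketch (illustrative only, not Metavision code; all names here are our own) shows the event data model and how events can be accumulated into a frame-like image:

```python
import numpy as np

# Each event is (t, x, y, p): a timestamp in microseconds, the pixel
# coordinates, and a polarity (+1 brightness increase, -1 decrease).
events = np.array(
    [(1000, 10, 12, 1), (1450, 10, 13, -1), (2100, 11, 12, 1)],
    dtype=[("t", "u8"), ("x", "u2"), ("y", "u2"), ("p", "i1")],
)

def accumulate(events, width, height):
    """Integrate a window of events into a 2D image of net polarity."""
    frame = np.zeros((height, width), dtype=np.int32)
    # np.add.at handles repeated hits on the same pixel correctly.
    np.add.at(frame, (events["y"], events["x"]), events["p"])
    return frame

frame = accumulate(events, width=16, height=16)
print(frame[12, 10])  # -> 1 (net brightness increase at pixel x=10, y=12)
```

Because only changing pixels produce events, the stream stays sparse and low-latency, which is the property the rest of the toolkit is built around.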

“We understand the importance of enabling the development ecosystem around event-based vision technology. This software toolkit is meant to accelerate engineers’ ability to take advantage of its unique benefits without having to start from scratch,” said Verre. “The tools offer productivity and learning features that are valuable regardless of where a development team is on the adoption curve of event-based vision and will jumpstart design projects with production ready design aids.”

Metavision Player provides a graphical user interface that allows engineers to visualize and record data streamed by Prophesee-compatible event-based vision systems, as well as read provided event datasets.

Metavision Designer is a tool that allows engineers to interconnect components for fast prototyping of event-based vision applications. It consists of a rich set of libraries, Python APIs, and code examples built for quick and efficient integration and testing.
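The component-pipeline idea behind that kind of prototyping tool can be sketched in plain Python. This is not the Designer API; the stage names and chaining helper below are our own illustration of connecting independent processing stages:

```python
# Events are (t, x, y, p) tuples; each stage consumes and produces a stream.

def polarity_filter(events, keep=1):
    """Pass only events with the given polarity."""
    return [e for e in events if e[3] == keep]

def roi_filter(events, x_max, y_max):
    """Pass only events inside a region of interest."""
    return [e for e in events if e[1] < x_max and e[2] < y_max]

def counter(events):
    """Terminal stage: report how many events survived the chain."""
    return len(events)

def run_pipeline(events, *stages):
    """Feed the output of each stage into the next."""
    for stage in stages:
        events = stage(events)
    return events

stream = [(1000, 5, 5, 1), (1200, 40, 5, 1), (1500, 5, 6, -1)]
n = run_pipeline(
    stream,
    lambda ev: polarity_filter(ev, keep=1),
    lambda ev: roi_filter(ev, x_max=32, y_max=32),
    counter,
)
print(n)  # -> 1 event passes both filters
```

Swapping, reordering, or adding stages changes the application without touching the stages themselves, which is what makes this style of composition fast for prototyping.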

The Metavision SDK provides access to the algorithms via APIs, ready for taking event-based vision applications to production. The algorithms are coded in C++ and, in the free license version, are available as pre-compiled Windows and Linux binaries.

The Metavision Intelligence suite is available both as a time-unlimited free trial and as a professional version, which provides access to source code, advanced modules, revision updates, full documentation, and support.

The fourth generation sensor is small enough to use in a mobile phone, says Verre. “In mobile, localisation mapping for augmented reality (AR) requires low latency understanding of the position of the device in space particularly for AR that needs precise positioning, and the robustness to lighting with 140dB range so you can handle lighting conditions outdoors,” he said.
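As a quick sanity check on that 140 dB figure (our own arithmetic, using the 20·log10 convention standard for image-sensor dynamic range):

```python
# Dynamic range in dB relates to the linear brightness ratio as
#   DR_dB = 20 * log10(I_max / I_min)
# so 140 dB corresponds to a scene-brightness span of:
ratio = 10 ** (140 / 20)
print(f"{ratio:.0e}")  # -> 1e+07, i.e. roughly ten million to one
```

A span of about 10^7:1 is what lets the same sensor resolve detail in deep shadow and direct sunlight within one scene, hence the outdoor-robustness claim.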
