Israeli company eyeSight, a producer of intuitive gesture control technology, has completed extensive work in partnership with ARM to optimise its gesture recognition solution for ARM’s Mali-T600 series Graphics Processing Units (GPUs). Manufacturers using ARM Mali GPUs can now offer eyeSight’s natural-feeling, advanced gesture control capabilities, using GPU Compute for improved robustness, accuracy and energy efficiency. Offloading gesture computation to the GPU also enables a variety of new use cases, such as face and emotion detection, long-distance finger tracking, and even 3D motion recognition (such as finger pointing for selection).
eyeSight’s engine, optimised for ARM Mali-T600 GPU Compute, enables gesture control in mobiles, tablets, TVs and a range of other devices. Products featuring ARM Mali GPUs with eyeSight’s gesture solution will let users control user interfaces (UIs) and content such as music and movies, activate applications, play games, or browse menus with simple yet powerful hand and fingertip-level gestures. With eyeSight’s technology, Mali-based devices can recognise a rich language of gestures, including directional gestures (such as up, right and wave), hand signs (such as a ‘thumbs-up’), and tracking of hands and even fingertips (with mouse-cursor accuracy).
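To make the idea of a directional gesture concrete, here is a minimal sketch, entirely hypothetical and not eyeSight’s API, of how a tracked hand trajectory might be classified into the up/down/left/right gestures described above. The function name, the `(x, y)` point format and the `min_travel` threshold are all assumptions for illustration.

```python
def classify_direction(points, min_travel=50):
    """Classify a list of (x, y) hand positions as a directional gesture.

    Returns 'left', 'right', 'up', 'down', or None if the hand did not
    travel far enough to count as a deliberate gesture.
    """
    if len(points) < 2:
        return None
    # Overall displacement from first to last tracked position
    dx = points[-1][0] - points[0][0]
    dy = points[-1][1] - points[0][1]
    if max(abs(dx), abs(dy)) < min_travel:
        return None  # too little movement to be a deliberate gesture
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    # Image coordinates: y grows downwards
    return "down" if dy > 0 else "up"

# Example: a hand tracked moving steadily to the right
print(classify_direction([(10, 100), (60, 102), (140, 98)]))  # right
```

A real engine would of course work on noisy per-frame detections and richer features, but the thresholding step, rejecting small jitters so only deliberate motion registers, reflects the robustness the article emphasises.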
ARM's Pete Hutton, executive vice president and general manager, Media Processing Division, observes, “The optimisation of gesture middleware solutions using Mali GPU Compute in combination with ARM Cortex-A processors using NEON technology... enables performance, accuracy, robustness and efficiency. Developers no longer need to worry about processing or ambient limitations as they create the gesture-enabled applications of the future.”
Low-quality video, whether from low-light conditions or slow CPUs, usually compromises gesture recognition. eyeSight’s technology is particularly efficient, and Mali GPUs take the processing load off the CPU. eyeSight’s video pre-processing sits between the camera and the gesture recognition engine, ‘cleaning up’ each frame so that the shapes and movement of hands and fingers can be recognised even when the video quality is poor.
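As a rough illustration of what such ‘cleaning up’ can involve, assuming nothing about eyeSight’s actual pipeline, the sketch below linearly stretches the contrast of a dim greyscale frame so that hand and finger edges stand out better for a downstream recogniser. Uniform per-pixel work of this kind is exactly what maps well onto GPU Compute.

```python
import numpy as np

def stretch_contrast(frame):
    """Linearly rescale a greyscale frame's pixel values to span 0-255."""
    frame = frame.astype(np.float32)
    lo, hi = frame.min(), frame.max()
    if hi == lo:
        # Flat frame: no detail to stretch
        return np.zeros_like(frame, dtype=np.uint8)
    out = (frame - lo) * (255.0 / (hi - lo))
    return out.astype(np.uint8)

# A dim, low-contrast frame: values squeezed into the 40..80 range
dim = np.linspace(40, 80, 16).reshape(4, 4)
bright = stretch_contrast(dim)
print(bright.min(), bright.max())  # 0 255
```

Production systems typically use more sophisticated steps (denoising, adaptive histogram equalisation), but the principle is the same: normalise the image before recognition so that low light degrades accuracy as little as possible.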