Elliptic Labs Launches First SDK for Touchless Gesturing on Android Smartphones Using Ultrasound
At CEATEC, smartphones show touchless gesturing that responds to natural hand movements around and beyond the device screen.
CEATEC 2013, Japan and Palo Alto, CA — Elliptic Labs, the leader in ultrasonic touchless gesturing, today brings science fiction closer to reality with the launch of the first SDK for touchless gesturing on Android smartphones using ultrasound. Visitors to CEATEC in Tokyo on October 1-5 will see ultrasonic touchless gesturing demoed live for the first time on a smartphone at the Murata booth (Hall 2, Booth 2A72).
“Elliptic Labs delivers touchless gesturing in a natural way all around the screen of a smartphone or tablet, using the movements you use in daily life. Now, with our software SDK, we are giving smartphone manufacturers a way to easily and cost-effectively include consumer-friendly touchless gesturing in their phones,” said Laila Danielsen, CEO, Elliptic Labs. “Our technology is also great for playing games on smartphones. It uses little power, and its high resolution lets you play popular games such as Fruit Ninja, Subway Surfers, or any other game that requires high relative accuracy and speed.”
Equipped with tiny microphones, transducers, and Elliptic Labs software, Android smartphones can now use the ultrasound spectrum (above 20 kHz) to enable touchless gesturing. Sound waves sent from the device interact with a user’s hand, and it is this interaction that moves objects on the device screen. Accurate time-of-flight measurement and distributed sensing (capturing movement from multiple angles) enable true 3D interaction above, below, and to the side of the screen across 180 degrees. Android smartphone manufacturers now have an easy way to integrate touchless gesturing into their phones.
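The core idea behind ultrasonic time-of-flight sensing can be sketched in a few lines: the device emits an inaudible pulse, times how long the echo takes to return from the user's hand, and converts that round trip into a distance. The sketch below is purely illustrative and assumes nothing about the Elliptic Labs SDK's actual API; the function name and values are hypothetical.

```python
# Illustrative sketch of ultrasonic time-of-flight ranging (not the
# Elliptic Labs SDK API; names and values are hypothetical).

SPEED_OF_SOUND_M_PER_S = 343.0  # speed of sound in air at ~20 °C


def distance_from_echo(round_trip_s: float) -> float:
    """Estimate distance to a reflecting hand from an echo's round-trip time.

    The ultrasonic pulse travels out to the hand and back, so the
    one-way distance is half the total path the sound covers.
    """
    return SPEED_OF_SOUND_M_PER_S * round_trip_s / 2.0


# A hand about 10 cm from the device returns an echo after roughly 0.58 ms:
print(round(distance_from_echo(0.000583), 2))  # → 0.1
```

Combining such range estimates from several microphones at different positions (the "distributed sensing" mentioned above) is what allows the hand's position to be resolved in 3D rather than along a single axis.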
The Elliptic Labs Smartphone SDK will be available starting October 2, 2013.
About Elliptic Labs
Elliptic Labs is a global company and the world leader in AI virtual sensors for the smartphone, IoT, and automotive industries. We transform products using machine learning and sensor fusion, combining ultrasound with data from existing device sensors to produce smarter, greener, safer, and more intuitive devices. This allows us to eliminate the need for infrared, radar, and time-of-flight hardware sensors, saving OEMs component costs and freeing up design space. Our AI Virtual Smart Sensor Platform™ provides precise presence sensing and enables touch-free gesture controls such as scroll, approach, and double-tap — software that is now deployed in over 30 million devices. Going forward, we strive to empower more industries with our AI platform and build a stronger AI ecosystem together with the industry.
KimberPR for Elliptic Labs
+1 650 773 7288