Artificially intelligent virtual helpers may have been all the rage this month with the launch of Google’s Assistant and Samsung’s play to acquire AI company Viv, but one company is looking to bring a different kind of smart to your phone.
While digital assistants with machine learning capabilities are changing the way we vocally interact with our smartphones, software company Qeexo is aiming to revolutionize the tangible side of things by making touch screens more – well – touch-friendly.
According to CEO Sang Won Lee, Qeexo believes there are better ways to interact with touch screens beyond the current multi-touch standard. That’s why the company has come up with two different solutions, FingerSense and TouchTools, which allow devices to recognize more parts of the hand and offer a wider range of actions beyond just unlocking a device or tapping an app.
Unlike Apple’s 3D Touch, which senses finger pressure, FingerSense is a touch platform that allows a device to distinguish between different touchscreen inputs, including fingertips, knuckles, nails and styluses. TouchTools, meanwhile, reads the pose of a user’s hand to call up virtual tools.
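To make the idea concrete, here is a minimal sketch of how an app might branch on a classified touch-input type. Qeexo’s actual FingerSense API is not public in this article, so the class, field names, touch-type labels and mapped actions below are purely illustrative assumptions, not the real interface.

```python
# Hypothetical sketch only: FingerSense's real API is not shown in the
# article, so every name and label here is an illustrative assumption.
from dataclasses import dataclass


@dataclass
class TouchEvent:
    x: float
    y: float
    touch_type: str  # assumed labels: "fingertip", "knuckle", "nail", "stylus"


def handle_touch(event: TouchEvent) -> str:
    """Map each recognized input type to a distinct action (all hypothetical)."""
    actions = {
        "fingertip": "select",       # an ordinary tap
        "knuckle": "screenshot",     # e.g. a knuckle-gesture shortcut
        "nail": "context_menu",
        "stylus": "draw",
    }
    # Unrecognized input types fall through to a no-op.
    return actions.get(event.touch_type, "ignore")


print(handle_touch(TouchEvent(10.0, 20.0, "knuckle")))  # -> screenshot
```

The point of the sketch is simply that once the platform classifies *what* touched the screen, each input type can carry its own meaning, rather than every contact counting as the same tap.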
“Multi-touch is just counting the number of fingers touching the screen, but we’ve proven to the market that there is a better way to utilize the touch screen,” Lee said. “TouchTool will provide another dimension by making the device smart enough to understand user intentions based on the shape of their hand.”
Similar to the virtual assistant introduced by Google, Lee said Qeexo’s solutions use machine learning to remember and adapt to a user’s touch. But unlike a digital assistant, Lee said the machine learning behind FingerSense and TouchTools runs on the device itself rather than on an outside server.
Though there is currently no crossover that would allow a virtual assistant to learn and respond to FingerSense or TouchTools gestures, Lee said the two technologies intersect in that Qeexo’s solutions can provide faster access to assistants via a simple gesture on the device screen.
According to Lee, TouchTools and FingerSense can also be applied in the home or in a vehicle to make user interactions with touch screens simpler and safer. In cars, for instance, Lee said TouchTools could give drivers safer access to their touchscreen’s voice-activated menu via a knuckle swipe.
“With analog buttons (in car consoles), you didn’t have to pay much attention to where your fingers were because you could just feel around and know where your hand was,” Lee said. “But when you’re interacting with a touch screen today there’s no way to find the right button without looking at the screen, which takes a driver’s attention away from the road.”
Lee said Qeexo is currently working with unspecified automakers to develop the best solution for vehicle applications, but the company’s FingerSense solution is already out in the smartphone market.
In August, FingerSense made its debut in the U.S. and international markets via Huawei’s Honor 8 smartphone. Lee said Qeexo is also working with Alibaba and noted the company’s solutions have received “a lot of attention” from Chinese manufacturers.
Another place to keep an eye out for FingerSense? Google’s Nexus.
Lee said Qeexo is working to enable FingerSense on Google devices and in particular is aiming to let developers try FingerSense APIs on the Nexus 6P. Lee declined to comment on whether Qeexo is working on anything else with the tech giant.
Though TouchTools was announced in July as a proof of concept, Lee said more news about TouchTools applications is coming at the Consumer Electronics Show in January.