Brilliant! Not sure I like that particular implementation (with IR), but the idea of having the whole desktop keyboard work like a giant track pad is an enticing one.
Can a track pad surface be integrated into every keyboard key?
The problem with the implementation demonstrated in the video is that the gesture is recognised, and then the action happens.
There is no one-to-one mapping between the gesture in progress and action on screen, so I suspect it will not feel anywhere near as nice as a trackpad or touchscreen.
I think that could be improved with a higher resolution sensor, but the question is whether you could build such a thing while retaining space for physical keys.
IIRC, current gesture recognition still takes a couple hundred milliseconds on modern consumer touch screens (iPhones, etc.), especially for the "click" action.
A lot of plastic is already IR transparent but totally opaque in the visible spectrum. You can hack a cheap camera and take out the IR filter to make a neat "xray camera".
Maybe you should do that and write a paper about it. :)
My initial thoughts: 1) is there space underneath the key for such a sensor? 2) would the infrared light bounce off of the plastic on the key? 3) would transparent keys affect chicken-peckers (and other users) adversely?
Not saying it's not possible, although I imagine there will be challenges.
It should be able to be improved once it's in use and they have enough usage reports. For now I'm guessing there are too many false positives during typing itself, e.g. moving your hand toward the mouse or something else. There are also all those hunt-and-peck typists out there to think about, making weird and strange hovering motions. In the future, if they can map out enough false positives, they could potentially fix it.
Also, I'm sure someone with this tech could make use of a certain gesture that activates one-to-one style mapping for a certain period of time / until a button is pressed.
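That kind of momentary mode switch is easy to sketch. Here's a toy version in Python (the class name, timeout value, and event hooks are all made up for illustration): an activation gesture latches pointer mode for a fixed window, and any keystroke drops back to typing.

```python
import time

class PointerModeLatch:
    """Hypothetical latch: an activation gesture enables one-to-one
    trackpad-style mapping for a fixed window, or until a key is pressed."""

    def __init__(self, timeout_s=5.0, clock=time.monotonic):
        self.timeout_s = timeout_s
        self.clock = clock          # injectable clock, handy for testing
        self.activated_at = None

    def on_activation_gesture(self):
        # Start (or restart) the pointer-mode window.
        self.activated_at = self.clock()

    def on_key_press(self):
        # Any keystroke immediately returns the keyboard to typing mode.
        self.activated_at = None

    def active(self):
        if self.activated_at is None:
            return False
        return self.clock() - self.activated_at < self.timeout_s
```

Whether a timed window or an explicit end gesture feels better would come down to user testing.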
For that matter, even with a touch screen, a gesture is extracted/identified by touch algorithms (either in the touch controller or the SoC). Only later is an action taken. So there will definitely be an inherent delay. These algorithms involve classification methods similar to the ones mentioned in this demo.
But at least on a touch screen I can perform a partial gesture and see a partial result — i.e., the feeling is that the image zooms as my fingers pinch the screen, or the map pans as my fingers move. It's a very different feeling to the approach demonstrated in the video.
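That difference is easy to see in a sketch (Python, with a made-up frame format): with continuous mapping, every touch-move event updates the on-screen scale immediately, rather than waiting for a classifier to emit a finished "pinch" event.

```python
import math

def distance(p, q):
    """Euclidean distance between two touch points (x, y)."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def continuous_pinch_zoom(touch_frames, base_scale=1.0):
    """Update the zoom level on every frame of a two-finger pinch.

    touch_frames: sequence of (finger1, finger2) positions, one per event.
    Yields the scale after each frame, so a partial gesture produces a
    partial, immediately visible result -- no end-of-gesture wait.
    """
    initial = distance(*touch_frames[0])
    for f1, f2 in touch_frames:
        yield base_scale * distance(f1, f2) / initial

# Fingers spread from 100 px apart to 150 px apart:
frames = [((0, 0), (100, 0)), ((0, 0), (125, 0)), ((0, 0), (150, 0))]
scales = list(continuous_pinch_zoom(frames))  # 1.0, 1.25, 1.5
```

A recognize-then-act system, by contrast, would emit nothing until the whole spread was classified as a pinch, which is exactly the laggy feel being described.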
I always wanted this to exist. But of course, I never had the hardware skills to make one.
I have the same feeling as you about the IR thing. It makes me wonder: if the sensors are only BETWEEN keys, wouldn't that make the sensing inaccurate?
My HP laptop's mouse buttons are actually part of the trackpad itself. That is to say, you can even move your finger over them and it still functions as if you are using the trackpad. So, couldn't we extend this to the entire keyboard? It's just a really cool concept, having them merged as one. It wouldn't disrupt you each time you have to use the mouse. You could just do it right on the keyboard, without moving your hands all the way to the mouse or keypad!
What's wrong with the IR implementation? I guess they could introduce capacitive sensors as well, but I'd imagine IR + capacitive sensing would only make it more accurate.
IR makes sense mainly because there has to be a simple way to know that you're trying to type and not trying to move around the mouse/gesturing. Soon as your fingers are a certain height off the keyboard, it can know you're not typing. Otherwise you'd probably want to hold down SHIFT or something to indicate you want to use the mouse/gesture.
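A toy version of that height-threshold idea, in Python (the threshold value and the per-finger height readings are made up; a real sensor would be far noisier):

```python
def classify_mode(finger_heights_mm, hover_threshold_mm=8.0):
    """Decide typing vs. gesturing from sensed fingertip heights.

    finger_heights_mm: height of each tracked fingertip above the keycaps.
    If every finger is at or below the threshold, assume the user is
    typing; if the fingers hover clearly above it, switch to gesture mode.
    """
    if all(h <= hover_threshold_mm for h in finger_heights_mm):
        return "typing"
    return "gesture"

classify_mode([1.0, 2.5, 0.0])    # fingers resting on keys -> "typing"
classify_mode([12.0, 15.0, 9.5])  # fingers hovering -> "gesture"
```

In practice you'd probably also want hysteresis and a short debounce so the mode doesn't flicker between keystrokes.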
Not sure how that would work in practice, but I personally work with a cup of tea between my hands most of the time. That could interfere. Leap Motion was popular some time ago, and it detected the cup as such, not something bigger. I don’t know if they used IR though.
As you said, when interpreting non-obvious sensor outputs, combining several detection elements presumably helps: say, pressure and conductivity for trackpads. I believe 'Moves' (the app recently bought by Facebook) used both the phone's cell-tower signal to triangulate location and motion data to correct the occasional jump to the next neighbourhood when the nearest tower is suddenly busy.
For that ‘virtual trackpad’, using several frequencies or motion sensors might help; I’m not a big fan of your suggestion of a button: it seems… cumbersome, but it might actually work. I was made to realise today how I unconsciously used three- to four-button keystrokes most of the time.