The idea that a keyboard used with a finger is even Patentable???? How is this even close to being something that needs to be protected? So the visualization of anything is a patent-worthy idea? There has to be prior art in the hundreds of touch-typing systems?
It is so frustrating knowing the Prior work of Microsoft Surface before the iPhone with multi-touch and pinch to zoom.
I would love for all Android phones to just stop being sold in protest immediately, though that would never happen. The very idea that this was found invalid in September and has now passed through smells of some outside pressure.
> The idea that a keyboard used with a finger is even Patentable???? ...
This patent isn't for that.
> It is so frustrating knowing the Prior work of Microsoft Surface before the iPhone with multi-touch and pinch to zoom.
I assume any work published prior to the priority date of the patent (could be as early as April 2007 based on the filing date) would have been presented by those who requested the re-examination.
> ... The very idea that this was found invalid in September and has now passed through smells of some outside pressure.
This patent has never been decided invalid. The First Office Action (which suggested invalidity) is the very first stage of re-examination and is issued before there is any defence of the patent; claims suggested invalid at that stage are frequently later decided to be valid. I also don't think that the stage that has just determined the patent is valid is the final stage of the process (despite being called "final").
Note also that, as I read the claims [IANALATISNLA], if your touchscreen device naively translates single or multiple touches, without using heuristics based on the initial movement to decide whether it is scrolling/rotating/something else, then this patent would not affect you.
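To make concrete what "heuristics based on the initial movement" means here, below is a toy sketch (my own naming and thresholds, not the patent's actual algorithm): the device looks only at the initial stroke of each contact and commits to one gesture, instead of translating every touch 1:1.

```python
import math

def classify_gesture(touches):
    """Toy gesture-dispatch heuristic (illustrative only).

    `touches` is a list of (start, current) point pairs, one per finger.
    The decision uses only the initial movement of each contact.
    """
    if len(touches) == 1:
        (x0, y0), (x1, y1) = touches[0]
        # Angle of the initial stroke relative to the horizontal axis.
        angle = math.degrees(math.atan2(abs(y1 - y0), abs(x1 - x0)))
        # A mostly-vertical initial stroke locks the view to 1-D scrolling.
        return "vertical-scroll" if angle > 63 else "free-pan"
    if len(touches) == 2:
        (a0, a1), (b0, b1) = touches
        # Contacts separating or converging beyond a tolerance read as a zoom.
        d_start = math.dist(a0, b0)
        d_now = math.dist(a1, b1)
        return "pinch-zoom" if abs(d_now - d_start) > 10 else "two-finger-pan"
    return "ignored"
```

A device that instead mapped each finger's motion directly to the content, with no such initial-movement classification step, would (on this reading) fall outside the claims.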
I highly doubt that. I have a FingerWorks TouchStream LP that (while I no longer use it) performed the exact same gestures we all recognize as "swipes" on capacitive touchscreen phones/tablets. As evidence I point you to the founder's dissertation on the topic from 1999: http://www.eecis.udel.edu/~westerma/main.pdf
The only difference between how it worked on a TouchStream LP and a modern capacitive touchscreen is that the touchscreen has a "screen". Other than that they're absolutely identical.
Actually, the TouchStream LP had capabilities that exceed today's capacitive touchscreens. For example, it supported ten-finger chording and could recognize which finger was which. So if I pressed my index and middle fingers (say, with both hands) in a certain region of the keyboard it would recognize that and send whatever pre-programmed sequence of keystrokes or mouse movements I wanted.
It was awesome technology! It only had one (big) flaw though: No tactile feedback.
You take it for granted, but the "click" you feel when you press a key on your keyboard is very, very important to typing quickly and efficiently. Without it you have to constantly ensure your hands & fingers are properly positioned. This is why autocorrect is so important with software keyboards (if you're tapping instead of swiping). Without it--if you tried to type really quickly without thinking about what is actually being output from the IME--you'll end up with typos all over the place.
Another thing we take for granted is the ability to "feel" a key as we press it. On a capacitive touch surface there's no way to do that without accidentally typing that key. This is why I have high hopes for that technology that produces little bubbles on top of touchscreens on-demand to provide tactile feedback as to key locations.
So that is the reason why I eventually gave up on the TouchStream LP. While it was the greatest mousing device I've ever used (and I've used HUNDREDS) it was not that great of a keyboard. I'd be happy to use it day-to-day for typing up English words and sentences, but it was just too error-prone to use in a bash shell or for programming. Even with the "programmer pad" feature (which let you enter keys like braces, brackets, parens, etc. using handy gestures) it still resulted in too many typing mistakes--even after training myself and using it regularly for a year.
Wait, it could identify which fingers were in use? Surely there was some sort of limit on this in some way. I find it doubtful that I could put my two index fingers on the surface somewhere and the system would know that was my two index fingers as opposed to any two of the rest of my fingers.
What you describe is more like counting touch contacts within a certain space on the surface and reacting to that regardless of which fingers were which.
While the parent comment is completely wrong that the patent was acquired from them (it is actually an Apple patent), FingerWorks is mentioned in the "Other references" section (that is, references to things that aren't patents):
PR Newswire, "FingerWorks Announces a Gesture Keyboard for Apple PowerBooks," Jan. 27, 2004, 2 pages.
PR Newswire, "FingerWorks Announces the ZeroForce iGesture Pad," Feb. 18, 2003, 2 pages.
I haven't studied the cited patents; some of them may also refer to FingerWorks.