
Please understand what prior art means before talking about it again.

And Apple acquired these patents from FingerWorks, which predated Microsoft Surface by nearly a decade.




That patent has nothing to do with FingerWorks.


I highly doubt that. I have a FingerWorks Touchstream LP that (while I don't use it anymore) performed the exact same gestures we all recognize as "swipes" on capacitive touchscreen phones/tablets. As evidence I point you to the founder's dissertation on the topic from 1999: http://www.eecis.udel.edu/~westerma/main.pdf

The only difference between how it worked on a Touchstream LP and a modern capacitive touchscreen is that the touchscreen has a "screen". Other than that they're absolutely identical.

Actually, the Touchstream LP had capabilities that exceeded those of today's capacitive touchscreens. For example, it supported ten-finger chording and could recognize which finger was which. So if I pressed my index and middle fingers (say, with both hands) in a certain region of the keyboard, it would recognize that and send whatever pre-programmed sequence of keystrokes or mouse movements I wanted.
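
To make the chording idea concrete, here's a minimal sketch (Python, nothing to do with FingerWorks' actual firmware) of what a chord-to-macro lookup like that could look like. The hand/finger/region labels and the send_keys helper are made up for illustration:

    # Hypothetical chord table: (hand, set of fingers, pad region) -> key sequence.
    CHORD_MACROS = {
        ("left", frozenset({"index", "middle"}), "top-left"): ["Ctrl", "C"],
        ("left", frozenset({"index", "middle"}), "top-right"): ["Ctrl", "V"],
    }

    def send_keys(keys):
        # Stand-in for whatever would actually emit keystrokes to the host.
        print("sending:", "+".join(keys))

    def handle_chord(hand, fingers, region):
        macro = CHORD_MACROS.get((hand, frozenset(fingers), region))
        if macro:
            send_keys(macro)

    handle_chord("left", ["index", "middle"], "top-left")  # prints: sending: Ctrl+C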

It was awesome technology! It only had one (big) flaw though: No tactile feedback.

You take it for granted, but the "click" you feel when you press a key on your keyboard is very, very important to typing quickly and efficiently. Without it you have to constantly make sure your hands & fingers are properly positioned. This is why autocorrect is so important with software keyboards (if you're tapping instead of swiping). Without it--if you try to type really quickly without thinking about what is actually being output from the IME--you'll end up with typos all over the place.

Another thing we take for granted is the ability to "feel" a key as we press it. On a capacitive touch surface there's no way to do that without accidentally typing that key. This is why I have high hopes for that technology that produces little bubbles on top of touchscreens on-demand to provide tactile feedback as to key locations.

That is the reason I eventually gave up on using the Touchstream LP. While it was the greatest mousing device I've ever used (and I've used HUNDREDS), it was not that great a keyboard. I'd have been happy to use it day-to-day for typing up English words and sentences, but it was just too error-prone to use in a bash shell or for programming. Even with the "programmer pad" feature (which let you enter keys like braces, brackets, parens, etc. using handy gestures) it still resulted in too many typing mistakes--even after training myself and using it regularly for a year.


Wait, it could identify which fingers were in use? Surely there was some sort of limit on this. I find it doubtful that I could put my two index fingers on the surface somewhere and the system would know those were my two index fingers as opposed to any other two of my fingers.

What you describe is more like counting touch contacts within a certain space on the surface and reacting to that regardless of which fingers were which.
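
For what it's worth, the distinction is easy to state in code. A plain contact count per region (roughly what you're describing, and roughly what touchscreen stacks report) looks something like this sketch; the coordinates and region box are invented for illustration:

    # Count touch contacts inside a region, with no idea which fingers they are.
    def contacts_in_region(contacts, region):
        x_min, y_min, x_max, y_max = region
        return [(x, y) for (x, y) in contacts
                if x_min <= x <= x_max and y_min <= y <= y_max]

    touches = [(120, 40), (150, 42), (480, 200)]
    print(len(contacts_in_region(touches, (100, 0, 200, 100))))  # 2

Actual per-finger identification would need something on top of that (e.g. matching contacts against a hand geometry model), which is exactly the part being questioned here.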


>I highly doubt that.

Have you read the actual application and the references?


While the parent comment is completely wrong that the patent was acquired from them (it is actually an Apple patent), FingerWorks is mentioned in the "Other references" section (that is, references to things that aren't patents):

PR Newswire, "FingerWorks Announces a Gestrue Keyboard for Apple PowerBooks," Jan. 27, 2004, 2 pages. cited by other .

PR Newswire, "FingerWorks Announces the ZeroForce iGesture Pad," Feb. 18, 2003, 2 pages. cited by other .

I haven't studied the cited patents; some of them may also refer to FingerWorks.



