http://appft1.uspto.gov/netacgi/nph-Parser?Sect1=PTO1&Sect2=...
Claims 4, 5, 6, and 7 are particularly relevant (a rough sketch of how broadly they could read follows the claim text):
4. The computer-implemented method of claim 1, wherein the indication of the user's reaction is identified from facial expressions of the user captured by an image capture device during the time period.
5. The computer-implemented method of claim 1, wherein the indication of the user's reaction is identified from user speech patterns captured by an audio capture device during the time period.
6. The computer-implemented method of claim 1, wherein the indication of the user's reaction is identified from gestures and body movements of the user captured by an image capture device during the time period.
7. The computer-implemented method of claim 1, further comprising storing the emotional state of the user in a database.
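To get a feel for how little it takes to arguably practice claims 4 and 7, here is a minimal sketch: grab a frame from an "image capture device" (a webcam), identify a facial expression, and store the resulting "emotional state" in a database. This is my own illustration, not the applicant's implementation; `classify_expression` is a hypothetical placeholder (a real system would run a trained model), and the database/table names are made up. It assumes the widely available OpenCV (`opencv-python`) package plus Python's standard `sqlite3`.

```python
import sqlite3
import time

import cv2  # OpenCV: pip install opencv-python


def classify_expression(face_img):
    """Hypothetical stand-in for an expression classifier (claim 4)."""
    return "neutral"  # a real system would run a trained model here


# Database for storing the user's emotional state (claim 7).
db = sqlite3.connect("emotions.db")
db.execute("CREATE TABLE IF NOT EXISTS reactions (ts REAL, emotion TEXT)")

# "Image capture device" (claim 4): the default webcam.
cap = cv2.VideoCapture(0)
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

ok, frame = cap.read()
if ok:
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Find faces in the frame and record an emotion label for each.
    for (x, y, w, h) in detector.detectMultiScale(gray, 1.3, 5):
        emotion = classify_expression(gray[y:y + h, x:x + w])
        db.execute("INSERT INTO reactions VALUES (?, ?)",
                   (time.time(), emotion))

db.commit()
cap.release()
```

Claims 5 and 6 would swap in an audio stream (speech patterns) or pose/gesture tracking for the face detector, but the overall capture-classify-store shape stays the same.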