DETAILED ACTION
Notice of Pre-AIA or AIA Status
1. The present application is being examined under the pre-AIA first to invent provisions.
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of pre-AIA 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(b) the invention was patented or described in a printed publication in this or a foreign country or in public use or on sale in this country, more than one year prior to the date of the application for patent in the United States.
2. Claims 1-2 and 11-12 are rejected under pre-AIA 35 U.S.C. 102(b) as being anticipated by Hongo (US Patent/PGPub. No. 6677969).
Regarding Claim 1, (Original)
Hongo teaches
an electronic device (Col. 3, Ln. 23-32, FIG. 1-2, i.e. instruction recognition system A), comprising:
a first sensing device (Col. 3, Ln. 23-32, FIG. 1-2, i.e. face orientation detection section 14) configured to generate first gesture data (Col. 3, Ln. 51-57, FIG. 1-2, i.e. image data);
one or more second sensing devices (Col. 3, Ln. 23-32, FIG. 1-2, i.e. line-of-sight detection section 12) configured to generate gaze data (Col. 3, Ln. 37-45, FIG. 1-2, i.e. line of sight); and
a processor (Col. 3, Ln. 49-50, FIG. 1-2, i.e. CPU which performs predetermined processing) coupled to (FIG. 1-2, i.e. as shown by the figure(s)) the first sensing device and the one or more second sensing devices, the processor (i.e. please see above citation(s)) programmed to:
process the first gesture data (i.e. please see above citation(s)) to detect a gesture (Col. 3, Ln. 51-57, FIG. 1-2, i.e. recognizes the nodding action); and
process the gesture in accordance with the gaze data (i.e. please see above citation(s)) to perform one or more operations (Col. 4, Ln. 4-9, FIG. 1-2, i.e. the line of sight of the user is detected when the user has performed nodding action, and an icon toward which the line of sight is oriented is determined to be selected) on the electronic device, the one or more operations (i.e. please see above citation(s)) related to a location (Col. 4, Ln. 4-9, FIG. 1-2, i.e. icon toward which the line of sight is oriented) at which a gaze is directed (i.e. please see above citation(s)).
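For illustration only, the flow mapped above — a recognized nodding gesture gates selection of the icon toward which the line of sight is oriented — can be sketched as follows. This is not code from Hongo; all names, coordinates, and the selection radius are hypothetical.

```python
# Hypothetical sketch of gesture-gated, gaze-directed selection.
# The nod acts as the trigger; the gaze point picks the target icon.
from dataclasses import dataclass

@dataclass
class Icon:
    name: str
    x: float
    y: float

def icon_at_gaze(icons, gaze_x, gaze_y, radius=50.0):
    """Return the icon nearest the gaze point, if any lies within `radius`."""
    best, best_d2 = None, radius * radius
    for icon in icons:
        d2 = (icon.x - gaze_x) ** 2 + (icon.y - gaze_y) ** 2
        if d2 <= best_d2:
            best, best_d2 = icon, d2
    return best

def process_input(nod_detected, gaze_point, icons):
    """Select the icon at the gaze location only when a nod was recognized."""
    if not nod_detected:
        return None
    return icon_at_gaze(icons, *gaze_point)
```

A gaze point near an icon selects it only when the nod is present; without the gesture, or with the gaze directed away from all icons, nothing is selected.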
Regarding Claim 2, (Original)
Hongo teaches
the electronic device of claim 1, the processor further programmed (i.e. please see above citation(s)) to:
determine that the gesture (i.e. please see above citation(s)) is time-aligned (Col. 2, Ln. 23-32, FIG. 1-2, i.e. line of sight … corresponds to a gazing state where eyes remain stable) with the gaze data (i.e. please see above citation(s)); and
in accordance with a determination (Col. 2, Ln. 23-32, FIG. 1-2, i.e. detected line of sight is determined) that the gesture is time-aligned with the gaze data (i.e. please see above citation(s)), perform the one or more operations (Col. 4, Ln. 4-9, FIG. 1-2, i.e. icon … to be selected) related to the location at which the gaze is directed (i.e. please see above citation(s)).
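The time-alignment condition recited in claim 2 — performing the operation only when the gesture coincides with a stable gazing state — reduces to an interval-overlap check. The sketch below is illustrative only and not taken from Hongo; the interval representation is an assumption.

```python
# Hypothetical time-alignment check: the gesture and the gazing state
# are each modeled as (start, end) time intervals in seconds.
def intervals_overlap(a_start, a_end, b_start, b_end):
    """True when the two half-open time intervals overlap."""
    return a_start < b_end and b_start < a_end

def select_if_time_aligned(gesture_interval, gaze_interval, icon):
    """Perform the selection only when the gesture is time-aligned
    with the gazing state (eyes remaining stable on the target)."""
    if intervals_overlap(*gesture_interval, *gaze_interval):
        return icon
    return None
```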
Regarding Claim 11, (Original)
Hongo teaches
a method (Col. 1, Ln. 17-20, FIG. 1-2, i.e. interaction processing method), comprising:
at an electronic device (Col. 3, Ln. 23-32, FIG. 1-2, i.e. a decision-of-selection determination section 16; a decision-of-selection execution section 17) in communication with a display (Col. 3, Ln. 23-32, FIG. 1-2, i.e. display section 20):
receiving first gesture data (Col. 3, Ln. 51-57, FIG. 1-2, i.e. image data) from a first sensing device (Col. 3, Ln. 23-32, FIG. 1-2, i.e. face orientation detection section 14);
receiving gaze data (Col. 3, Ln. 37-45, FIG. 1-2, i.e. line of sight) from one or more second sensing devices (Col. 3, Ln. 23-32, FIG. 1-2, i.e. line-of-sight detection section 12);
processing the first gesture data (i.e. please see above citation(s)) to detect a gesture (Col. 3, Ln. 51-57, FIG. 1-2, i.e. recognizes the nodding action); and
processing the gesture in accordance with the gaze data (i.e. please see above citation(s)) to perform one or more operations (Col. 4, Ln. 4-9, FIG. 1-2, i.e. the line of sight of the user is detected when the user has performed nodding action, and an icon toward which the line of sight is oriented is determined to be selected) on the electronic device, the one or more operations (i.e. please see above citation(s)) related to a location (Col. 4, Ln. 4-9, FIG. 1-2, i.e. icon toward which the line of sight is oriented) at which a gaze is directed (i.e. please see above citation(s)).
Regarding Claim 12, (Original)
Hongo teaches
the method of claim 11, further comprising:
determining that the detected gesture (i.e. please see above citation(s)) is time-aligned (Col. 2, Ln. 23-32, FIG. 1-2, i.e. line of sight … corresponds to a gazing state where eyes remain stable) with the gaze data (i.e. please see above citation(s)); and
in accordance with a determination (Col. 2, Ln. 23-32, FIG. 1-2, i.e. detected line of sight is determined) that the detected gesture is time-aligned with the gaze data (i.e. please see above citation(s)), performing the one or more operations (Col. 4, Ln. 4-9, FIG. 1-2, i.e. icon … to be selected) related to the location at which the gaze is directed (i.e. please see above citation(s)).
Claim Rejections - 35 USC § 103
The following is a quotation of pre-AIA 35 U.S.C. 103(a) which forms the basis for all obviousness rejections set forth in this Office action:
(a) A patent may not be obtained though the invention is not identically disclosed or described as set forth in section 102, if the differences between the subject matter sought to be patented and the prior art are such that the subject matter as a whole would have been obvious at the time the invention was made to a person having ordinary skill in the art to which said subject matter pertains. Patentability shall not be negated by the manner in which the invention was made.
3. Claims 3 and 13 are rejected under pre-AIA 35 U.S.C. 103(a) as being unpatentable over Hongo (US Patent/PGPub. No. 6677969) in view of Beymer et al. (US Patent/PGPub. No. 20050243054).
Regarding Claim 3, (Original)
Hongo teaches
the electronic device of claim 1.
However, Hongo does not explicitly teach
the one or more operations comprise modifying one or more objects displayed at the location at which the gaze is directed.
In the same field of endeavor, Beymer et al. teach
the one or more operations ([0019], FIG. 1-2, i.e. highlights) comprise modifying one or more objects ([0019], FIG. 1-2, i.e. object) displayed at the location at which the gaze is directed ([0019], FIG. 1-2, i.e. one object is located near the gaze position … darkening/lightening the object's color).
It would have been obvious to a person having ordinary skill in the art at the time the invention was made to combine Hongo's device, which detects gestures and eye gaze to perform operations, with Beymer et al.'s teaching of modifying an object displayed at the location at which the gaze is directed, in order to effectively and easily improve the user/machine interface (Beymer et al., [0019]).
Regarding Claim 13, (Original)
Hongo teaches
the method of claim 11.
However, Hongo does not explicitly teach
the one or more operations comprise modifying one or more objects displayed at the location at which the gaze is directed.
In the same field of endeavor, Beymer et al. teach
the one or more operations ([0019], FIG. 1-2, i.e. highlights) comprise modifying one or more objects ([0019], FIG. 1-2, i.e. object) displayed at the location at which the gaze is directed ([0019], FIG. 1-2, i.e. one object is located near the gaze position … darkening/lightening the object's color).
It would have been obvious to a person having ordinary skill in the art at the time the invention was made to combine Hongo's method, which detects gestures and eye gaze to perform operations, with Beymer et al.'s teaching of modifying an object displayed at the location at which the gaze is directed, in order to effectively and easily improve the user/machine interface (Beymer et al., [0019]).
4. Claims 4-10 and 14-20 are rejected under pre-AIA 35 U.S.C. 103(a) as being unpatentable over Hongo (US Patent/PGPub. No. 6677969) in view of Wu et al. (US Patent/PGPub. No. 20050052427).
Regarding Claim 4, (Original)
Hongo teaches
the electronic device of claim 1.
However, Hongo does not explicitly teach
detecting the gesture comprises detecting movement of a first hand.
In the same field of endeavor, Wu et al. teach
detecting the gesture ([0018], FIG. 1, i.e. detect hand gestures) comprises detecting movement of a first hand ([0034], FIG. 5, i.e. gesture … with any two fingers 501).
It would have been obvious to a person having ordinary skill in the art at the time the invention was made to combine Hongo's device, which detects gestures and eye gaze to perform operations, with Wu et al.'s teaching of detecting a gesture of one hand, in order to effectively and easily interact with the user interface (Wu et al., [0034]).
Regarding Claim 5, (Original) the electronic device of claim 4,
Wu et al. teach
the processor further programmed to process ([0018], FIG. 1, i.e. perform computer operations) the first gesture data to detect the gesture being additionally performed by a second hand ([0034], FIG. 5, i.e. "writing" or "drawing" 502 with the other hand 503).
Regarding Claim 6, (Original) the electronic device of claim 5,
Wu et al. teach
the processor further programmed to detect the gesture by detecting movement of the first hand and second hand relative to each other ([0039], FIG. 9, i.e. two hands 901 are placed apart on the touch surface to indicate a piling gesture).
Regarding Claim 7, (Original)
Hongo teaches
the electronic device of claim 1.
However, Hongo does not explicitly teach
the processor further programmed to process the first gesture data to detect an arrangement of finger locations of a first hand.
In the same field of endeavor, Wu et al. teach
the processor further programmed to process the first gesture data ([0018], FIG. 1, i.e. perform computer operations) to detect an arrangement of finger locations of a first hand ([0036], FIG. 7-8, i.e. gesture is identified by touching a document 800 with three or more fingers 701).
It would have been obvious to a person having ordinary skill in the art at the time the invention was made to combine Hongo's device, which detects gestures and eye gaze to perform operations, with Wu et al.'s teaching of detecting a gesture of one hand, in order to effectively and easily interact with the user interface (Wu et al., [0036]).
Regarding Claim 8, (Original) the electronic device of claim 7,
Wu et al. teach
the processor further programmed to process the first gesture data ([0018], FIG. 1, i.e. perform computer operations) to detect movement of the finger locations relative to each other ([0037], FIG. 7-8, i.e. shrinking of the spread of the fingers).
Regarding Claim 9, (Original) the electronic device of claim 8,
Wu et al. teach
the processor further programmed to:
process the first gesture data (i.e. please see above citation(s)) to detect movement of the finger locations toward each other ([0037], FIG. 7-8, i.e. shrinking of the spread of the fingers); and
in accordance with the detection of movement of the finger locations toward each other (i.e. please see above citation(s)), determine that the gesture is a pinch gesture ([0037], FIG. 7-8, i.e. the size of the bounding box can be changed … shrinking).
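The pinch determination mapped above — classifying the gesture as a pinch when the finger locations move toward each other, i.e. when the spread of the fingers shrinks — can be sketched as below. This is an illustrative reconstruction, not code from Wu et al.; the bounding-box measure and the shrink threshold are assumptions.

```python
# Hypothetical pinch classifier: compare the finger spread (bounding-box
# diagonal) between two successive frames of touch points.
def finger_spread(points):
    """Bounding-box diagonal of the touched finger locations (x, y)."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return ((max(xs) - min(xs)) ** 2 + (max(ys) - min(ys)) ** 2) ** 0.5

def is_pinch(prev_points, curr_points, shrink_ratio=0.8):
    """Classify as a pinch when the spread of the fingers shrinks,
    i.e. the finger locations move toward each other."""
    prev = finger_spread(prev_points)
    curr = finger_spread(curr_points)
    return prev > 0 and curr / prev <= shrink_ratio
```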
Regarding Claim 10, (Original)
Hongo teaches
the electronic device of claim 1.
However, Hongo does not explicitly teach
the processor further programmed to process the first gesture data to identify one or more fingers of a detected first hand.
In the same field of endeavor, Wu et al. teach
the processor further programmed to process ([0018], FIG. 1, i.e. perform computer operations) the first gesture data to identify one or more fingers of a detected first hand ([0034], FIG. 5, i.e. gesture is continued … with the other hand 503 using a finger).
It would have been obvious to a person having ordinary skill in the art at the time the invention was made to combine Hongo's device, which detects gestures and eye gaze to perform operations, with Wu et al.'s teaching of detecting a gesture of one hand, in order to effectively and easily interact with the user interface (Wu et al., [0034]).
Regarding Claim 14, (Original)
Hongo teaches
the method of claim 11.
However, Hongo does not explicitly teach
detecting the gesture comprises detecting movement of a first hand.
In the same field of endeavor, Wu et al. teach
detecting the gesture ([0018], FIG. 1, i.e. detect hand gestures) comprises detecting movement of a first hand ([0034], FIG. 5, i.e. gesture … with any two fingers 501).
It would have been obvious to a person having ordinary skill in the art at the time the invention was made to combine Hongo's method, which detects gestures and eye gaze to perform operations, with Wu et al.'s teaching of detecting a gesture of one hand, in order to effectively and easily interact with the user interface (Wu et al., [0034]).
Regarding Claim 15, (Original)
Hongo teaches
the method of claim 11.
However, Hongo does not explicitly teach
processing the first gesture data to detect the gesture being additionally performed by a second hand.
In the same field of endeavor, Wu et al. teach
processing ([0018], FIG. 1, i.e. perform computer operations) the first gesture data to detect the gesture being additionally performed by a second hand ([0034], FIG. 5, i.e. "writing" or "drawing" 502 with the other hand 503).
It would have been obvious to a person having ordinary skill in the art at the time the invention was made to combine Hongo's method, which detects gestures and eye gaze to perform operations, with Wu et al.'s teaching of detecting gestures of both hands, in order to effectively improve user/machine interaction and additionally provide more operations (Wu et al., [0034]).
Regarding Claim 16, (Original) the method of claim 15,
Wu et al. teach
further comprising detecting the gesture by detecting movement of a first hand and second hand relative to each other ([0039], FIG. 9, i.e. two hands 901 are placed apart on the touch surface to indicate a piling gesture).
Regarding Claim 17, (Original)
Hongo teaches
the method of claim 11.
However, Hongo does not explicitly teach
processing the first gesture data to detect an arrangement of finger locations of a first hand.
In the same field of endeavor, Wu et al. teach
processing the first gesture data ([0018], FIG. 1, i.e. perform computer operations) to detect an arrangement of finger locations of a first hand ([0036], FIG. 7-8, i.e. gesture is identified by touching a document 800 with three or more fingers 701).
It would have been obvious to a person having ordinary skill in the art at the time the invention was made to combine Hongo's method, which detects gestures and eye gaze to perform operations, with Wu et al.'s teaching of detecting a gesture of one hand, in order to effectively and easily interact with the user interface (Wu et al., [0036]).
Regarding Claim 18, (Original) the method of claim 17,
Wu et al. teach
further comprising processing the first gesture data ([0018], FIG. 1, i.e. perform computer operations) to detect movement of the finger locations relative to each other ([0037], FIG. 7-8, i.e. shrinking of the spread of the fingers).
Regarding Claim 19, (Original) the method of claim 18,
Wu et al. teach
further comprising:
processing the first gesture data (i.e. please see above citation(s)) to detect movement of the finger locations toward each other ([0037], FIG. 7-8, i.e. shrinking of the spread of the fingers); and
in accordance with the detection of movement of the finger locations toward each other (i.e. please see above citation(s)), determining that the gesture is a pinch gesture ([0037], FIG. 7-8, i.e. the size of the bounding box can be changed … shrinking).
Regarding Claim 20, (Original)
Hongo teaches
the method of claim 11.
However, Hongo does not explicitly teach
processing the first gesture data to identify one or more fingers of a detected first hand.
In the same field of endeavor, Wu et al. teach
processing ([0018], FIG. 1, i.e. perform computer operations) the first gesture data to identify one or more fingers of a detected first hand ([0034], FIG. 5, i.e. gesture is continued … with the other hand 503 using a finger).
It would have been obvious to a person having ordinary skill in the art at the time the invention was made to combine Hongo's method, which detects gestures and eye gaze to perform operations, with Wu et al.'s teaching of detecting a gesture of one hand, in order to effectively and easily interact with the user interface (Wu et al., [0034]).
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to VINH TANG LAM whose telephone number is (571) 270-3704. The examiner can normally be reached Monday to Friday 8:00 AM to 5:00 PM.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Nitin K Patel, can be reached at (571) 272-7677. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/VINH T LAM/Primary Examiner, Art Unit 2628