Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Arguments
Applicant's arguments filed 11/25/2025 have been fully considered but they are not persuasive. Applicant contends:
[image: media_image1.png]
Kryze teaches a remote control apparatus for communicating with a target device includes: a sensing portion for sensing points of user contact with the apparatus, user gestures, and an acceleration value of the apparatus; a transmitting device for sending signals representative of user commands to the target device; a controller; and a memory including instructions for configuring the controller to perform a self-orientation process based upon at least one of the acceleration value and the points of user contact to determine a forward direction of a plane of operation for defining the user gestures. An axis of the determined plane of operation substantially intersects the apparatus at any angle. (Abstract)
Kryze goes on to state, "[a]s shown in FIGS. 10C-10D, the user slides a finger forward or backward and left or right to move the screen display of the target device." [0103]
[image: media_image2.png]
As shown in Figs. 10C and 10D above, the "vertical slide" gesture moves the screen in the vertical direction, and the second gesture, the "horizontal slide," moves the screen in the horizontal direction. Examiner notes the device of Fig. 10 is self-orienting; therefore, a vertical swipe moves the display in the vertical direction, and if the device is then rotated 90 degrees, a swipe in the original vertical direction is now a horizontal swipe with respect to the device's new orientation, yet a vertical-direction swipe gesture is still performed because the device is self-orienting.
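For illustration only (not part of the record), the self-orienting behavior described above can be sketched as a rotation compensation applied to the raw touch displacement; the function names and angle convention below are assumptions, not Kryze's disclosure:

```python
import math

def normalize_gesture(dx, dy, device_rotation_deg):
    """Rotate a raw touch displacement into the self-oriented plane of
    operation, so that a 'vertical' swipe stays vertical regardless of
    how the remote is held (illustrative sketch only)."""
    theta = math.radians(-device_rotation_deg)
    nx = dx * math.cos(theta) - dy * math.sin(theta)
    ny = dx * math.sin(theta) + dy * math.cos(theta)
    return nx, ny

def classify(nx, ny):
    """Classify the compensated displacement as a vertical or
    horizontal slide gesture."""
    return "vertical" if abs(ny) >= abs(nx) else "horizontal"
```

Under this sketch, a swipe that reads as horizontal in the rotated device's raw coordinates is recovered as the vertical gesture the user intended once the self-orientation step is applied.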
Applicant contends:
[image: media_image3.png]
Examiner respectfully disagrees that these gestures are unrelated to determining the orientation in which the gestures themselves are performed. The pertinent claim language states, "determining, based on a second user input, second information indicating the gesture performed in a second orientation, wherein the second orientation is different than the first orientation;". Examiner understands the horizontal and vertical gestures to be based on, and dependent on, the orientation of the device.
Applicant further contends:
[image: media_image4.png]
Examiner respectfully disagrees for the reasons above.
[image: media_image5.png]
[image: media_image6.png]
Examiner notes the same action will be performed by the vertical swipe, but there are two different gestures with respect to the original screen orientation.
Furthermore, one cannot show nonobviousness by attacking references individually where the rejections are based on combinations of references. See In re Keller, 642 F.2d 413, 208 USPQ 871 (CCPA 1981); In re Merck & Co., 800 F.2d 1091, 231 USPQ 375 (Fed. Cir. 1986).
The test for obviousness is not whether the features of a secondary reference may be bodily incorporated into the structure of the primary reference; nor is it that the claimed invention must be expressly suggested in any one or all of the references. Rather, the test is what the combined teachings of the references would have suggested to those of ordinary skill in the art. See In re Keller, 642 F.2d 413, 208 USPQ 871 (CCPA 1981).
While Kryze fails to expressly teach rotating the device and performing the same gestures, as annotated above, Weiss does, and states that the gestures are orientation independent because they are based on the points of the touch, which allows for added functionality and more complex gestures than mere swiping:
[0097] Thus, embodiments of the present invention can accommodate multiple users without having to add additional hardware devices to a computer system. Gestures can also be orientation independent, permitting users to be in any position relative to a multi-touch input surface. Further, since gestures are based on points, gestures can be entered and processed more efficiently. Accordingly, using only fingers (or similar tools) on a touched based multi-touch input surface, multiple input point gestures facilitate quickly communicating different commands to a computer system.
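To illustrate the point-based, orientation-independent property Weiss describes, one possible sketch (illustrative only, not Weiss's implementation) characterizes a multi-touch contact pattern by the pairwise distances between its touch points, which do not change when the user approaches the input surface from a different side:

```python
import math
from itertools import combinations

def pairwise_distances(points):
    """Describe a multi-touch contact pattern by the sorted set of
    pairwise distances between touch points -- a representation that
    is invariant to the user's orientation around the surface
    (hypothetical sketch, not from Weiss)."""
    return sorted(math.dist(p, q) for p, q in combinations(points, 2))

# The same three-finger pattern, entered from two sides of the table:
upright = [(0, 0), (2, 0), (1, 2)]
rotated = [(0, 0), (0, 2), (-2, 1)]  # same triangle, rotated 90 degrees
```

Because both patterns produce identical distance sets, a recognizer keyed on these distances would match them to the same gesture regardless of where the user stands.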
If Applicant believes an interview would help expedite prosecution, Examiner is open to an interview.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1, 3, 5-7, 9, 11, 13, 15, 17 and 19 are rejected under 35 U.S.C. 103 as being unpatentable over Kryze (US 2012/0162073), hereinafter Kryze, in view of Weiss et al. (US 2008/0165132), hereinafter Weiss.
In regards to claim 1, Kryze teaches (Currently Amended) a method comprising (abstract):
determining, based on a first user input, first information indicating a gesture performed in a first orientation (fig.6 608);
[0073] After the initial estimation of the forward direction, if the user makes a user gesture on the operating plane (YES at 608), the controller 506 updates the estimation of the forward direction based upon the user gesture at 610. By estimating the forward direction of the operating plane, the operating plane will be automatically aligned with the orientation of the target device or the base device. Accordingly, a user can pick up the remote control device 500 without being preoccupied about the correct orientation. Another alternative is for the user to explicitly express the forward direction by making a non-symmetrical gesture or generally a wake-up gesture registered in advance into the memory. If no user gesture is made (NO at 608), the process ends.
causing, based on the gesture being performed in the first orientation, an action to be performed (fig. 10c and 10d finger slide) by a media device (fig. 10b a display which displays images/media [0103])
[0103] Referring to FIGS. 10A-10D, a third exemplary use scenario for the remote control device 500 will be discussed. As shown in FIG. 10A, the remote control device 500 is in an idle state and on a horizontal surface such as a table. As shown in FIG. 10B, the user picks up the remote control device 500 and automatically awakes it from the idle state to begin determining the forward direction of the plane of operation. As shown in FIGS. 10C-10D, the user slides a finger forward or backward and left or right to move the screen display of the target device.
determining, based on a second user input [0069-0078], second information indicating a gesture performed in a second orientation, wherein the second orientation is different than the first orientation [0065-0079];
[0077] At 618, the controller 506 again confirms or updates the estimation of the forward direction based upon the user gesture. Then, the routine returns to 608 to begin operations on another user gesture.
[image: media_image7.png]
causing, based on a gesture being performed in the first orientation or the second orientation, the action to be performed (fig. 6a (602-618); fig. 10c forward/backward) by a media device (fig. 10b, a display which displays images [0103])
[image: media_image8.png]
Kryze fails to expressly teach the second orientation is different than the first orientation.
However, Weiss teaches the second orientation is different than the first orientation. [0097] (fig. 1b multiple point inputs [0043-0045]) and media device [0005] Weiss
[image: media_image9.png]
It would have been obvious to one of ordinary skill in the art to modify the teachings of Kryze to further include wherein the second orientation is different than the first orientation, as taught by Weiss, in order to allow gestures to be input more efficiently and in an orientation-independent manner. [0097]
Therefore, Kryze in view of Weiss teaches causing, based on the gesture being performed in the second orientation ([0097] Weiss), the action to be performed (fig. 6a (602-618); fig. 10c forward/backward) by a media device (fig. 10b, a display which displays images [0103]) Kryze, and media device [0005] Weiss.
In regards to claim 9, Kryze teaches (Currently Amended) a non-transitory computer-readable storage medium storing computer-readable instructions that, when executed by a processor, cause (abstract):
determining, based on a first user input, first information indicating a gesture performed in a first orientation (fig.6 608);
causing, based on the gesture being performed in the first orientation, an action to be performed (fig. 10c/10d forward and side slide) by a media device (fig. 10b a display which displays images [0103])
determining, based on a second user input, second information indicating the gesture performed in a second orientation, wherein (fig. 10c/10d forward and side slide): [0069-0078]
Kryze fails to expressly teach the second orientation is different than the first orientation.
However, Weiss teaches the second orientation is different than the first orientation. [0097] (fig. 1b multiple point inputs [0043-0045])
It would have been obvious to one of ordinary skill in the art to modify the teachings of Kryze to further include wherein the second orientation is different than the first orientation, as taught by Weiss, in order to allow gestures to be input more efficiently and in an orientation-independent manner. [0097]
Therefore, Kryze in view of Weiss teaches causing, based on the gesture being performed in the second orientation ([0097] Weiss), the action to be performed (fig. 6a (602-618); fig. 10c forward/backward) Kryze by a media device (fig. 10b, a display which displays images [0103]) Kryze, and media device [0005] Weiss.
In regards to claim 15, Kryze teaches (Currently Amended) a device comprising (abstract):
one or more processors; and memory storing instructions that, when executed by the one or more processors, cause the device to (fig. 5 (500, 506, 504)):
determine, based on a first user input, first information indicating a gesture performed in a first orientation [0055-0066] (fig. 6a (608-618));
[image: media_image10.png]
cause, based on the gesture being performed in the first orientation, an action to be performed (fig. 10c input) by a media device (fig. 10b, a display which displays images [0103]);
determine, based on a second user input [0069-0078], second information indicating the gesture performed in a second orientation (fig. 6a (618)),
Kryze fails to expressly teach: wherein the second orientation is different than the first orientation;
However, Weiss teaches wherein the second orientation is different than the first orientation. [0097] (fig. 1b multiple point inputs [0043-0045])
[0097] Thus, embodiments of the present invention can accommodate multiple users without having to add additional hardware devices to a computer system. Gestures can also be orientation independent, permitting users to be in any position relative to a multi-touch input surface. Further, since gestures are based on points, gestures can be entered and processed more efficiently. Accordingly, using only fingers (or similar tools) on a touched based multi-touch input surface, multiple input point gestures facilitate quickly communicating different commands to a computer system.
It would have been obvious to one of ordinary skill in the art to modify the teachings of Kryze to further include wherein the second orientation is different than the first orientation, as taught by Weiss, in order to allow gestures to be input more efficiently and in an orientation-independent manner. [0097]
Therefore, Kryze in view of Weiss teaches
cause, based on the gesture being performed in the first orientation or the second orientation ([0097] Weiss), the action to be performed (fig. 6a (602-618); fig. 10c forward/backward) Kryze by a media device (fig. 10b, a display which displays images [0103]) Kryze, and media device [0005] Weiss.
In regards to claim 3, Kryze in view of Weiss teaches (Currently Amended) the method of claim 1, wherein the gesture comprises a swipe gesture associated with the action to be performed (fig. 10c/10d swipe up and down). "As shown in FIGS. 10C-10D, the user slides a finger forward or backward and left or right to move the screen display of the target device." [0103] Kryze
In regards to claim 5, Kryze teaches (Currently Amended) the method of claim 1, further comprising: determining, based on a third user input, third information indicating a second gesture; and determining, based on the second gesture, a second action to be performed (the fig. 10c up/down slide, performed in the first and second orientations, is the first gesture, and the second gesture is the fig. 10d side swipe) Kryze.
In regards to claim 6, Kryze teaches (Currently Amended) the method of claim 5, wherein the second gesture is performed in an orientation associated with the second action to be performed (fig. 10d orientation of forward direction determines gesture) Kryze.
In regards to claim 7, Kryze in view of Weiss teaches (Previously Presented) the method of claim 1, wherein the gesture comprises at least one of: a swipe gesture (fig. 10c/10d swipe up and down) Kryze, a selection gesture, or a J-shaped gesture. Examiner notes the "or" operator requires only one element for the prior art to read on the claims.
In regards to claim 11, Kryze in view of Weiss teaches (Currently Amended) the non-transitory computer-readable storage medium of claim 9, wherein the gesture comprises a swipe gesture associated with the action to be performed (fig. 10c/10d swipe up and down [0103]) Kryze.
In regards to claim 13, Kryze in view of Weiss teaches the non-transitory computer-readable storage medium of claim 9, wherein the gesture comprises at least one of: a swipe gesture (fig. 10c/10d swipe up and down [0103]) Kryze, a selection gesture, or a J-shaped gesture.
In regards to claim 17, Kryze in view of Weiss teaches (Currently Amended) the device of claim 15, wherein the gesture comprises a swipe gesture associated with the action to be performed (fig. 10c/10d swipe up and down) Kryze.
In regards to claim 19, Kryze in view of Weiss teaches (Previously Presented) the device of claim 15, wherein the gesture comprises at least one of: a swipe gesture (fig. 10c/10d swipe up and down) Kryze, a selection gesture (fig. 7f selection tap) Kryze, or a J-shaped gesture.
Claims 8, 14, 20 and 24-26 are rejected under 35 U.S.C. 103 as being unpatentable over Kryze in view of Weiss, and further in view of Hope et al. (US 2009/0153288), hereinafter Hope.
In regards to claim 8, Kryze in view of Weiss fails to teach (Currently Amended) the method of claim 7, wherein the swipe gesture causes the action to be performed to comprise: causing a greater degree of movement through content on a media device than the selection gesture or the J-shaped gesture.
However, Hope teaches wherein the swipe gesture causes the action to be performed to comprise: causing a greater degree of movement through content on a media device than the selection gesture or the J-shaped gesture. [0099-0114] (Fig. 9 (99)) (fig. 10 (100)) (fig. 11 (112 vs. 108)) Hope.
Handheld electronic devices are provided that have remote control functionality and gesture recognition features. The handheld electronic device may have remote control functionality in addition to cellular telephone, music player, or handheld computer functionality. The handheld electronic devices may have a touch sensitive display screen. The handheld electronic devices may recognize gestures performed by a user on the touch sensitive display screen. The handheld electronic devices may generate remote control signals from gestures that the handheld electronic device may recognize. A media system may receive the remote control signals and may take appropriate action. The touch sensitive display screen may be used to present the user with information about the media system such as the current volume.(abstract)
It would have been obvious to one of ordinary skill in the art to modify the teachings of Kryze and Weiss to further include wherein the swipe gesture causes the action to be performed to comprise: causing a greater degree of movement through content on a media device than the selection gesture or the J-shaped gesture, as taught by Hope, in order to provide an appropriate remote control signal for a media system. [0005-0009]
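As a purely illustrative sketch of the distinction relied upon (the gesture names and step sizes below are hypothetical, not from Hope), a gesture-to-movement mapping might assign a swipe a larger traversal step through content than a selection or J-shaped gesture:

```python
# Hypothetical step sizes, chosen only to illustrate that a swipe
# moves through content by a greater degree than the other gestures.
MOVEMENT_STEPS = {
    "swipe": 10,      # coarse movement through a content list
    "j_shaped": 3,    # finer repositioning
    "selection": 0,   # selects the focused item; no movement
}

def apply_gesture(position, gesture):
    """Advance the current content position by the step size
    associated with the recognized gesture."""
    return position + MOVEMENT_STEPS[gesture]
```

In such a mapping, the swipe gesture traverses more content per input than either of the other gesture types, mirroring the claimed "greater degree of movement."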
In regards to claim 14, Kryze in view of Weiss and Hope, see rationale of claim 8, teaches the non-transitory computer-readable storage medium of claim 13, wherein the swipe gesture causes the action to be performed to comprise: causing a greater degree of movement through content on a media device than the selection gesture or the J-shaped gesture. [0099-0114] (Fig. 9 (99)) (fig. 10 (100)) (fig. 11 (112 vs. 108)) Hope.
In regards to claim 20, Kryze and Weiss fail to teach (Currently Amended) the device of claim 19, wherein the swipe gesture causes the action to be performed to comprise:
causing a greater degree of movement through content on a media device than the selection gesture or the J-shaped gesture.
However, Hope teaches wherein the swipe gesture causes the action to be performed to comprise [0099-0114]:
causing a greater degree of movement through content on a media device than the selection gesture (Fig. 9 (99)) (fig. 10 (100)) (fig. 11 (112 vs. 108)) Hope or the J-shaped gesture.
[image: media_image11.png]
[image: media_image12.png]
It would have been obvious to one of ordinary skill in the art to modify the teachings of Kryze and Weiss to further include wherein the swipe gesture causes the action to be performed to comprise: causing a greater degree of movement through content on a media device than the selection gesture or the J-shaped gesture, as taught by Hope, in order to provide an appropriate remote control signal for a media system. [0005-0009]
In regards to claim 24, Kryze in view of Weiss and Hope, see rationale of claim 8, teaches (New) the device of claim 15, wherein the action to be performed comprises: causing access to a user profile, causing movement through content on a media device, fast-forwarding through content on a media device (Fig. 9 (99)) (fig. 10 (100)) (fig. 11 (112 vs. 108)) Hope, skipping content on a media device [0090, 0092, 0108-0109] Hope, or selecting an icon on a media device.
[image: media_image13.png]
In regards to claim 25, Kryze in view of Weiss and Hope, see rationale of claim 8, teaches the method of claim 1, wherein the action to be performed comprises: causing access to a user profile, causing movement through content on a media device, fast-forwarding through content on a media device (Fig. 9 (99)) (fig. 10 (100)) (fig. 11 (112 vs. 108)) Hope, skipping content on a media device [0090, 0092, 0108-0109] Hope, or selecting an icon on a media device.
In regards to claim 26, Kryze in view of Weiss and Hope, see rationale of claim 8, teaches the non-transitory computer-readable storage medium of claim 9, wherein the action to be performed comprises: causing access to a user profile, causing movement through content on a media device, fast-forwarding through content on a media device (Fig. 9 (99)) (fig. 10 (100)) (fig. 11 (112 vs. 108)) Hope, skipping content on a media device [0090, 0092, 0108-0109] Hope, or selecting an icon on a media device.
Conclusion
THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to GRANT SITTA whose telephone number is (571)270-1542. The examiner can normally be reached M-F 7:30-4:00.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Patrick Edouard can be reached at 571-272-6084. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/GRANT SITTA/Primary Examiner, Art Unit 2622