Detailed Action
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Amendment
This Final Office action is responsive to the communication filed under 37 C.F.R. § 1.111 on November 24, 2025 (hereafter “Response”). The amendments to the claims are acknowledged and have been entered.
Claims 1–3, 5–9, and 11–20 are now amended.
Claims 1–20 are pending in the application.
Response to Arguments
The objections to the title and to claim 8 are hereby withdrawn in response to the amendments thereto.
The rejection under 35 U.S.C. § 112(b) is hereby withdrawn in view of the Applicant’s amendment preventing invocation of 35 U.S.C. § 112(f).
Claim(s) 1–14, 17, and 20 stand rejected under 35 U.S.C. § 102(a)(1) as being anticipated by U.S. Patent Application Publication No. 2009/0217211 A1 (“Hildreth”). The Applicant’s remarks have been considered in light of the amendment, but do not persuade the Examiner to withdraw the rejection.
The Applicant contends "that Hildreth at paragraph [0090] requires a separate, specific gesture to terminate the GUI" (Response 9). While the Examiner agrees that paragraph 90 does disclose a separate gesture to terminate the GUI, paragraph 80 further discloses that the GUI 114 may be disabled (and thus removed from the screen, per the definition in ¶ 83) simply in response to selecting one of the menu items in the GUI 114. Accordingly, claims 1–14, 17, and 20 stand rejected over Hildreth.
The respective 35 U.S.C. § 102 rejections of the claims based on Van Der Westhuizen and Majid, alone, are hereby withdrawn, in favor of a new ground of rejection under 35 U.S.C. § 103 based on the combination of Van Der Westhuizen and Majid. The Applicant contends that Van Der Westhuizen does not address the termination of GUIs (Response 9), but, in fact, Van Der Westhuizen does address this feature in the FIG. 4 embodiment. Accordingly, the claims are now rejected over the FIG. 4 embodiment in Van Der Westhuizen (not previously discussed), but as combined with Majid to show the obviousness of the newly added “superimposing” limitation.
In addition to the above new ground of rejection, the change in claim scope also necessitates additional grounds of rejection over Tsurumi, as discussed herein.
Since all of the claims are rejected, the Applicant's request for a notice of allowance is respectfully denied.
Claim Objections
As a result of the amendment, the singular verb "executes" on line 9 of claim 1 (as printed in the November 24, 2025 amendment) no longer agrees with "circuitry configured to"; compare "identify" earlier on line 8, which correctly uses the base form.
Appropriate correction is required.
Claim Rejections – 35 U.S.C. § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. § 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.
I. Tsurumi discloses claims 1–10, 12, 13, 17, and 20.
Claim(s) 1–10, 12, 13, 17, and 20 are rejected under 35 U.S.C. § 102(a)(1) as being anticipated by U.S. Patent Application Publication No. 2009/0073117 A1 (“Tsurumi”).
Claim 1
Tsurumi discloses:
An information processing system comprising circuitry configured to:
As shown in FIG. 2, Tsurumi discloses an image processing apparatus 11 corresponding to the claimed information processing system, comprising several functional elements 12–13 and 31–39, each of which may be embodied as a program that causes a computer to execute the functionality described for that unit. See Tsurumi ¶¶ 16 and 70–71.
detect a first gesture of a user;
A “trigger detection unit 33 detects, based on [a] captured image” that was captured by camera 12 and stored in frame buffer 32, “a trigger indicating that the user has waved their hand to the camera 12.” Tsurumi ¶ 74; see also Tsurumi ¶ 95 (describing the same in the example of FIGS. 4–5).
cause a graphic user interface (GUI) responsive to detection of the first gesture to be displayed that is superimposed on currently displayed video content while the video content continues to be displayed;
"The trigger detection unit 33 supplies area information indicating an area where the waving of the hand of the user has been detected within the captured image to [a] feature point extraction unit 34," Tsurumi ¶ 74, initiating a cascade of steps that culminates in a "combining unit 39" combining "content obtained from a tuner or via a network" together with the "UI images" that are generated by a UI drawing unit 36, the UI images being responsive to the trigger. Tsurumi ¶¶ 86–88.
For example, “as shown in FIG. 5, on the display unit 13, a mirrored image of a captured image is displayed in a translucent manner on an image-viewing screen on which an image of the television program of channel 1 is being currently displayed, and the feature-point pointer 14 and the UIs 15 are displayed on the basis of the feature point.” Tsurumi ¶ 96.
identify an operation presented on the GUI on a basis of a second gesture made following the first gesture and executes a control command corresponding to the identified operation;
“Then, as shown in FIG. 9, the user moves their right hand grabbing the feature-point pointer 14 to the right with respect to the camera 12. Then, as shown in FIG. 10, the user moves their right hand grabbing the feature-point pointer 14 downward with respect to the camera 12 so as to move the feature-point pointer 14 outside the circle defined by the UIs 15. Thus, a command for terminating the interface display process is issued.” Tsurumi ¶ 100.
and terminate the GUI when the second gesture ends.
“As a result, as shown in FIG. 11, the captured image, the feature-point pointer 14, and the UIs 15 are deleted from the display unit 13. That is, only the image of the target, i.e., the television program of channel 3, is displayed on the image-viewing screen of the display unit 13.” Tsurumi ¶ 101.
Claim 2
Tsurumi discloses the information processing system according to claim 1, wherein the circuitry is further configured to
cause the GUI in which a plurality of command icons corresponding to operation content of a device is arranged to be displayed.
“The UI drawing unit 36 determines, based on the extracted-position information supplied from the feature point extraction unit 34, an arrangement of UIs so that the UIs can be arranged around a feature point. The UI drawing unit 36 supplies UI information indicating the arrangement of UIs and the UI images stored therein in advance to the combining unit 39,” Tsurumi ¶ 82, which then displays the “UIs” on the display unit 13 as described in the rejection of claim 1 above. Importantly, the term “UIs” in this context refers to the “UIs 15-1 to 15-7” shown throughout the figures. Tsurumi ¶ 69. For example, “as shown in FIG. 1, a feature-point pointer 14 indicating a feature point of the captured image and user interfaces (UIs) 15-1 to 15-7 to be operated to select each channel are displayed over an image of the television program of channel 1 on the display unit 13.” Tsurumi ¶ 65.
Claim 3
Tsurumi discloses the information processing system according to claim 2, wherein
the circuitry is further configured to cause the GUI that includes a first command icon, which is the command icon arranged at a position in a first direction with a reference position as a center, and a second command icon, which is the command icon arranged at a position in a second direction opposite to the first direction, to be displayed,
“The UIs 15-1 to 15-7 are arranged along a circumference of a predetermined circle surrounding the original feature-point pointer 14.” Tsurumi ¶ 65.
and the circuitry is further configured to accept the action toward the first direction or the action toward the second direction as the second gesture.
“The image processing apparatus 11 detects the position of the feature point indicated by the feature-point pointer 14 grabbed by the user from the image captured using the camera 12, and a command corresponding to one of the UIs 15-1 to 15-7 that is located at the position of the feature point is issued.” Tsurumi ¶ 67.
Claim 4
Tsurumi discloses the information processing system according to claim 3,
wherein the first command icon and the second command icon are arranged linearly.
As shown in FIG. 1 (among others), there are several examples of pairs of "UIs" that are arranged linearly, including UI 15-1 with UI 15-5, UI 15-2 with UI 15-6, and UI 15-3 with UI 15-7. Tsurumi FIG. 1.
Claim 5
Tsurumi discloses the information processing system according to claim 2, wherein the circuitry is further configured to
cause a boundary of a region assigned to the operation indicated by each of the command icons to be displayed on the GUI.
“The UIs 15-1 to 15-7 are arranged along a circumference of a predetermined circle surrounding the original feature-point pointer 14.” Tsurumi ¶ 65.
Claim 6
Tsurumi discloses the information processing system according to claim 2, wherein the circuitry is further configured to
identify the operation presented on the GUI in response to an action of moving a hand in a predetermined direction being performed as the second gesture following the first gesture made using the hand.
“Then, as shown in FIG. 9, the user moves their right hand grabbing the feature-point pointer 14 to the right with respect to the camera 12. Then, as shown in FIG. 10, the user moves their right hand grabbing the feature-point pointer 14 downward with respect to the camera 12 so as to move the feature-point pointer 14 outside the circle defined by the UIs 15. Thus, a command for terminating the interface display process is issued.” Tsurumi ¶ 100.
Claim 7
Tsurumi discloses the information processing system according to claim 6, wherein the circuitry is further configured to
identify the operation corresponding to the command icon arranged in the same direction as the predetermined direction.
“Then, as shown in FIG. 9, the user moves their right hand grabbing the feature-point pointer 14 to the right with respect to the camera 12. Then, as shown in FIG. 10, the user moves their right hand grabbing the feature-point pointer 14 downward with respect to the camera 12 so as to move the feature-point pointer 14 outside the circle defined by the UIs 15. Thus, a command for terminating the interface display process is issued.” Tsurumi ¶ 100.
Claim 8
Tsurumi discloses the information processing system according to claim 6,
wherein the first gesture includes a different finger movement from another finger movement performed as the second gesture.
“As shown in FIG. 4, first, a user waves their hand to the camera 12 during the watching of, for example, a television program on channel 1” as the trigger, Tsurumi ¶ 95, in contrast to the moving and grasping gesture discussed earlier.
Claim 9
Tsurumi discloses the information processing system according to claim 6, wherein the circuitry is further configured to
cause an icon that represents the first gesture to move in the same direction as the predetermined direction in response to the second gesture being made.
“Then, for example, the user moves their right hand grabbing the feature-point pointer 14 to the left with respect to the camera 12 in a manner shown in FIG. 8, while viewing the feature-point pointer 14 and UIs 15 being currently displayed on the display unit 13, so that the feature-point pointer 14 can be superimposed on the UI 15-2 on the image-viewing screen of the display unit 13.” Tsurumi ¶ 98.
Claim 10
Tsurumi discloses the information processing system according to claim 6, wherein the circuitry is further configured to
present the direction in which the second gesture is made by an image that indicates a track of movement of an icon that represents the first gesture or by an image that indicates the predetermined direction.
“As the user moves the hand grabbing the feature-point pointer 14, an optical flow indicating the motion of the hand is calculated as the optical flow of the feature point. As a result, the user can use the hand grabbing the feature-point pointer 14 as a pointing device.” Tsurumi ¶ 91. “Accordingly, the image processing apparatus 11 is configured to, instead of recognizing a user's hand based on the shape thereof or the like to detect a motion of the hand, recognize a feature point that is indicated by the feature-point pointer 14 grabbed with a user's hand and that moves along with the user's hand, and to detect a motion of the feature point as a motion of the user's hand.” Tsurumi ¶ 92.
Claim 12
Tsurumi discloses the information processing system according to claim 2, wherein the circuitry is further configured to
switch a type of the command icons included in the GUI depending on a state of the device to be controlled.
“In FIGS. 4 to 11, the target to be viewed is a television program by way of example, and the use of a pointing device to change a channel of a television program to be viewed has been described. However, the target to be viewed is not limited to a television program,” Tsurumi ¶ 103, and instead, “as shown in FIGS. 12 to 20, the target to be viewed may be a Web page.” Tsurumi ¶ 104. In the case of the web page, “UIs 91 to 93 are displayed on the basis of the feature point,” in a menu displayed along the bottom of the display unit 13. Tsurumi ¶ 105 and FIG. 13.
Claim 13
Tsurumi discloses the information processing system according to claim 1, wherein the circuitry is further configured to
terminate the display of the GUI in a case where an action different from the second gesture is performed during the display of the GUI.
In addition to Tsurumi’s circuitry being configured to terminate the interface in the manner discussed in the rejection of claim 1 (e.g., by moving the feature-point pointer 14 outside the circle), Tsurumi’s circuitry is also configured, in the case of the web browser interface shown in FIG. 13, “to terminate the interface display process” in response to an operation moving the feature-point pointer 83 to UI 93. Tsurumi ¶ 105.
Claim 17
Tsurumi discloses the information processing system according to claim 1, wherein the circuitry is further configured to
present the second gesture being recognized.
“[A]s shown in FIG. 10, the user moves their right hand grabbing the feature-point pointer 14 downward with respect to the camera 12 so as to move the feature-point pointer 14 outside the circle defined by the UIs 15.” Tsurumi ¶ 100.
Claim 20
Claim 20 is directed to the same method that the apparatus of claim 1 performs as part of its normal operation. Since Tsurumi discloses the apparatus for the reasons given in the rejection of claim 1, Tsurumi necessarily discloses the method that the apparatus performs. See MPEP § 2112.02. Additionally, Tsurumi explicitly discloses that method as a process. See Tsurumi FIG. 21.
II. Hildreth discloses claims 1–14, 17, and 20.
Claim(s) 1–14, 17, and 20 are rejected under 35 U.S.C. § 102(a)(1) as being anticipated by U.S. Patent Application Publication No. 2009/0217211 A1 (“Hildreth”).
Claim 1
Hildreth discloses:
An information processing system comprising circuitry configured to:
“FIG. 2 is a block diagram of a device 200 used to implement enhanced input.” Hildreth ¶ 32.
detect a first gesture of a user;
There are two elements of Hildreth's device 200, each of which independently performs the claimed detection of a first gesture, depending on the interpreted scope of the limitation.
The first element is camera 204, which is “configured to capture images of an object or user interacting with an application.” Hildreth ¶ 35. The second element is the section of computer programming responsible for analyzing the captured images in order to detect a user’s gesture depicted therein. See Hildreth ¶¶ 42 and 94.
cause a graphic user interface (GUI) responsive to detection of the first gesture to be displayed
Device 200 further includes a “user interface 201” that “may provide a mechanism for both input and output, allowing a user to manipulate the device or for the device to produce the effects of the user's manipulation.” Hildreth ¶ 32. “The device 200 may utilize any type of user interface 201, such as a graphical user interface (GUI).” Hildreth ¶ 32.
One such GUI is “a control that further includes interaction elements disposed radially in relation to the central region,” which device 200 invokes “when the enabling gesture is performed and recognized.” Hildreth ¶ 57. Specifically, as shown in FIG. 1, “[a]n enhanced control 114 is displayed.” Hildreth ¶ 28.
that is superimposed on currently displayed video content while the video content continues to be displayed;
Enhanced control 114 is not displayed on its own, but rather "is displayed in a user interface 112," Hildreth ¶ 28, while "the user interface 112 displays other objects than the enhanced control." Hildreth ¶ 30. Enhanced control 114 is displayed in the user interface 112 in a manner such that, prior to any corrective repositioning, "other objects than the enhanced control or regions of the user interface 112 become obfuscated" by the enhanced control 114. Hildreth ¶ 30.
identify an operation presented on the GUI on a basis of a second gesture made following the first gesture and executes a control command corresponding to the identified operation;
Device 200 further includes programming for performing the function of step S305 shown in FIG. 3. Exactly as claimed, “an interaction with the control occurs” during this step, “[b]ased on the recognized user’s gesture.” Hildreth ¶ 78.
and terminate the GUI when the second gesture ends.
“Similar to a mouse event, when the representation overlaps or selects a particular interaction element,” in addition to invoking the underlying function of the particular interaction element, “the control may become disabled.” Hildreth ¶ 80. “Disabled,” in this context, means the control is removed from the screen. See Hildreth ¶ 83.
Claim 2
Hildreth discloses the information processing system according to claim 1,
wherein the circuitry is further configured to cause the GUI in which a plurality of command icons corresponding to operation content of a device is arranged to be displayed.
“[A] representation of the user may be displayed in a central region of a control that further includes interaction elements disposed radially in relation to the central region, such as when the enabling gesture is performed and recognized (S304).” Hildreth ¶ 57.
Claim 3
Hildreth discloses the information processing system according to claim 2,
wherein the circuitry is further configured to cause the GUI that includes a first command icon, which is the command icon arranged at a position in a first direction with a reference position as a center, and a second command icon, which is the command icon arranged at a position in a second direction opposite to the first direction, to be displayed,
"FIG. 6 illustrates several example shapes and configurations of the enhanced control. Control 601 is a circular control including eight icons 602a to 602h emanating, hub-and-spoke fashion, from a central region 604." Hildreth ¶ 61. It should be understood from FIG. 6 (and explicitly from the "hub-and-spoke" disclosure in the text) that, for any given icon on the wheel mapped to the claimed first command icon, the claimed second command icon reads on whichever icon is opposite that first icon.
and the circuitry is further configured to accept the action toward the first direction or the action toward the second direction as the second gesture.
“Based on the recognized user's gesture, an interaction with the control occurs (S305). The recognized user's gesture may cause the representation to move away from the center region of the control in a direction and magnitude based on the direction and magnitude of the users motion in free-space, causing the representation to overlap one or more interaction elements.” Hildreth ¶ 78.
Claim 4
Hildreth discloses the information processing system according to claim 3,
wherein the first command icon and the second command icon are arranged linearly.
“Although the enhanced control 114 is illustrated with a two-dimensional wheel with a hub-and-spoke appearance, in other implementations other shapes can be used. For instance, the enhanced control 114 may be linear.” Hildreth ¶ 29.
That said, claim 4 also reads on the main embodiment, in which a "circular control including eight icons 602a to 602h emanating, hub-and-spoke fashion, from a central region 604" is displayed, Hildreth ¶ 61, because any two icons on opposite spokes are necessarily arranged linearly. For example, in FIG. 6, the mail icon 602a and the Vol- icon lie on a single line through the central region, extending in opposite directions from it.
Claim 5
Hildreth discloses the information processing system according to claim 2,
wherein the circuitry is further configured to cause a boundary of a region assigned to the operation indicated by each of the command icons to be displayed on the GUI.
“Control 611 is a square-shaped control including eight icons 612a to 612h located in block-shaped interaction regions 614a to 614h around a center region that is generally aligned with the center of the user interface 615.” Hildreth ¶ 62 (referring to FIG. 6).
Claim 6
Hildreth discloses the information processing system according to claim 2,
wherein the circuitry is further configured to identify the operation presented on the GUI in response to an action of moving a hand in a predetermined direction being performed as the second gesture following the first gesture made using the hand.
"Based on the recognized user's gesture, an interaction with the control occurs (S305). The recognized user's gesture may cause the representation to move away from the center region of the control in a direction and magnitude based on the direction and magnitude of the users motion in free-space, causing the representation to overlap one or more interaction elements." Hildreth ¶ 78. Note that the gesture recognized in S305 is a different gesture, performed subsequent to the gesture discussed in the rejection of claim 1 for invoking the menu.
Claim 7
Hildreth discloses the information processing system according to claim 6,
wherein the circuitry is further configured to identify the operation corresponding to the command icon arranged in the same direction as the predetermined direction.
“FIG. 8 illustrates an exemplary gesture and concomitant control interaction. Specifically, a user 801 gesticulates his arm from a first position 802 to a second position 803, thereby causing representation 805 in user interface 806 to move right from the center position 807 and to highlight icon 809 of an interaction element disposed to the right of the center region.” Hildreth ¶ 81.
Claim 8
Hildreth discloses the information processing system according to claim 6,
wherein the first gesture includes a different finger movement from another finger movement performed as the second gesture.
For the first gesture, “[t]he media hub 103 recognizes the palm-forward, finger extended pose of the user's right hand 106 as signifying that a gesture-based control input is forthcoming.” Hildreth ¶ 26; see also Hildreth ¶¶ 50 and 56.
For the second gesture, instead of the finger extended pose, the user makes a “pointing finger hand pose 904,” and points in the direction of the desired icon. Hildreth ¶¶ 82–83.
Claim 9
Hildreth discloses the information processing system according to claim 6,
wherein the circuitry is further configured to cause an icon that represents the first gesture to move in the same direction as the predetermined direction in response to the second gesture being made.
As shown in FIG. 8, “a user 801 gesticulates his arm from a first position 802 to a second position 803, thereby causing representation 805 in user interface 806 to move right from the center position 807 and to highlight icon 809 of an interaction element disposed to the right of the center region.” Hildreth ¶ 81.
Claim 10
Hildreth discloses the information processing system according to claim 6,
wherein the circuitry is further configured to present the direction in which the second gesture is made by an image that indicates a track of movement of an icon that represents the first gesture or by an image that indicates the predetermined direction.
As shown in FIG. 8, “a user 801 gesticulates his arm from a first position 802 to a second position 803, thereby causing representation 805 in user interface 806 to move right from the center position 807 and to highlight icon 809 of an interaction element disposed to the right of the center region.” Hildreth ¶ 81.
Claim 11
Hildreth discloses the information processing system according to claim 6,
wherein the circuitry is further configured to repeatedly execute the control command in a case where a state in which the hand is moved in the predetermined direction is maintained.
“As illustrated in FIG. 12, to effect a mouse-drag event, the user may keep their hand in the ‘mouse down’ pose while moving their hand around the tracking region to move the mouse accordingly. Specifically, a user 1201 moves his arm down and to the left from a first position 1202 to a second position 1204 while holding the mouse-down, finger-pointed pose, to grab the desktop icon 1205 and move it toward the center of the user interface 1206.” Hildreth ¶ 88.
Claim 12
Hildreth discloses the information processing system according to claim 2,
wherein the circuitry is further configured to switch a type of the command icons included in the GUI depending on a state of the device to be controlled.
In one example, "the face of the user 101 is detected and recognized, identifying the user 101 as 'Bob.'" Hildreth ¶ 27. Thus, with device 200 (which is the device to be controlled, per the rejection of claim 1) in a state where "Bob" is the current recognized user, "interaction element 115c is associated with a user-specific photo album function," and "interaction element 115g is associated with a user-specific music function." Hildreth ¶ 28. That is, as shown in FIG. 1, interaction element 115c is displayed as "Bob's Photos," while interaction element 115g is displayed as "Bob's Music."
Claim 13
Hildreth discloses the information processing system according to claim 1,
wherein the circuitry is further configured to terminate the display of the GUI in a case where an action different from the second gesture is performed during the display of the GUI.
“In FIG. 13, for example, the user 1301 drops his arms from a controlling, first position 1302 to a collapsed or relaxed position adjacent to his torso 1305. Such a motion causes the control to disappear from the user interface 1306.” Hildreth ¶ 90.
Claim 14
Hildreth discloses the information processing system according to claim 1,
wherein the circuitry is further configured to switch a display position of the GUI depending on content of video on which the GUI is superimposed and displayed.
“[A]s the user interface 112 displays other objects than the enhanced control or regions of the user interface 112 become obfuscated, the enhanced control may dynamically reposition itself, change [its] shape, or change the number of interaction elements displayed.” Hildreth ¶ 30.
Claim 17
Hildreth discloses the information processing system according to claim 1,
wherein the circuitry is further configured to present the second gesture being recognized.
“[W]hen the representation overlaps or selects a particular interaction element, the control may become disabled [and] the underlying interaction element or icon may become highlighted.” Hildreth ¶ 80.
Claim 20
Claim 20 recites exactly the same method that the information processing system of claim 1 is programmed to perform as part of its normal operation. “Under the principles of inherency, if a prior art device, in its normal and usual operation, would necessarily perform the method claimed, then the method claimed will be considered to be anticipated by the prior art device.” MPEP § 2112.02. Therefore, claim 20 is rejected as anticipated by the same findings and rationale as provided above for claim 1.
Claim Rejections – 35 U.S.C. § 103
The following is a quotation of 35 U.S.C. § 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102 of this title, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned at the time any inventions covered therein were effectively filed absent any evidence to the contrary. Applicant is advised of the obligation under 37 C.F.R. § 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned at the time a later invention was effectively filed in order for the examiner to consider the applicability of 35 U.S.C. § 102(b)(2)(C) for any potential 35 U.S.C. § 102(a)(2) prior art against the later invention.
Claim(s) 1, 2, 15, 16, and 18–20 are rejected under 35 U.S.C. § 103 as being unpatentable over U.S. Patent Application Publication No. 2017/0108998 A1 (“Van Der Westhuizen”) in view of U.S. Patent Application Publication No. 2014/0362294 A1 (“Majid”).
Claim 1
Van Der Westhuizen teaches:
An information processing system comprising circuitry configured to:
As shown in FIGS. 4A to 4I, Van Der Westhuizen teaches a mobile device 401 corresponding to the claimed information processing system. Van Der Westhuizen ¶ 91.
detect a first gesture of a user;
“At a first stage (420), illustrated in FIG. 4B, the pointer (402) enters the control region (403),” Van Der Westhuizen ¶ 94, and “[a]t a next stage (430), the pointer (402) enters a predefined zone (422) in the control region (403).” Van Der Westhuizen ¶ 95.
cause a graphic user interface (GUI) responsive to detection of the first gesture to be displayed;
“Entering of the predefined zone (422) is a predefined interaction which initiates display of the categories (415, 416, 417, 418) on the display of the mobile phone (401). The spawning of the categories is illustrated in FIG. 4C. The predefined interaction may also be provided by the pointer simply entering the control region (403).” Van Der Westhuizen ¶ 95.
identify an operation presented on the GUI on a basis of a second gesture made following the first gesture
Next, the mobile device 401 detects when “the pointer (402) is moved away from the virtual button (408) along the navigation region (410), it crosses a second category threshold (412) at a next stage (450).” Van Der Westhuizen ¶ 99.
and executes a control command corresponding to the identified operation;
In response, the user interface “designates the second category threshold (412) as the active threshold, designates the first category threshold (411) as a deactivated threshold, and populates the display region (404) with interactive items associated with the second category threshold (412).” Van Der Westhuizen ¶ 99.
and terminate the GUI when the second gesture ends.
The user may end the above category designation gesture by moving pointer (402) past the terminal threshold (419), and in response “interactive items are removed from the display region (404) and a home screen is displayed once more.” Van Der Westhuizen ¶ 106.
Van Der Westhuizen does not appear to explicitly disclose whether the category interface is actually superimposed over the home screen, or if the category interface simply replaces it.
Majid, however, teaches an analogous information processing system with circuitry configured to:
detect a first gesture of a user;
“As shown, the input selection process is initiated by the receipt of an input selection command (operation 602), such as with the depression of a dedicated input button or selection of an input option in a graphical user interface.” Majid ¶ 51.
cause a graphic user interface (GUI) responsive to detection of the first gesture to be displayed that is superimposed on currently displayed video content while the video content continues to be displayed;
The input selection command at operation 602 causes operations 604–606 to obtain information relevant to displaying a user interface, see Majid ¶ 51, and then, “[t]he information obtained for the on-screen display is [] provided within a display of the on-screen interface (operation 608).” Majid ¶ 52. For example, in FIG. 4, “the video display 404 overlays the user interface over the active audiovisual source with a series of input source indications and previews.” Majid ¶ 34.
identify an operation presented on the GUI on a basis of a second gesture made following the first gesture
“With the on-screen display, the display device can receive an input selection command for a particular source (operation 610).” Majid ¶ 53.
and executes a control command corresponding to the identified operation.
“Upon receipt of the input selection command, the display device can then switch to the particular input source (operation 612).” Majid ¶ 53.
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to improve Van Der Westhuizen's user interface with Majid's technique of overlaying the user interface onto the underlying input source. One would have been motivated to make this combination so that the user could continue to consume content, rather than miss it, while navigating the menu.
Claim 2
Van Der Westhuizen and Majid teach the information processing system according to claim 1,
wherein the circuitry is further configured to cause the GUI in which a plurality of command icons corresponding to operation content of a device is arranged to be displayed.
“Entering of the predefined zone (422) is a predefined interaction which initiates display of the categories (415, 416, 417, 418) on the display of the mobile phone (401). The spawning of the categories is illustrated in FIG. 4C. The predefined interaction may also be provided by the pointer simply entering the control region (403).” Van Der Westhuizen ¶ 95.
Claim 15
Van Der Westhuizen and Majid teach the information processing system according to claim 2,
wherein the circuitry is further configured to change a size of the GUI depending on a distance to a part of the user used for the first gesture.
“Entering of the predefined zone (422) is a predefined interaction which initiates display of the categories (415, 416, 417, 418) on the display of the mobile phone (401),” Van Der Westhuizen ¶ 95, but “once the pointer (402) reaches the virtual button (408), the categories are fully displayed on the display (406).” Van Der Westhuizen ¶ 96 (emphasis added).
Claim 16
Van Der Westhuizen and Majid teach the information processing system according to claim 15,
wherein the circuitry is further configured to switch a type of the command icons or a number of the command icons included in the GUI depending on the distance to the part.
“Entering of the predefined zone (422) is a predefined interaction which initiates display of the categories (415, 416, 417, 418) on the display of the mobile phone (401),” Van Der Westhuizen ¶ 95, whereas, “once the pointer (402) reaches the virtual button (408), the categories are fully displayed on the display (406), with a first category (415) being displayed in the display region (404), as indicated in FIG. 4D.” Van Der Westhuizen ¶ 96. With this full display, “[e]ach category contains a subset of interactive items.” Van Der Westhuizen ¶ 97.
Claim 18
Van Der Westhuizen and Majid teach the information processing system according to claim 2, wherein
in a case where the command icon related to control of an external device that serves as a source of video is selected by the second gesture, the circuitry is further configured to cause an icon that represents the external device to be displayed together with the GUI.
FIG. 4 illustrates a case where operation 610 (the claimed second gesture) triggers operation 612 of switching to HDMI 2 as the current source of video. See Majid ¶ 35. In this state, “[n]ames of the respective input and output sources are provided in selectable indications arranged according to the position of the input/output source,” Majid ¶ 36, including the selectable indication 434A for HDMI 2. Majid ¶ 37 and FIG. 4. The display also provides “an active input source indication 406 (e.g., a label indicating that the current source is ‘HDMI 2’ with a known ‘Satellite’ label).” Majid ¶ 35.
To be clear, either indication 434A or active input source indication 406 individually and independently teaches the claimed icon.
Claim 19
Van Der Westhuizen and Majid teach the information processing system according to claim 18,
wherein the circuitry is further configured to cause a preview image of the video output from the external device or an instruction command for the external device to be displayed together with the GUI.
“In the example of FIG. 4, the video display 404 overlays the user interface over the active audiovisual source with a series of input source indications and previews.” Majid ¶ 34.
Claim 20
Claim 20 recites exactly the same method that the information processing system of claim 1 is programmed to perform as part of its normal operation. "Under the principles of inherency, if a prior art device, in its normal and usual operation, would necessarily perform the method claimed, then the method claimed will be considered to be anticipated by the prior art device." MPEP § 2112.02. The same principle applies in the obviousness context: a method that the combination of Van Der Westhuizen and Majid would necessarily perform in its normal operation is obvious for the same reasons as the combination itself. Therefore, claim 20 is rejected based on the same findings and rationale as provided above for claim 1.
Double Patenting
The nonstatutory double patenting rejection is based on a judicially created doctrine grounded in public policy (a policy reflected in the statute) so as to prevent the unjustified or improper timewise extension of the “right to exclude” granted by a patent and to prevent possible harassment by multiple assignees. A nonstatutory double patenting rejection is appropriate where the conflicting claims are not identical, but at least one examined application claim is not patentably distinct from the reference claim(s) because the examined application claim is either anticipated by, or would have been obvious over, the reference claim(s). See, e.g., In re Berg, 140 F.3d 1428, 46 USPQ2d 1226 (Fed. Cir. 1998); In re Goodman, 11 F.3d 1046, 29 USPQ2d 2010 (Fed. Cir. 1993); In re Longi, 759 F.2d 887, 225 USPQ 645 (Fed. Cir. 1985); In re Van Ornum, 686 F.2d 937, 214 USPQ 761 (CCPA 1982); In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970); In re Thorington, 418 F.2d 528, 163 USPQ 644 (CCPA 1969).
A timely filed terminal disclaimer in compliance with 37 CFR 1.321(c) or 1.321(d) may be used to overcome an actual or provisional rejection based on nonstatutory double patenting provided the reference application or patent either is shown to be commonly owned with the examined application, or claims an invention made as a result of activities undertaken within the scope of a joint research agreement. See MPEP § 717.02 for applications subject to examination under the first inventor to file provisions of the AIA as explained in MPEP § 2159. See MPEP § 2146 et seq. for applications not subject to examination under the first inventor to file provisions of the AIA. A terminal disclaimer must be signed in compliance with 37 CFR 1.321(b).
The filing of a terminal disclaimer by itself is not a complete reply to a nonstatutory double patenting (NSDP) rejection. A complete reply requires that the terminal disclaimer be accompanied by a reply requesting reconsideration of the prior Office action. Even where the NSDP rejection is provisional, the reply must be complete. See MPEP § 804, subsection I.B.1. For a reply to a non-final Office action, see 37 CFR 1.111(a). For a reply to a final Office action, see 37 CFR 1.113(c). A request for reconsideration, while not provided for in 37 CFR 1.113(c), may be filed after final for consideration. See MPEP §§ 706.07(e) and 714.13.
The USPTO Internet website contains terminal disclaimer forms which may be used. Please visit www.uspto.gov/patent/patents-forms. The actual filing date of the application in which the form is filed determines what form (e.g., PTO/SB/25, PTO/SB/26, PTO/AIA/25, or PTO/AIA/26) should be used. A web-based eTerminal Disclaimer may be filled out completely online using web-screens. An eTerminal Disclaimer that meets all requirements is auto-processed and approved immediately upon submission. For more information about eTerminal Disclaimers, refer to www.uspto.gov/patents/apply/applying-online/eterminal-disclaimer.
Claims 1 and 20 are rejected on the ground of nonstatutory double patenting as being unpatentable over claims 1 and 6, respectively, of U.S. Patent No. 8,643,598 B2 in view of Majid.
Claim 1
The elements of claim 1 correspond to the patented claim as follows:
Pending Claim 1: 1. An information processing system comprising circuitry configured to:
Patented Claim 1: An image processing apparatus comprising:

Pending Claim 1: detect a first gesture of a user;
Patented Claim 1: extracting means for extracting a feature point from a captured image; recognizing means for recognizing a position of the feature point;

Pending Claim 1: cause a graphic user interface (GUI) responsive to detection of the first gesture to be displayed
Patented Claim 1: display control means for performing control, based on the position of the feature point, to display a feature-point pointer indicating the feature point, and a plurality of predetermined user interfaces;

Pending Claim 1: identify an operation presented on the GUI on a basis of a second gesture made following the first gesture and executes a control command corresponding to the identified operation; and
Patented Claim 1: gesture recognizing means for recognizing a grab gesture, wherein a volume of space is enclosed within a user's palm, the volume of space corresponding to a volume of space surrounding the feature-point pointer; and issuing means for issuing, based on the position of the feature point

Pending Claim 1: terminate the GUI when the second gesture ends.
Patented Claim 1: wherein the command issued by the issuing means is capable of terminating the display of the plurality of user interfaces
As the foregoing chart shows, Patented Claim 1 does not explicitly recite that the user interface "is superimposed on currently displayed video content while the video content continues to be displayed."
Majid, however, teaches an analogous information processing system with circuitry configured to:
detect a first gesture of a user;
“As shown, the input selection process is initiated by the receipt of an input selection command (operation 602), such as with the depression of a dedicated input button or selection of an input option in a graphical user interface.” Majid ¶ 51.
cause a graphic user interface (GUI) responsive to detection of the first gesture to be displayed that is superimposed on currently displayed video content while the video content continues to be displayed;
The input selection command at operation 602 causes operations 604–606 to obtain information relevant to displaying a user interface, see Majid ¶ 51, and then, “[t]he information obtained for the on-screen display is [] provided within a display of the on-screen interface (operation 608).” Majid ¶ 52. For example, in FIG. 4, “the video display 404 overlays the user interface over the active audiovisual source with a series of input source indications and previews.” Majid ¶ 34.
identify an operation presented on the GUI on a basis of a second gesture made following the first gesture
“With the on-screen display, the display device can receive an input selection command for a particular source (operation 610).” Majid ¶ 53.
and executes a control command corresponding to the identified operation.
“Upon receipt of the input selection command, the display device can then switch to the particular input source (operation 612).” Majid ¶ 53.
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to improve the patented user interface with Majid's technique of overlaying the user interface onto the underlying input source. One would have been motivated to make this combination so that the user could continue to consume content, rather than miss it, while navigating the menu.
Claim 20
Claim 20 is rejected over patented claim 6 as combined with Majid, based on substantially the same mapping and rationale as provided above for claim 1 with respect to patented claim 1.
Conclusion
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to Justin R. Blaufeld whose telephone number is (571)272-4372. The examiner can normally be reached M-F 9:00am - 4:00pm ET.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, James K. Trujillo, can be reached at (571) 272-3677. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/Justin R. Blaufeld/Primary Examiner, Art Unit 2151