DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Continued Examination Under 37 CFR 1.114
A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on January 16, 2026, has been entered.
Response to Amendment
In the amendment dated January 16, 2026, claims 1, 2, 4, 5, 8, 9, 12, 13, 15, 16, and 19 are amended. Claims 1-20 are currently pending.
Response to Arguments
Applicant's arguments, see pages 8-10, filed on 07/15/2025, with respect to the rejection(s) of claims 1, 12, and 19 under Yu have been fully considered and are persuasive. Therefore, the rejection has been withdrawn. However, upon further consideration, a new ground(s) of rejection is made in view of Broughton et al.
On page 9 of the remarks, applicant cited the U.S. Provisional Application relied upon as No. 64/409,600; however, the correct provisional application number is 63/409,600, not 64/409,600.
On page 10 of the remarks, applicant argues that the prior art of record fails to disclose the claimed limitation "in response to the display position of the cursor being located on the target application window, adjusting the target application window based on a movement trajectory of the cursor" (emphasis added). That is, the target application window is adjusted based on a movement trajectory of the cursor on the premise that the display position of the visible cursor is within a display region of the target application window. The Office respectfully disagrees. Prior art of record Broughton et al discloses a grabber that is part of the application window, and the cursor is displayed within a display region of the target application window (e.g., see Fig 7Y):
[0295] FIG. 7Y illustrates that, in response to the user's movement of the user's hand 7020 while a grabber 706-5 is selected, the application window 702 moves in the three-dimensional environment in accordance with a position and/or amount of the movement of the user's hand 7020. Thus, the user can adjust the target application window based on the user-selected movement of the user interface object.
Applicant's arguments with respect to the newly amended limitations of claims 1, 12, and 19 have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-3, 10-14, and 17-20 are rejected under 35 U.S.C. 103 as being unpatentable over Broughton et al (PG Pub No. 2024/0152245) in view of Algreatly (PG Pub No. 2012/0075181).
As in claim 1, Broughton et al discloses an application window adjustment method, comprising:
displaying a target application window in a virtual space in response to a selection instruction for a target application; (Fig 7A-7C & 7F2 item 702 and Par 0233) discloses displaying target application window (702) in a three-dimensional environment (e.g., an environment 7000′, a virtual three-dimensional environment, an augmented reality environment, a pass-through view of a physical environment, or a camera view of a physical environment). The three-dimensional environment is an augmented reality environment that includes one or more virtual objects (e.g., an application window 702 and/or a virtual object 7028) and a representation of at least a portion of a physical environment (e.g., representations 7004′, 7006′ of walls, a representation 7008′ of a floor, and/or a representation 7014′ of a physical object 7014 in the physical environment 7000) surrounding the display generation component 7100.
wherein the virtual space is a virtual environment that is provided by the electronic device by combining a real scene and a virtual scene and that allows for human-computer interaction, and displaying the target application window in the virtual space includes displaying the target application window of the target application in the virtual environment; (Fig 7F2 & 7F3 and Par 0072, 0233) discloses [0072] Augmented reality: An augmented reality (AR) environment refers to a simulated environment in which one or more virtual objects are superimposed over a physical environment. [0233] discloses the three-dimensional environment is an augmented reality environment that includes one or more virtual objects (e.g., an application window 702 and/or a virtual object 7028) and a representation of at least a portion of a physical environment (e.g., representations 7004′, 7006′ of walls, a representation 7008′ of a floor, and/or a representation 7014′ of a physical object 7014 in the physical environment 7000) surrounding the display generation component 7100.
when a trigger operation for a target button on a controller is detected, determining a display position of a cursor corresponding to the controller; (Fig 7F-7AD2, 7AU2, 7BD2, 17 and Par 0710 & 0715) discloses displaying a cursor corresponding to the controller. The user's attention 1710 represents a gaze input and/or the location of a cursor for an input device (e.g., optionally an external input device such as a touch input (e.g., on a touch-sensitive display), a mouse, keyboard, stylus, trackpad, or other input device). [0715] FIG. 17C illustrates in response to detecting a user input (e.g., performed using the user's hand 7020 or another input device) to resize the application window 1702, for example, a drag user input (e.g., an air drag user input), optionally in combination with a selection user input (e.g., an air gesture, such as a pinch and drag gesture, or other user input such as movement of a cursor or other input device), the application window 1702 is resized in accordance with the user input.
and in response to the display position of the cursor that is visible being within a display region of the target application window [(Fig 7V-7Y and Par 0295) discloses a cursor that is visible being within a display region of the target application window], adjusting the target application window based on a movement trajectory of the cursor, wherein the display position of the cursor varies with a position of the controller; (Fig 7X-7Z and Par 0294) discloses detecting a user input (i.e., the location of the cursor on the target application) and, in response to detecting that the user input continues by moving from its respective position to another position within the three-dimensional environment, changing the position of the application window 702 in a horizontal direction (e.g., left and/or right), in a vertical direction (e.g., up and/or down), and/or in depth (e.g., forwards and/or backwards) in the three-dimensional environment.
and in response to the movement trajectory of the cursor being a curve, rotating the target application window according to the movement trajectory of the cursor; (Fig 7E-7F) discloses a movement trajectory of an input method (i.e., a cursor) that allows the user to manipulate the position and direction of the target application window based on the movement trajectory of the input (i.e., the cursor), and (Par 0349, 0359 and Fig 7AQ) discloses that the three-dimensional object 770a (i.e., a target application window) moves in a rotational direction based on the user's input. Given the operation of the cursor in prior art Broughton et al, it would have been obvious and well known in the art at the time of filing that the movement trajectory of the cursor could be linear or curved.
Furthermore, prior art Algreatly (Fig 9-14, 32-36 and Par 0019-0025) discloses that the 3D cursor can be rotated vertically, horizontally, or in a semi-circular fashion to rotate the displayed object (i.e., the target application window) in a similar manner, mimicking the user input. [0020] For example, FIG. 3 illustrates rotating the 3D cursor vertically to simultaneously rotate the object vertically on the computer display. Therefore, it would have been obvious to a person of ordinary skill in the art at the time of filing to modify Broughton et al with the teaching of Algreatly such that the well-known method of moving a controller in a rotational motion is incorporated in order to adjust the target application window to a desired location/orientation from a start point to an end point, giving the user the freedom to manipulate the displayed object with six degrees of freedom (6 DOF).
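For illustration only, the claimed trajectory-based adjustment may be sketched as follows (a minimal Python sketch; the function names, the sampled-point model, and the deviation-from-chord heuristic are hypothetical choices of the author, not drawn from Broughton et al or Algreatly):

```python
import math

def classify_trajectory(points, tolerance=0.05):
    """Classify sampled cursor positions as 'line' or 'curve' by measuring
    the maximum perpendicular deviation of the samples from the chord that
    joins the start point to the end point. The tolerance heuristic is a
    hypothetical assumption, not taken from either reference."""
    (x0, y0), (x1, y1) = points[0], points[-1]
    chord = math.hypot(x1 - x0, y1 - y0)
    if chord == 0.0:
        return "line"  # degenerate case: no net movement
    max_dev = max(
        abs((y1 - y0) * x - (x1 - x0) * y + x1 * y0 - y1 * x0) / chord
        for x, y in points
    )
    return "curve" if max_dev > tolerance * chord else "line"

# Example: a bowed path classifies as a curve, so under the claimed
# branching the window would be rotated rather than translated.
assert classify_trajectory([(0, 0), (1, 0.5), (2, 0)]) == "curve"
assert classify_trajectory([(0, 0), (1, 0.0), (2, 0)]) == "line"
```

Under this sketch, a curved trajectory triggers a rotation of the target application window and a straight trajectory triggers a translation, consistent with the claimed branching.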
As in claim 2, Broughton et al in view of Algreatly discloses the method according to claim 1, wherein the adjusting the target application window further comprises making at least one of the following: a size adjustment, a position adjustment, and a direction adjustment. (Broughton et al; Fig 7X-7Z and par 0294-0296) discloses position adjustment of the window 702.
As in claim 3, Broughton et al in view of Algreatly discloses the method according to claim 1, wherein the adjusting the target application window based on a movement trajectory of the cursor comprises: determining a motion type, a start point of movement, and an end point of movement of the cursor based on the movement trajectory of the cursor; and adjusting the target application window based on the motion type, the start point of movement, and the end point of movement of the cursor. (Broughton et al; Fig 7X-7Z and par 0294-0296) disclose the user is enabled to move the grabber 706-4 and associated application window 702 in three dimensions, including changing the position of the application window 702 in a horizontal direction (e.g., left and/or right), in a vertical direction (e.g., up and/or down), and/or in a depth (e.g., forwards and/or backwards) in the three-dimensional environment. For example, if the user moves the user's hand to the left, the application window 702 moves to the left in accordance with the user's hand movement. And in response to detecting an end of a user input, the application window 702 has been repositioned (e.g., using the grabber 706-5) to a different position in the three-dimensional environment.
As in claim 10, Broughton et al in view of Algreatly discloses the method according to claim 1, wherein the method further comprises: when it is detected that the target button on the controller is released, stopping the adjustment operation on the target application window. (Broughton et al; Fig 7X-7Z and par 0294-0296) discloses the application window 702 moves in the three-dimensional environment in accordance with a position and/or amount of the movement of the user's hand 7020. For example, if the user moves the user's hand to the left, the application window 702 moves to the left in accordance with the user's hand movement. And in response to detecting an end of a user input, the application window 702 has been repositioned (e.g., using the grabber 706-5) to a different position in the three-dimensional environment.
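For illustration only, the press-and-release gating recited in claim 10 may be sketched as follows (hypothetical Python; the event names and the window methods contains() and apply_adjustment() are assumptions, not taken from Broughton et al):

```python
from dataclasses import dataclass, field

@dataclass
class AdjustSession:
    """Hypothetical tracking state for an in-progress adjustment."""
    active: bool = False
    points: list = field(default_factory=list)

def on_controller_event(event_type, cursor_pos, session, window):
    """Begin adjusting on button press, track the cursor while the target
    button is held, and stop the adjustment when the button is released."""
    if event_type == "button_down" and window.contains(cursor_pos):
        session.active = True            # cursor within the window: start
        session.points = [cursor_pos]
    elif event_type == "cursor_move" and session.active:
        session.points.append(cursor_pos)
        window.apply_adjustment(session.points)  # adjust while button held
    elif event_type == "button_up":
        session.active = False           # release stops the adjustment
```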
As in claim 11, Broughton et al in view of Algreatly discloses the method according to claim 1, wherein the target application is a 2D application. (Broughton et al; Fig 7X-7Z item 702)
As in claim 12, Broughton et al discloses an electronic device (Fig 1), comprising:
a processor and a memory, wherein the memory is configured to store a computer program, and the processor is configured to call and run the computer program stored in the memory, to perform:
displaying a target application window in a virtual space in response to a selection instruction for a target application; (Fig 7A-7C & 7F2 item 702 and Par 0233) discloses displaying target application window (702) in a three-dimensional environment (e.g., an environment 7000′, a virtual three-dimensional environment, an augmented reality environment, a pass-through view of a physical environment, or a camera view of a physical environment). The three-dimensional environment is an augmented reality environment that includes one or more virtual objects (e.g., an application window 702 and/or a virtual object 7028) and a representation of at least a portion of a physical environment (e.g., representations 7004′, 7006′ of walls, a representation 7008′ of a floor, and/or a representation 7014′ of a physical object 7014 in the physical environment 7000) surrounding the display generation component 7100.
wherein the virtual space is a virtual environment that is provided by the electronic device by combining a real scene and a virtual scene and that allows for human-computer interaction, and displaying the target application window in the virtual space includes displaying the target application window of the target application in the virtual environment; (Fig 7F2 & 7F3 and Par 0072, 0233) discloses [0072] Augmented reality: An augmented reality (AR) environment refers to a simulated environment in which one or more virtual objects are superimposed over a physical environment. [0233] discloses the three-dimensional environment is an augmented reality environment that includes one or more virtual objects (e.g., an application window 702 and/or a virtual object 7028) and a representation of at least a portion of a physical environment (e.g., representations 7004′, 7006′ of walls, a representation 7008′ of a floor, and/or a representation 7014′ of a physical object 7014 in the physical environment 7000) surrounding the display generation component 7100.
when a trigger operation for a target button on a controller is detected, determining a display position of a cursor corresponding to the controller; (Fig 7F-7AD2, 7AU2, 7BD2, 17 and Par 0710 & 0715) discloses displaying a cursor corresponding to the controller. The user's attention 1710 represents a gaze input and/or the location of a cursor for an input device (e.g., optionally an external input device such as a touch input (e.g., on a touch-sensitive display), a mouse, keyboard, stylus, trackpad, or other input device). [0715] FIG. 17C illustrates in response to detecting a user input (e.g., performed using the user's hand 7020 or another input device) to resize the application window 1702, for example, a drag user input (e.g., an air drag user input), optionally in combination with a selection user input (e.g., an air gesture, such as a pinch and drag gesture, or other user input such as movement of a cursor or other input device), the application window 1702 is resized in accordance with the user input.
and in response to the display position of the cursor that is visible being within a display region of the target application window [(Fig 7V-7Y and Par 0295) discloses a cursor that is visible being within a display region of the target application window], adjusting the target application window based on a movement trajectory of the cursor, wherein the display position of the cursor varies with a position of the controller; (Fig 7X-7Z and Par 0294) discloses detecting a user input (i.e., the location of the cursor on the target application) and, in response to detecting that the user input continues by moving from its respective position to another position within the three-dimensional environment, changing the position of the application window 702 in a horizontal direction (e.g., left and/or right), in a vertical direction (e.g., up and/or down), and/or in depth (e.g., forwards and/or backwards) in the three-dimensional environment.
and in response to the movement trajectory of the cursor being a curve, rotating the target application window according to the movement trajectory of the cursor; (Fig 7E-7F) discloses a movement trajectory of an input method (i.e., a cursor) that allows the user to manipulate the position and direction of the target application window based on the movement trajectory of the input (i.e., the cursor), and (Par 0349, 0359 and Fig 7AQ) discloses that the three-dimensional object 770a (i.e., a target application window) moves in a rotational direction based on the user's input. Given the operation of the cursor in prior art Broughton et al, it would have been obvious and well known in the art at the time of filing that the movement trajectory of the cursor could be linear or curved.
Furthermore, prior art Algreatly (Fig 9-14, 32-36 and Par 0019-0025) discloses that the 3D cursor can be rotated vertically, horizontally, or in a semi-circular fashion to rotate the displayed object (i.e., the target application window) in a similar manner, mimicking the user input. [0020] For example, FIG. 3 illustrates rotating the 3D cursor vertically to simultaneously rotate the object vertically on the computer display. Therefore, it would have been obvious to a person of ordinary skill in the art at the time of filing to modify Broughton et al with the teaching of Algreatly such that the well-known method of moving a controller in a rotational motion is incorporated in order to adjust the target application window to a desired location/orientation from a start point to an end point, giving the user the freedom to manipulate the displayed object with six degrees of freedom (6 DOF).
As in claim 13, Broughton et al in view of Algreatly discloses the electronic device according to claim 12, wherein the adjusting the target application window further comprises making at least one of the following: a size adjustment, a position adjustment, and a direction adjustment. (Broughton et al; Fig 7X-7Z and par 0294-0296) discloses a position adjustment of the window 702.
As in claim 14, Broughton et al in view of Algreatly discloses the electronic device according to claim 12, wherein the adjusting the target application window based on a movement trajectory of the cursor comprises: determining a motion type, a start point of movement, and an end point of movement of the cursor based on the movement trajectory of the cursor; and adjusting the target application window based on the motion type, the start point of movement, and the end point of movement of the cursor. (Broughton et al; Fig 7X-7Z and par 0294-0296) disclose the user is enabled to move the grabber 706-4 and associated application window 702 in three dimensions, including changing the position of the application window 702 in a horizontal direction (e.g., left and/or right), in a vertical direction (e.g., up and/or down), and/or in a depth (e.g., forwards and/or backwards) in the three-dimensional environment. For example, if the user moves the user's hand to the left, the application window 702 moves to the left in accordance with the user's hand movement. And in response to detecting an end of a user input, the application window 702 has been repositioned (e.g., using the grabber 706-5) to a different position in the three-dimensional environment.
As in claim 17, Broughton et al in view of Algreatly discloses the electronic device according to claim 12, wherein the processor is further configured to perform: when it is detected that the target button on the controller is released, stopping the adjustment operation on the target application window. (Broughton et al; Fig 7X-7Z and par 0294-0296) discloses the application window 702 moves in the three-dimensional environment in accordance with a position and/or amount of the movement of the user's hand 7020. For example, if the user moves the user's hand to the left, the application window 702 moves to the left in accordance with the user's hand movement. And in response to detecting an end of a user input, the application window 702 has been repositioned (e.g., using the grabber 706-5) to a different position in the three-dimensional environment.
As in claim 18, Broughton et al in view of Algreatly discloses the electronic device according to claim 12, wherein the target application is a 2D application. (Broughton et al; Fig 7X-7Z item 702)
As in claim 19, Broughton et al discloses a computer-readable storage medium, which is configured to store a computer program, wherein the computer program causes a computer to perform:
displaying a target application window in a virtual space in response to a selection instruction for a target application; (Fig 7A-7C & 7F2 item 702 and Par 0233) discloses displaying target application window (702) in a three-dimensional environment (e.g., an environment 7000′, a virtual three-dimensional environment, an augmented reality environment, a pass-through view of a physical environment, or a camera view of a physical environment). The three-dimensional environment is an augmented reality environment that includes one or more virtual objects (e.g., an application window 702 and/or a virtual object 7028) and a representation of at least a portion of a physical environment (e.g., representations 7004′, 7006′ of walls, a representation 7008′ of a floor, and/or a representation 7014′ of a physical object 7014 in the physical environment 7000) surrounding the display generation component 7100.
wherein the virtual space is a virtual environment that is provided by the electronic device by combining a real scene and a virtual scene and that allows for human-computer interaction, and displaying the target application window in the virtual space includes displaying the target application window of the target application in the virtual environment; (Fig 7F2 & 7F3 and Par 0072, 0233) discloses [0072] Augmented reality: An augmented reality (AR) environment refers to a simulated environment in which one or more virtual objects are superimposed over a physical environment. [0233] discloses the three-dimensional environment is an augmented reality environment that includes one or more virtual objects (e.g., an application window 702 and/or a virtual object 7028) and a representation of at least a portion of a physical environment (e.g., representations 7004′, 7006′ of walls, a representation 7008′ of a floor, and/or a representation 7014′ of a physical object 7014 in the physical environment 7000) surrounding the display generation component 7100.
when a trigger operation for a target button on a controller is detected, determining a display position of a cursor corresponding to the controller; (Fig 7F-7AD2, 7AU2, 7BD2, 17 and Par 0710 & 0715) discloses displaying a cursor corresponding to the controller. The user's attention 1710 represents a gaze input and/or the location of a cursor for an input device (e.g., optionally an external input device such as a touch input (e.g., on a touch-sensitive display), a mouse, keyboard, stylus, trackpad, or other input device). [0715] FIG. 17C illustrates in response to detecting a user input (e.g., performed using the user's hand 7020 or another input device) to resize the application window 1702, for example, a drag user input (e.g., an air drag user input), optionally in combination with a selection user input (e.g., an air gesture, such as a pinch and drag gesture, or other user input such as movement of a cursor or other input device), the application window 1702 is resized in accordance with the user input.
and in response to the display position of the cursor that is visible being within a display region of the target application window [(Fig 7Y and Par 0295) discloses a cursor that is visible being within a display region of the target application window], adjusting the target application window based on a movement trajectory of the cursor, wherein the display position of the cursor varies with a position of the controller; (Fig 7X-7Z and Par 0294) discloses detecting a user input (i.e., the location of the cursor on the target application) and, in response to detecting that the user input continues by moving from its respective position to another position within the three-dimensional environment, changing the position of the application window 702 in a horizontal direction (e.g., left and/or right), in a vertical direction (e.g., up and/or down), and/or in depth (e.g., forwards and/or backwards) in the three-dimensional environment.
and in response to the movement trajectory of the cursor being a curve, rotating the target application window according to the movement trajectory of the cursor; (Fig 7E-7F) discloses a movement trajectory of an input method (i.e., a cursor) that allows the user to manipulate the position and direction of the target application window based on the movement trajectory of the input (i.e., the cursor), and (Par 0349, 0359 and Fig 7AQ) discloses that the three-dimensional object 770a (i.e., a target application window) moves in a rotational direction based on the user's input. Given the operation of the cursor in prior art Broughton et al, it would have been obvious and well known in the art at the time of filing that the movement trajectory of the cursor could be linear or curved.
Furthermore, prior art Algreatly (Fig 9-14, 32-36 and Par 0019-0025) discloses that the 3D cursor can be rotated vertically, horizontally, or in a semi-circular fashion to rotate the displayed object (i.e., the target application window) in a similar manner, mimicking the user input. [0020] For example, FIG. 3 illustrates rotating the 3D cursor vertically to simultaneously rotate the object vertically on the computer display. Therefore, it would have been obvious to a person of ordinary skill in the art at the time of filing to modify Broughton et al with the teaching of Algreatly such that the well-known method of moving a controller in a rotational motion is incorporated in order to adjust the target application window to a desired location/orientation from a start point to an end point, giving the user the freedom to manipulate the displayed object with six degrees of freedom (6 DOF).
As in claim 20, Broughton et al in view of Algreatly discloses a non-transitory computer program product comprising program instructions, wherein the program instructions, when run on an electronic device, cause the electronic device to perform the method according to claim 1. (Broughton et al; Fig 1-3 and Par 0006, 0180) discloses the computer system has a graphical user interface (GUI), one or more processors, memory and one or more modules, programs or sets of instructions stored in the memory for performing multiple functions. This program may, for example, move and modify images presented on the display generation component 120, or perform other functions, in response to the pose and/or gesture information.
Claims 4-9 and 15-16 are rejected under 35 U.S.C. 103 as being unpatentable over Broughton et al (PG Pub No. 2024/0152245) in view of Algreatly (PG Pub No. 2012/0075181) and in further view of Jung et al (PG Pub No. 2016/0224134).
As in claim 4, Broughton et al in view of Algreatly discloses the method according to claim 3, wherein the determining a motion type of the cursor based on the movement trajectory of the cursor comprises: in response to the movement trajectory of the cursor being a curve, determining that the motion type of the cursor is a curvilinear motion; or in response to the movement trajectory of the cursor being a straight line, determining that the motion type of the cursor is a rectilinear motion. (Broughton et al; Fig 7E-7F) discloses that the movement trajectory of the cursor is a straight line; thus, the motion type of the cursor is a linear/rectilinear motion. Given the operation of the cursor in prior art Broughton et al, it would have been obvious and well known in the art at the time of filing that the movement trajectory of the cursor could be linear or curved. Furthermore, prior art Jung et al (Fig 5B, 10) discloses movement of the controller in a curvilinear motion to control the motion of a displayed cursor. Therefore, it would have been obvious to a person of ordinary skill in the art at the time of filing to modify Broughton et al in view of Algreatly with the teaching of Jung et al such that the well-known method of moving a controller in a curvilinear motion is incorporated in order to adjust the target application window to a desired location from a start point of movement to an end point of movement based on a determined angle of movement, giving the user free-hand operation (6 DOF).
As in claim 5, Broughton et al in view of Algreatly in further view of Jung et al discloses the method according to claim 4, wherein the adjusting the target application window based on the motion type, the start point of movement, and the end point of movement of the cursor comprises:
when the motion type of the cursor is the rectilinear motion, determining whether the start point of movement of the cursor is located on a window border of the target application window; and in response to the start point of movement of the cursor being located on the window border, making a size adjustment for the target application window based on the start point of movement and the end point of movement; or in response to the start point of movement of the cursor being located on a main body of the window, making a position adjustment and a direction adjustment for the target application window based on the start point of movement and the end point of movement. (Broughton et al; Fig 7AU2-7BA1) discloses a method of controlling a displayed object based on movement of a pointer on the display screen, wherein, based on the pointer location near the displayed object, the user can move the application window or change the size of the application window (i.e., a size adjustment).
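For illustration only, the claimed border/body branching may be sketched as follows (hypothetical Python; the Window fields and the eight-pixel border threshold are assumptions of the author, not taken from any reference):

```python
from dataclasses import dataclass

@dataclass
class Window:
    """Hypothetical stand-in for the target application window."""
    x: float
    y: float
    w: float
    h: float

def hit_region(win, point, border_px=8.0):
    """Return 'border' if the point lies on the window frame, 'body' if it
    is inside the window, or None if it is outside the window entirely."""
    px, py = point
    if not (win.x <= px <= win.x + win.w and win.y <= py <= win.y + win.h):
        return None
    near_edge = min(px - win.x, win.x + win.w - px,
                    py - win.y, win.y + win.h - py) < border_px
    return "border" if near_edge else "body"

def rectilinear_adjust(win, start, end):
    """Per the claimed branching: resize when the drag starts on the window
    border; move the window when the drag starts on the window body."""
    dx, dy = end[0] - start[0], end[1] - start[1]
    if hit_region(win, start) == "border":
        win.w += dx
        win.h += dy  # size adjustment
    else:
        win.x += dx
        win.y += dy  # position adjustment
```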
As in claim 6, Broughton et al in view of Algreatly in further view of Jung et al discloses the method according to claim 5, wherein the making a size adjustment for the target application window based on the start point of movement and the end point of movement comprises: with the start point of movement as a start point, controlling the window border to move from the start point of movement to the end point of movement, to make the size adjustment for the target application window. (Broughton et al; Fig 17E-17F) discloses making a size adjustment for the target application window (1702) based on the start point of movement (Fig 17E) and the end point of movement (Fig 17F), wherein the window border is controlled to move.
As in claim 7, Broughton et al in view of Algreatly in further view of Jung et al discloses the method according to claim 5, wherein the making a position adjustment and a direction adjustment for the target application window based on the start point of movement and the end point of movement comprises: with the start point of movement as a start point, controlling the target application window to move from the start point of movement to the end point of movement, to make the position adjustment and the direction adjustment for the target application window. (Broughton et al; Fig 7Y-7Z) discloses making a position adjustment and a direction adjustment for the target application window by controlling the target application window (702) to move from the start point of movement (Fig 7Y) to the end point of movement (Fig 7Z).
As in claim 8, Broughton et al in view of Algreatly in further view of Jung et al discloses the method according to claim 4, wherein the adjusting the target application window based on the motion type, the start point of movement, and the end point of movement of the cursor comprises: when the motion type of the cursor is the curvilinear motion, determining an angle of movement between the start point of movement and the end point of movement; and making the angle adjustment for the target application window based on the angle of movement. (Broughton et al; Fig 7E-7F, 7K3-7L, 7Y-7Z and 7AV-7AW) discloses adjusting the target application window (720/770) based on the motion type, the start point of movement, and the end point of movement of the cursor, wherein the size or location of the target application window (720/770) is adjusted, but fails to explicitly disclose that the motion type of the cursor is the curvilinear motion. However, it would have been obvious and well known in the art that, when moving a cursor in a displayed environment, one would be able to perform an input (i.e., adjust the target application window) from a start point of movement to an end point of movement by moving said cursor in a curvilinear motion relative to the virtual space. Furthermore, prior art Jung et al (Fig 10) discloses movement of the controller in a curvilinear motion to control the motion of a displayed cursor. Therefore, it would have been obvious to a person of ordinary skill in the art at the time of filing to modify Broughton et al in view of Algreatly with the teaching of Jung et al such that the well-known method of moving a controller in a curvilinear motion is incorporated in order to adjust the target application window to a desired location from a start point of movement to an end point of movement based on a determined angle of movement, giving the user free-hand operation (6 DOF).
As in claim 9, Broughton et al in view of Algreatly in further view of Jung et al discloses the method according to claim 8, wherein the making the angle adjustment for the target application window based on the angle of movement comprises: with the start point of movement as a start point, controlling the target application window to rotate by the angle of movement from a current angle corresponding to the start point, to make the angle adjustment for the target application window. Broughton et al (Fig 7E-7F, 7K3-7L, 7Y-7Z and 7AV-7AW) discloses adjusting the target application window (720/770) based on the motion type, the start point of movement, and the end point of movement of the cursor, wherein the size or location of the target application window (720/770) is adjusted. Jung et al (Fig 10) discloses movement of the controller in a curvilinear motion to control the motion of a displayed cursor. Therefore, it would have been obvious and well known in the art that, when moving a cursor in a displayed environment, one would be able to perform an input (i.e., adjust the target application window) from a start point of movement to an end point of movement by moving said cursor in a different orientation in order to move the target application window in an angular, rotational, or curvilinear motion to yield the same predictable result (i.e., moving the target application window).
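For illustration only, the claimed angle-of-movement rotation may be sketched as follows (hypothetical Python; the choice of the window center as the rotation pivot is an assumption of the author, not taken from any reference):

```python
import math

def angle_of_movement(start, end, pivot):
    """Signed angle (radians) swept from the start point to the end point
    about a pivot (e.g., the window center)."""
    a0 = math.atan2(start[1] - pivot[1], start[0] - pivot[0])
    a1 = math.atan2(end[1] - pivot[1], end[0] - pivot[0])
    # Wrap into [-pi, pi) so the rotation follows the shorter arc.
    return (a1 - a0 + math.pi) % (2.0 * math.pi) - math.pi

def rotate_window(current_angle, start, end, pivot):
    """Rotate the window by the angle of movement from its current angle,
    per the claimed adjustment."""
    return current_angle + angle_of_movement(start, end, pivot)

# Example: a quarter-circle sweep about the origin rotates the window by
# 90 degrees from its current angle.
assert abs(angle_of_movement((1, 0), (0, 1), (0, 0)) - math.pi / 2) < 1e-9
```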
As in claim 15, Broughton et al in view of Algreatly in further view of Jung et al discloses the electronic device according to claim 14, wherein the determining a motion type of the cursor based on the movement trajectory of the cursor comprises: if the movement trajectory of the cursor is a curve, determining that the motion type of the cursor is a curvilinear motion; or if the movement trajectory of the cursor is a straight line, determining that the motion type of the cursor is a rectilinear motion. Broughton et al (Fig 7E-7F) discloses that the movement trajectory of the cursor is a straight line; thus, the motion type of the cursor is a linear/rectilinear motion. Given the operation of the cursor in prior art Broughton et al, it would have been obvious and well known in the art at the time of filing that the movement trajectory of the cursor could be linear or curved. Furthermore, prior art Jung et al (Fig 10) discloses movement of the controller in a curvilinear motion to control the motion of a displayed cursor.
As in claim 16, Broughton et al in view of Algreatly in further view of Jung et al discloses the electronic device according to claim 15, wherein the adjusting the target application window based on the motion type, the start point of movement, and the end point of movement of the cursor comprises: when the motion type of the cursor is the rectilinear motion, determining whether the start point of movement of the cursor is located on a window border of the target application window [Broughton et al (Fig 7K3-7L) discloses that the motion type of the cursor is the rectilinear motion]; and if the start point of movement of the cursor is located on the window border, making a size adjustment for the target application window based on the start point of movement and the end point of movement; or if the start point of movement of the cursor is located on a main body of the window, making a position adjustment and a direction adjustment for the target application window based on the start point of movement and the end point of movement. Broughton et al (Fig 7E-7F, 7K3-7L, 7Y-7Z and 7AV-7AW) discloses a method of controlling a displayed object based on movement of a pointer on the display screen, wherein, based on the pointer location near the displayed object, the user can move the application window or change the size of the application window (i.e., a size adjustment).
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to BENYAM KETEMA whose telephone number is (571)270-7224. The examiner can normally be reached 9AM-5PM (M-F).
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Temesghen Ghebretinsae can be reached on 571-272-3017. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/BENYAM KETEMA/Primary Examiner, Art Unit 2626