Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Information Disclosure Statement
The information disclosure statement (IDS) submitted on March 20, 2025 has been considered by the examiner.
Specification
The title of the invention is not descriptive. A new title is required that is clearly indicative of the invention to which the claims are directed.
Claim Rejections - 35 USC § 112
Claims 2-5, 12-15 and 17-20 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention. Specifically, in line 2 of claim 2, the phrase "preset type" renders the claim indefinite because the word "type" extends the claim to encompass elements not actually disclosed, thereby rendering the scope of the claim unascertainable.
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claims 1, 6-11 and 16 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Bao (US 2022/0164101).
As per claim 1, Bao discloses: An interaction method, wherein the method is applied to an electronic device, the electronic device comprises a display component and one or more input devices, the method comprises:
displaying a computer-generated three-dimensional environment 3-1 & 4-1 at the display component; presenting an interaction object 3-11 & 4-11 within the three-dimensional environment 3-1 & 4-1; acquiring interaction information input by a user through the input devices {[0046] For example, FIG. 3 is a first schematic diagram of game interaction in a shooting game in the related art. Referring to FIG. 3, in a game interface 3-1, the player needs to tap an attack button 3-11 on a left side, to implement a simultaneous interaction of any two of the moving, aiming, and shooting. However, in this case, the player can only use one weapon, and the player needs to manually tap at the bottom to switch between different weapons. FIG. 4 is a second schematic diagram of game interaction in a shooting game in the related art. Referring to FIG. 4, a right joystick mechanism is added to an attack button 4-11 on a right side of a game interface 4-1, to implement an aiming operation during attacking.};
inputting the interaction information to an interaction manager {figure 6};
determining a target interaction object corresponding to the interaction information when the interaction manager judges that the interaction information satisfies a current interaction scene; performing an operation corresponding to the interaction information on the target interaction object when the target interaction object accepts the interaction information. {[0066] FIG. 8 shows an interaction interface of a target game application displayed in a mobile terminal in a normal state according to an embodiment of the present disclosure. As shown in FIG. 8, a virtual joystick and a virtual button are displayed on a left side of the interaction interface. The player may tap the virtual button to perform a shooting operation, and move the virtual joystick to perform movement. In addition, five virtual buttons are further displayed on a right side of the interaction interface, and the player may perform operations such as crawling, aiming, squatting, bullet loading, and jumping by using the five virtual buttons.}
As per claim 6, Bao discloses: The interaction method according to claim 1, wherein after the performing the operation corresponding to the interaction information on the target interaction object, the method further comprises: detecting, by an interaction scene manager, whether the interaction information satisfies a condition for a transition of the current interaction scene to a target interaction scene; in response to detecting that the interaction information satisfies the condition for the transition of the current interaction scene to the target interaction scene, transitioning the current interaction scene to the target interaction scene {[0075] In the interface shown in FIG. 12, the player taps a shooting key on a left side of a back surface of the folded screen with the left hand, to perform a shooting operation in the game. The electronic device will display a tap feedback in response to the shooting operation. In the interface shown in FIG. 12, a gun currently used by the player is a gun on the left side. FIG. 13 shows a second interface of a two-hand operation mode. The interface is located on a right side region of the back display region. In the interface shown in FIG. 13, the player taps a shooting key (a touch key marked by a bullet) on a right side with the right hand, and the electronic device will display a tap feedback in response to the shooting operation. In the interface shown in FIG. 13, a gun currently used by the player is a weapon on the right side. In the two-hand operation mode, an interaction manner is changed to that the left hand controls the movement of the virtual operation object by using a joystick, and the right hand controls a visual field range of the virtual operation object on a front surface of the folded screen.}.
As per claim 7, Bao discloses: The interaction method according to claim 6, wherein the current interaction scene comprises a far-field interaction scene, the target interaction scene comprises a near-field interaction scene; the detecting, by the interaction scene manager, whether the interaction information satisfies the condition for the transition of the current interaction scene to the target interaction scene comprises: detecting, by the interaction scene manager, whether an interaction position corresponding to the interaction information is located in a near-field interaction region for the target interaction object; in response to detecting that the interaction position corresponding to the interaction information is located in the near-field interaction region for the target interaction object, the interaction scene manager judging that the interaction information satisfies a condition for a transition of the far-field interaction scene to the near-field interaction scene {[0074] For example, the player may trigger a two-hand operation mode (in which the back display region is displayed) after touching and holding a region of a front display screen of a folded screen shown in FIG. 8 with left and right hands for a specific time, and a countdown indicating that the two-hand operation is to be triggered may be displayed. FIG. 9 shows a schematic diagram of an interface in which an electronic device displays two touch regions according to an embodiment of the present disclosure. As shown in FIG. 9, the electronic device displays two touch regions. The player may trigger the countdown indicating that the two-hand operation mode is to be triggered after simultaneously tapping, for a preset time, the two touch regions with two hands in a phone holding gesture for triggering the two-hand operation as shown in FIG. 10. After the player triggers the two-hand operation, two progress countdowns appear in the interface.
The two-hand operation mode is triggered and an interface layout may be adjusted after a progress bar is completed. An example interaction interface provided in FIG. 11 may further display a countdown prompt, and the two-hand operation mode is displayed after the countdown ends. FIG. 12 shows a layout diagram in which the layout of the target game interface has changed after the countdown ends, which is a first interface of a two-hand operation mode. The interface is located on a left side region of the back display region.}.
As per claim 8, Bao discloses: The interaction method according to claim 6, wherein the method further comprises: setting a new interaction scene in the interaction manager, and setting a transition condition for a transition between the new interaction scene and another interaction scene in the interaction scene manager {figures 8-9 and 11-13}.
As per claim 9, Bao discloses: The interaction method according to claim 1, wherein the input devices comprise at least one of a handle, a mouse, a keyboard, a touch pad, a head-mounted device, an image collection device and a voice collection device; the interaction information comprises at least one of ray information, key information and trigger information input through the handle, mouse information input through the mouse, keyboard information input through the keyboard, touch information input through the touch pad, key information input through the head-mounted device, voice information input through the voice collection device, and gesture information and eye movement information input through the image collection device {figure 10}.
As per claim 10, Bao discloses: The interaction method according to claim 1, wherein the current interaction scene comprises any one of a near-field interaction scene, a far-field interaction scene, a far-field pure ray interaction scene and a pure eye-hand interaction scene {figures 8-9 and 11-13}.
As per claim 11, Bao discloses: An electronic device, comprising a memory, a processor, a computer program stored in the memory and executable on the processor, a display component, and one or more input devices, wherein the computer program, when executed by the processor, implements an interaction method comprising:
displaying a computer-generated three-dimensional environment 3-1 & 4-1 at the display component; presenting an interaction object 3-11 & 4-11 within the three-dimensional environment 3-1 & 4-1; acquiring interaction information input by a user through the input devices {[0046] For example, FIG. 3 is a first schematic diagram of game interaction in a shooting game in the related art. Referring to FIG. 3, in a game interface 3-1, the player needs to tap an attack button 3-11 on a left side, to implement a simultaneous interaction of any two of the moving, aiming, and shooting. However, in this case, the player can only use one weapon, and the player needs to manually tap at the bottom to switch between different weapons. FIG. 4 is a second schematic diagram of game interaction in a shooting game in the related art. Referring to FIG. 4, a right joystick mechanism is added to an attack button 4-11 on a right side of a game interface 4-1, to implement an aiming operation during attacking.};
inputting the interaction information to an interaction manager {figure 6};
determining a target interaction object corresponding to the interaction information when the interaction manager judges that the interaction information satisfies a current interaction scene; performing an operation corresponding to the interaction information on the target interaction object when the target interaction object accepts the interaction information {[0066] FIG. 8 shows an interaction interface of a target game application displayed in a mobile terminal in a normal state according to an embodiment of the present disclosure. As shown in FIG. 8, a virtual joystick and a virtual button are displayed on a left side of the interaction interface. The player may tap the virtual button to perform a shooting operation, and move the virtual joystick to perform movement. In addition, five virtual buttons are further displayed on a right side of the interaction interface, and the player may perform operations such as crawling, aiming, squatting, bullet loading, and jumping by using the five virtual buttons.}.
As per claim 16, Bao discloses: A non-transitory computer readable storage medium on which a computer program is stored, wherein the computer program, when executed by a processor, implements an interaction method comprising:
displaying a computer-generated three-dimensional environment 3-1 & 4-1 at the display component; presenting an interaction object 3-11 & 4-11 within the three-dimensional environment 3-1 & 4-1; acquiring interaction information input by a user through the input devices {[0046] For example, FIG. 3 is a first schematic diagram of game interaction in a shooting game in the related art. Referring to FIG. 3, in a game interface 3-1, the player needs to tap an attack button 3-11 on a left side, to implement a simultaneous interaction of any two of the moving, aiming, and shooting. However, in this case, the player can only use one weapon, and the player needs to manually tap at the bottom to switch between different weapons. FIG. 4 is a second schematic diagram of game interaction in a shooting game in the related art. Referring to FIG. 4, a right joystick mechanism is added to an attack button 4-11 on a right side of a game interface 4-1, to implement an aiming operation during attacking.};
inputting the interaction information to an interaction manager {figure 6};
determining a target interaction object corresponding to the interaction information when the interaction manager judges that the interaction information satisfies a current interaction scene; performing an operation corresponding to the interaction information on the target interaction object when the target interaction object accepts the interaction information {[0066] FIG. 8 shows an interaction interface of a target game application displayed in a mobile terminal in a normal state according to an embodiment of the present disclosure. As shown in FIG. 8, a virtual joystick and a virtual button are displayed on a left side of the interaction interface. The player may tap the virtual button to perform a shooting operation, and move the virtual joystick to perform movement. In addition, five virtual buttons are further displayed on a right side of the interaction interface, and the player may perform operations such as crawling, aiming, squatting, bullet loading, and jumping by using the five virtual buttons.}.
Allowable Subject Matter
Claims 2-5, 12-15 and 17-20 would be allowable if rewritten to overcome the rejection(s) under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, set forth in this Office action and to include all of the limitations of the base claim and any intervening claims.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to DAVID D DAVIS whose telephone number is (571)272-7572. The examiner can normally be reached Monday - Friday, 8 a.m. - 4 p.m.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Ke Xiao can be reached at 571-272-7776. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/DAVID D DAVIS/Primary Examiner, Art Unit 2627
DDD