DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first
inventor to file provisions of the AIA.
Status of Claims
This action is in reply to the response filed on December 23, 2025.
Claim(s) 4, 6, 10, 16, and 20 have been canceled.
Claim(s) 33-36 have been added.
Claims 1-3, 5, 7, 8, 11-13, 15, 17, and 18 have been amended.
Claim(s) 1-3, 5, 7-9, 11-13, 15, 17-19, and 31-36 are currently pending and have been examined.
This action is made Final.
Response to Arguments
Applicant argued that the prior art did not teach or suggest identifying, in the virtual space, a plurality of virtual payment locations, wherein each of the plurality of virtual payment locations is projected onto a different physical location within the physical space. Examiner disagrees. The Sinha reference discloses, “…as the user moves his or her arm, the client AR device captures the movement via the cameras and then determines that the user is selecting a particular box, window, or other element displayed to the user in the AR experience.”; “By using this or other selection mechanisms…the user is determined to have clicked or selected the ‘Buy Now’ box.” (See Sinha, col 8, lines 35-45, 50-55). The cited disclosure in Sinha teaches or suggests identifying, in the virtual space, a virtual payment location, wherein the virtual payment location is projected onto a physical location within the physical space. The disclosure in Sinha is not limited to one virtual location and one physical location. Therefore, Examiner finds Applicant’s argument non-persuasive.
Applicant argued that the prior art did not teach or suggest detecting a collision between a virtual object presented by the extended reality device and a target virtual payment location of the plurality of virtual payment locations. Examiner disagrees. The Tian reference discloses the detection of collisions between virtual objects and between virtual objects and real-world objects (Tian: pgh 3), which teaches detecting a collision between a virtual object presented by the extended reality device and a target virtual payment location of the plurality of virtual payment locations. Therefore, Examiner finds Applicant’s argument non-persuasive.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The factual inquiries set forth in Graham v. John Deere Co., 383 U.S. 1, 148 USPQ 459 (1966), that are applied for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 1-3, 5, 7-9, 11-13, 15, 17-19, and 31-36 are rejected under 35 U.S.C. 103 as being unpatentable over Chaikin (WO2023/205440) in view of Tian (US 2022/0230383) in further view of Sukhija (US 2021/0065175) in further view of Sinha (US 11,037,116).
Regarding claim(s) 1 and 11:
Chaikin teaches:
a communication port; (Chaikin: pgh 31, “…the virtual memory may be provided inworld in communication through the inworld/physical world intersection via an intermediary processor…”)
a memory storing instructions; and (Chaikin: pgh 61, “A computing device…may include one or more processors, memory, interfaces…and software.”)
initiate, based on the collision, a payment; (Chaikin: pgh 50, “A transaction may be initiated by a virtual terminal causing the virtual terminal to create a transaction initiation signal in operation. For example, the transaction may be initiated by an avatar inworld.”)
transmit, from the extended reality device and via the communication port, a payment request; (Chaikin: pgh 56, “…the intermediary processor may generate an authorization request signal in operation…The authorization request signal may be transmitted from the intermediary processor to an authorization processor…”)
receive, at the extended reality device, confirmation of the payment; and generate, for output, the confirmation of the payment. (Chaikin: pgh 56, “…the intermediary processor may generate/send communications to the physical user’s typical card transaction process so as to receive commitment of payment on the physical user’s bank/credit account by an issuing bank through a card network.”)
Chaikin does not teach, however, Tian teaches:
configured to execute instructions to: map, at an extended reality device positioned within a physical space, a virtual space to the physical space; (Tian: pgh 37, “The computer system is configured for depth map processing that removes outlier, densifies the depth map, and enables blending at the occlusion boundaries between real objects and virtual objects.”)
detect a collision between a virtual object presented by the extended reality device and a target virtual payment location of the plurality of virtual payment locations, (Tian: pgh 34, “Embodiments of the present disclosure are directed to, among other things, accurate and real-time detection of occlusions and collisions between virtual objects and between virtual objects and real-world objects…”)
It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to have modified Chaikin to include the teachings of Tian because it is important that the placement is accurate and performed in real time (Tian: pgh 3). Chaikin/Tian does not teach, however, Sukhija teaches:
control circuitry communicably coupled to the memory and the communication port and (Sukhija: pgh 10, “The payment terminal includes a processor and a memory communicatively coupled to the processor.”)
It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to have modified Chaikin/Tian to include the teachings of Sukhija because “Another issue with the existing techniques is that payments are made in a virtual environment and a vendor and/or manufacturer of the VR device may have to initiate the transaction to merchant in the physical environment, hence increasing intermediaries.” (Sukhija: pgh 5). Chaikin/Tian/Sukhija does not teach, however, Sinha teaches:
identify, in the virtual space, a plurality of virtual payment locations, wherein each of the plurality of virtual payment locations is projected onto a different physical location within the physical space; (Sinha: col 8, lines 25-30, “The user interface also includes a ‘buy now’ button, which may be selected to initiate the payment process…”; col 8, lines 35-45, “…as the user moves his or her arm, the client AR device captures the movement via the cameras and then determines that the user is selecting a particular box, window, or other element displayed to the user in the AR experience.”)
wherein control circuitry is configured to detect the collision between the virtual object and the target virtual payment location by: capturing sensor data of the physical space; (Sinha: col 7, lines 55-65, “…the user is able to add or remove furniture from a selection of available items to a physical room so as to see a preview or a demonstration of how the furniture would look in his or her own home.”)
generating a spatial map of the physical space based on the sensor data, and (Sinha: col 5, lines 20-25, “The optical component is substantially transparent, whereby the wearer can see through it to view a real-world environment in which they are located simultaneously with the projected image, thereby providing an augmented reality experience.”)
analyzing the spatial map using a processor of the extended reality device to identify the collision between the virtual object and the target virtual payment location; (Sinha: col 8, lines 40-45, “…as the user closes his hand, the user is determined to be selecting the box at the location corresponding to the AR element shown to the user in the AR experience.”)
It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to have modified Chaikin/Tian/Sukhija to include the teachings of Sinha to allow users to “…make a one-touch payment through the merchant app of the client AR client device, without requiring the user to exit the merchant app.” (Sinha: col 3, lines 15-20).
Regarding claim(s) 2 and 12:
The combination of Chaikin/Tian/Sukhija/Sinha, as shown in the rejection above, discloses the limitations of claims 1 and 11, respectively. Tian further teaches:
wherein the control circuitry configured to detect the collision is further configured to identify a collision between a virtual element and the target virtual payment location. (Tian: pgh 94, “Once the collision is detected between the static scene and the moving virtual object, the motion of the object is stopped to simulate the visual effect of collision avoidance.”)
It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to have modified Chaikin/Sukhija/Sinha to include the teachings of Tian because it is important that the placement is accurate and performed in real time (Tian: pgh 3).
Regarding claim(s) 3 and 13:
The combination of Chaikin/Tian/Sukhija/Sinha, as shown in the rejection above, discloses the limitations of claims 1 and 11, respectively. Chaikin further teaches:
generate, for output, a virtual element associated with the target virtual payment location. (Chaikin: pgh 40, “The method may be provided for generating the virtual terminal to be used inworld in a virtual world.”)
Chaikin/Sukhija/Sinha does not teach, however, Tian teaches:
determine, based on the spatial map of the physical space, a proximity of the extended reality device to the target virtual payment location; (Tian: pgh 35, “For instance, a ToF camera measures the round trip time of emitted light and resolves the depth value (distance) for a point in the real-world scene.”)
determine that the proximity of the extended reality device to the target virtual payment location is within a threshold distance; and (Tian: pgh 121, “Collision can be detected only when the number of collided voxels is larger than a threshold number.”)
It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to have modified Chaikin/Sukhija/Sinha to include the teachings of Tian because it is important that the placement is accurate and performed in real time (Tian: pgh 3).
Regarding claim(s) 5 and 15:
The combination of Chaikin/Tian/Sukhija/Sinha, as shown in the rejection above, discloses the limitations of claims 1 and 11, respectively. Chaikin further teaches:
generate, for output, a virtual element associated with the target virtual payment location. (Chaikin: pgh 40, “The method may be provided for generating the virtual terminal to be used inworld in a virtual world.”)
Chaikin/Tian/Sinha does not teach, however, Sukhija teaches:
wherein the control circuitry is further configured to: identify, in physical space, a location of the extended reality device; (Sukhija: pgh 28, “In an embodiment, the VR device sends 360-degree virtual coordinates of the virtual user to the payment terminal.”)
identify a proximity of the extended reality device to pre-determined coordinates in physical space; (Sukhija: pgh 64, “Upon receiving the virtual coordinates of the virtual user the payment terminal can compute a distance between the virtual coordinates (XYZ1) and virtual coordinates of each of the one or more establishments in the virtual environment.”)
determine that the proximity of the extended reality device to the pre-determined coordinates is within a threshold distance; and (Sukhija: pgh 28, “If location information of the virtual user matches location information of an establishment, the VR device can infer that the virtual user is in the establishment.”)
It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to have modified Chaikin/Tian/Sinha to include the teachings of Sukhija because “Another issue with the existing techniques is that payments are made in a virtual environment and a vendor and/or manufacturer of the VR device may have to initiate the transaction to merchant in the physical environment, hence increasing intermediaries.” (Sukhija: pgh 5).
Regarding claim(s) 8 and 18:
The combination of Chaikin/Tian/Sukhija/Sinha, as shown in the rejection above, discloses the limitations of claims 1 and 11, respectively. Chaikin further teaches:
generate, for output, a virtual element associated with the target virtual payment location, (Chaikin: pgh 40, “The method may be provided for generating the virtual terminal to be used inworld in a virtual world.”)
Chaikin/Tian does not teach, however, Sukhija teaches:
wherein the control circuitry is further configured to: identify that the payment is in progress; (Sukhija: pgh 72, “…the VR device sends the merchant and associated details to a payment terminal upon the physical user initiating payment for the transaction…”)
lock the target virtual payment location associated with the payment; and (Sukhija: pgh 75, “Further, the payment terminal chooses the ‘SHOP3’ having a shortest distance with the virtual coordinates of virtual user.”)
wherein the virtual element indicates that the target virtual payment location is locked. (Sukhija: pgh 75, “The payment terminal extracts the merchant and the associated details of the ‘SHOP3’ from the mapping table…”)
It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to have modified Chaikin/Tian/Sinha to include the teachings of Sukhija because “Another issue with the existing techniques is that payments are made in a virtual environment and a vendor and/or manufacturer of the VR device may have to initiate the transaction to merchant in the physical environment, hence increasing intermediaries.” (Sukhija: pgh 5).
Regarding claim(s) 9 and 19:
The combination of Chaikin/Tian/Sukhija/Sinha, as shown in the rejection above, discloses the limitations of claims 1 and 11, respectively. Sukhija further teaches:
the control circuitry is further configured to: generate, for output, a payment authorization request; (Sukhija: pgh 23, “…sending a transaction message comprising the merchant and associated details to an issuer system via a gateway associated with the merchant for authorization.”)
receive a payment authorization; validate the payment authorization; and (Sukhija: pgh 23, “…the method may include receiving a result of authorization of the transaction message from the issuer system via the gateway.”)
the control circuitry configured to receive confirmation of the payment is further configured to receive confirmation of the payment in response to the payment authorization being validated. (Sukhija: pgh 33, “…the acquirer forwards the authorization response to the merchant system. Finally, the merchant system completes the transaction by displaying an appropriate message on the payment terminal.”)
It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to have modified Chaikin/Tian/Sinha to include the teachings of Sukhija because “Another issue with the existing techniques is that payments are made in a virtual environment and a vendor and/or manufacturer of the VR device may have to initiate the transaction to merchant in the physical environment, hence increasing intermediaries.” (Sukhija: pgh 5).
Regarding claim(s) 33 and 34:
The combination of Chaikin/Tian/Sukhija/Sinha, as shown in the rejection above, discloses the limitations of claims 1 and 11, respectively. Tian further teaches:
determining respective crowding information for each of the plurality of virtual payment locations; and (Tian: pgh 42, “…AR module can detect occlusion and collision to properly render the AR scene.”)
providing an indication that the target virtual payment location corresponds to the respective crowding information indicating that the target virtual payment is least crowded with respect to the plurality of virtual payment locations. (Tian: pgh 47, “Embodiments of the present disclosure involve occlusion and collision detection to support the rendering of correct AR scenes…”)
It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to have modified Chaikin/Sukhija/Sinha to include the teachings of Tian because it is important that the placement is accurate and performed in real time (Tian: pgh 3).
Regarding claim(s) 35 and 36:
The combination of Chaikin/Tian/Sukhija/Sinha, as shown in the rejection above, discloses the limitations of claims 1 and 11, respectively. Chaikin further teaches:
wherein the payment is a first payment and each of the plurality of virtual payment locations except the target virtual payment location is used for providing a second respective payment. (Chaikin: pgh 40, “The method may be provided for generating the virtual terminal to be used inworld in a virtual world.”)
Claims 7 and 17 are rejected under 35 U.S.C. 103 as being unpatentable over Chaikin/Tian/Sukhija/Sinha in view of Boesel (US 2024/0112419).
Regarding claim(s) 7 and 17:
The combination of Chaikin/Tian/Sukhija/Sinha, as shown in the rejection above, discloses the limitations of claims 1 and 11, respectively. Sukhija further teaches:
identify that the location of at least a part of the user limb in virtual space corresponds to at least a part of the target virtual payment location. (Sukhija: pgh 71, “Based on the virtual coordinates…of the virtual user the VR device can compute distance with the virtual coordinates of the one or more establishments in the virtual environment and identify the establishment…”)
It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to have modified Chaikin/Tian/Sinha to include the teachings of Sukhija because “Another issue with the existing techniques is that payments are made in a virtual environment and a vendor and/or manufacturer of the VR device may have to initiate the transaction to merchant in the physical environment, hence increasing intermediaries.” (Sukhija: pgh 5). Chaikin/Tian/Sukhija/Sinha does not teach, however, Boesel teaches:
wherein the control circuitry configured to detect the collision is further configured to: track, using limb tracking, a location of a user limb in the physical space; (Boesel: pgh 33, “…the data obtainer is configured to obtain data (e.g., captured image frames of the physical environment…hand/limb tracking information…) from at least one of the I/O devices…”)
map the location of the user limb to virtual space; and (Boesel: pgh 28, “…the controller and/or the electronic device cause an XR representation of the user to move within the XR environment based on movement information (e.g., body pose data, eye tracking data, hand/limb tracking data, etc.) from the electronic device…”)
It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to have modified Chaikin/Tian/Sukhija/Sinha to include the teachings of Boesel because it is difficult to present extended reality in proper proportions (Boesel: pgh 3).
Conclusion
Pertinent Art
The prior art made of record and not relied upon is considered pertinent to Applicant’s disclosure. Soon-Shiong (US 11,869,160) discloses interference based augmented reality hosting platforms.
THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to JOHN O PRESTON whose telephone number is (571)270-3918. The examiner can normally be reached 12:00 pm - 8:00 pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Michael W Anderson can be reached on 571-270-0508. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/JOHN O PRESTON/Examiner, Art Unit 3693
January 10, 2026
/Mike Anderson/Supervisory Patent Examiner, Art Unit 3693