Prosecution Insights
Last updated: April 19, 2026
Application No. 18/084,685

SYSTEMS AND METHODS FOR CONCEPTUALIZING A VIRTUAL OR LIVE OBJECT

Status: Non-Final Office Action (§103)
Filed: Dec 20, 2022
Examiner: BEARD, CHARLES LLOYD
Art Unit: 2611
Tech Center: 2600 — Communications
Assignee: Adeia Guides Inc.
OA Round: 3 (Non-Final)

Grant probability: 67% (Favorable)
Estimated OA rounds: 3-4
Estimated time to grant: 2y 11m
Grant probability with interview: 99%

Examiner Intelligence

Career allowance rate: 67% (above average; 235 granted / 350 resolved; +5.1% vs. Tech Center average)
Interview lift: strong, +36.1% among resolved cases with an interview
Typical timeline: 2y 11m average prosecution; 37 applications currently pending
Career history: 387 total applications across all art units

Statute-Specific Performance

§101: 5.5% (-34.5% vs TC avg)
§103: 70.2% (+30.2% vs TC avg)
§102: 6.2% (-33.8% vs TC avg)
§112: 15.4% (-24.6% vs TC avg)

Tech Center averages are estimates. Based on career data from 350 resolved cases.
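The "vs TC avg" deltas above can be cross-checked by back-solving the Tech Center baseline each implies. A minimal sketch, under the hypothetical assumption that each delta is simply the examiner's statute-specific rate minus the Tech Center average:

```python
# Back-solving the Tech Center baseline implied by each "vs TC avg" delta
# shown above. Assumption (a hypothetical reading of the dashboard): the
# delta is the examiner's statute-specific rate minus the TC average.
examiner_rate = {"101": 5.5, "103": 70.2, "102": 6.2, "112": 15.4}    # percent
delta_vs_tc = {"101": -34.5, "103": 30.2, "102": -33.8, "112": -24.6}

implied_tc_avg = {statute: round(examiner_rate[statute] - delta_vs_tc[statute], 1)
                  for statute in examiner_rate}
```

Under that reading, every statute's implied baseline works out to 40.0%, so the four rows are at least internally consistent with a single Tech Center estimate.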

Office Action

§103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Continued Examination Under 37 CFR 1.114

A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 11/20/2025 has been entered.

Response to Amendment Received 11/20/2025

Claims 1-7, 12, 13, 15-22, 27, 28, and 30-32 are pending. Claims 1 and 16 have been amended. Claims 8-11, 14, 23-26, and 29 have been cancelled. Claims 31 and 32 have been added. The 35 U.S.C. § 103 rejection of claims 1-7, 12, 13, 15-22, 27, 28, and 30-32 has been fully considered in view of the amendments received on 11/20/2025 and is fully addressed in the prior art rejection below.

Response to Arguments Received 11/20/2025

Regarding independent claims 1 and 16: Applicant's arguments (Remarks, Page 7: ¶ 4 to Page 8: ¶ 1), filed 11/20/2025, with respect to the rejection of claims 1 and 16 under 35 U.S.C. § 103 have been fully considered and are persuasive. Osotio et al. (US PGPUB No. 20190096105 A1) fails to disclose a user biomarker. Moreover, the user gaze tracking taught by Osotio et al. is less limiting than the claimed subject matter of a biomarker (i.e., a biological marker corresponding to a measurement of an organism). Therefore, the rejection has been withdrawn, as necessitated by Applicant's amendments. However, upon further consideration, a new ground of rejection is made in view of Osotio et al., in view of Mathis et al. (US PGPUB No. 20220012789 A1), in view of Hiranandani et al. (US PGPUB No. 20180260843 A1), and further in view of Ouderkirk et al. (US Patent No. 10983591 B1).

Moreover, the limitation "user biomarker" is found to be further limiting based on the manner in which the user is measured. Although a gaze (e.g., user activity) is tracked in relation to a user's eye(s) (Osotio; [¶ 0046-0047 and ¶ 0099]; moreover, gaze metric [¶ 0111]), Osotio et al. fails to explicitly teach a measured biological response/event of the user (e.g., pupil dilation, eye-opening amount) of the kind associated with biomarkers (e.g., a user metric). That is, Osotio et al. fails to explicitly teach measuring the user's eye(s), or the biological feature(s) of the user that are measured/tracked to determine a gaze. Although tracking eye movement is a known method to one of ordinary skill in the art, Osotio et al. fails to teach analysis of a user's biomarker(s).

Applicant's arguments (Remarks, Page 8: ¶ 2), filed 11/20/2025, with respect to the rejection of claims 1 and 16 under 35 U.S.C. § 103 have been fully considered and are persuasive. However, upon further consideration, a new ground of rejection is made in view of the prior art as mentioned above.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1, 2, 4-7, 13, 16-17, 19-22, 28, and 30-32 are rejected under 35 U.S.C. 103 as being unpatentable over Osotio et al., US PGPUB No. 20190096105 A1, hereinafter Osotio, in view of Mathis et al., US PGPUB No. 20220012789 A1, hereinafter Mathis, in view of Hiranandani et al., US PGPUB No. 20180260843 A1, hereinafter Hiranandani, and further in view of Ouderkirk et al., US Patent No. 10983591 B1, hereinafter Ouderkirk.

Regarding claim 16, Osotio discloses a system (Osotio; system [¶ 0146-0148], as illustrated within Fig. 11; additionally, bot architecture [¶ 0051], as illustrated within Fig. 5) comprising: communications circuitry configured to access an electronic device (Osotio; the system [as addressed above] comprises communications circuitry configured to access an electronic device (i.e., hardware, sensor(s), UI, and/or display) [¶ 0146-0148], as illustrated within Fig. 11; moreover, AR/VR headset [¶ 0035]); and control circuitry (Osotio; the system [as addressed above] comprises control circuitry (i.e., processor) [¶ 0146-0148], as illustrated within Fig. 11) configured to:

receive an indication of an initial object being displayed on a display of the electronic device (Osotio; the control circuitry (i.e., processor) [as addressed above] is configured to receive an indication of an initial object [¶ 0067-0068 and ¶ 0080-0081] being displayed on a display of the electronic device [¶ 0035-0036], as illustrated within Fig. 1);

determine a set of characteristics of the initial object (Osotio; the control circuitry (i.e., processor) [as addressed above] is configured to determine a set of characteristics of the initial object [¶ 0079-0081 and ¶ 0085-0086]; wherein, contextual information drives operation (of the bot service) [¶ 0060 and ¶ 0068-0077] for determining one or more characteristics of an object [¶ 0080-0081]; additionally, the properties of the object also correspond to a set of characteristics [¶ 0036]);

for each of a plurality of reference objects, calculate a relevancy score (Osotio; the control circuitry (i.e., processor) [as addressed above] is configured to calculate a relevancy score [¶ 0120-0122 and ¶ 0124] for each of a plurality of reference objects (i.e., objects associated with bots) [¶ 0082, ¶ 0126-0127, and ¶ 0131]; wherein, one or more reference objects are associated with bots [¶ 0020 and ¶ 0079-0081]) based on: a degree to which the set of characteristics of the initial object is exhibited by the reference object (Osotio; the calculated relevancy score [as addressed above] is based on a degree to which the set of characteristics of the initial object is exhibited by the reference object [¶ 0120-0121, ¶ 0124, and ¶ 0127-0129]); and a determined user interaction with the reference object (Osotio; the calculated relevancy score [as addressed above] is based on a determined user interaction with the reference object [¶ 0035-0044 and ¶ 0121]; additionally, rendering fidelity based on interest of a user [¶ 0093-0097 and ¶ 0099], as illustrated within Fig. 7; wherein, interest of a user is monitored [¶ 0045-0047]) based on a user gaze (Osotio; based on a user gaze [¶ 0046-0047 and ¶ 0099]; moreover, gaze metric [¶ 0111]);

select, from the plurality of reference objects, a selected reference object based on the calculated relevancy score for the selected reference object (Osotio; the control circuitry (i.e., processor) [as addressed above] is configured to select, from the plurality of reference objects, a selected reference object based on the calculated relevancy score for the selected reference object [¶ 0124, ¶ 0126-0127, and ¶ 0131]); and

display the selected reference object on the display of the electronic device (Osotio; the control circuitry (i.e., processor) [as addressed above] is configured to implicitly display the selected reference object on the display of the electronic device [¶ 0045, ¶ 0084-0086, and ¶ 0090-0091], as illustrated within Figs. 1-4; such that, the selection of the reference object is associated with determining one or more bots [as addressed above], wherein augmented content is injected/placed into the AR/VR environment [¶ 0054 and ¶ 0063]).

Osotio fails to explicitly disclose displaying content on the display of the electronic device; reference objects that are distinct from the initial object; and a determined user interaction with the reference object based on a user biomarker.

However, Mathis teaches displaying the selected reference object on the display of the electronic device (Mathis; display the selected reference object on the display of the electronic device [¶ 0026-0027, ¶ 0031, and ¶ 0038]). Osotio and Mathis are considered to be analogous art because both pertain to generating and/or managing data in relation to providing media data to a user, wherein one or more computerized units are utilized in order to produce an augmented reality effect. Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to modify Osotio to incorporate displaying the selected reference object on the display of the electronic device (as taught by Mathis), in order to provide improved detection of, and interaction with, products of interest for a user (Mathis; [¶ 0003-0005]).

Osotio as modified by Mathis fails to disclose reference objects that are distinct from the initial object, and a determined user interaction with the reference object based on a user biomarker.
Hiranandani teaches, for each of a plurality of reference objects that are distinct from the initial object (Hiranandani; for each of a plurality of reference objects that are distinct from the initial object [¶ 0057-0060]; wherein, distinctiveness corresponds to the AR analysis platform selecting one or more products to recommend for use with a real object, the AR analysis platform selecting a product based on the tuple-combination scores [¶ 0134-0135 and ¶ 0190-0192]; and, recommending different products [¶ 0136 and ¶ 0155-0156]), calculating a relevancy score (Hiranandani; calculate a relevancy score (i.e., object-label-confidence score, color-compatibility score, and/or tuple-combination score) [¶ 0061 and ¶ 0067-0068]; wherein, the AR analysis platform optionally generates scores that facilitate distinguishing among real-object tuples as they relate to various endorsed products [¶ 0116-0119]; additionally, a relevancy score corresponds to an association value [¶ 0120-0123]; moreover, the color-compatibility score [¶ 0108-0110 and ¶ 0112-0114], tuple-combination score [¶ 0129-0130], and association value [¶ 0120] of a data table [¶ 0131-0134] correlate to one or more products [¶ 0149-0152] for product recommendations [¶ 0156-0158]) based on: a degree to which the set of characteristics of the initial object is exhibited by the reference object (Hiranandani; the calculated relevancy score [as addressed above] is based on a degree/value to which the set of characteristics of the initial object is exhibited by the reference object [¶ 0116-0120]; wherein, a virtual object is based on extracted characteristics from a real object [¶ 0186-0187]; moreover, the degree corresponds to a value set [¶ 0123]); and determined factors with the reference object (Hiranandani; the calculated relevancy score [as addressed above] is based on determined factors with the reference object [¶ 0121-0122]; wherein, the association value uses an association rule [¶ 0120-0123]; additionally, a user interacts with the reference object in the form of a message [¶ 0158-0159 and ¶ 0162-0163] and selection [¶ 0161]); and selecting, from the plurality of reference objects, a selected reference object that is distinct from the initial object based on the calculated relevancy score for the selected reference object (Hiranandani; select a selected reference object that is distinct from the initial object, based on the calculated relevancy score (i.e., object-label-confidence score, color-compatibility score, and/or tuple-combination score) for the selected reference object, from the plurality of reference objects [¶ 0149-0150, ¶ 0153-0154, and ¶ 0156]; wherein, the relevancy score corresponds to one or more scores/values that lead to an end result [as addressed above]).

Osotio in view of Mathis and Hiranandani are considered to be analogous art because they pertain to generating and/or managing data in relation to providing media data to a user, wherein one or more computerized units are utilized in order to produce an augmented reality effect. Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to modify Osotio as modified by Mathis to incorporate, for each of a plurality of reference objects that are distinct from the initial object, calculating a relevancy score based on: a degree to which the set of characteristics of the initial object is exhibited by the reference object; and determined factors with the reference object; and selecting, from the plurality of reference objects, a selected reference object that is distinct from the initial object based on the calculated relevancy score for the selected reference object (as taught by Hiranandani), in order to provide improved placement and/or recommendation of products in a bespoke manner from information obtained other than a digital footprint (Hiranandani; [¶ 0001-0003 and ¶ 0030-0032]).
Osotio in view of Mathis and Hiranandani fails to explicitly disclose a determined user interaction with the reference object based on a user biomarker.

However, Ouderkirk teaches determining user interaction with the reference object based on a user biomarker (Ouderkirk; determining user interaction/gaze with the reference object based on a user biomarker [Col. 12, line 12 to Col. 13, line 13], as illustrated within Fig. 6; wherein, interest in an object is determined by analyzing a user's biomarker(s) [Col. 10, line 40 to Col. 11, line 30], in relation to computer vision/recognition and sensor data [Col. 8, lines 28-67 and Col. 9, lines 24-45]; moreover, the measured biomarker(s) is/are of a user's response to one or more objects/events within an environment [Col. 17, line 55 to Col. 18, line 11 and Col. 18, lines 36-59]).

Osotio in view of Mathis and Hiranandani, and Ouderkirk, are considered to be analogous art because they pertain to generating and/or managing data in relation to providing media data to a user, wherein one or more computerized units are utilized in order to produce information based on a user's environment and/or field of view. Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to modify Osotio as modified by Mathis and Hiranandani to incorporate determining user interaction with the reference object based on a user biomarker (as taught by Ouderkirk), in order to provide an improved user-interface experience based on a user's gaze (Ouderkirk; [Col. 1, lines 49-64 and Col. 2, lines 12-51]).

Regarding claim 17, Osotio in view of Mathis, Hiranandani, and Ouderkirk further discloses the system of claim 16, including user interaction with the reference object using the electronic device that is associated with a user (Osotio; user interaction with the reference object [¶ 0045-0048] using the electronic device that is associated with a user [¶ 0035-0036 and ¶ 0146]).
Mathis further teaches wherein the user interaction is a capturing of an image of the reference object using the electronic device that is associated with a user (Mathis; the user interaction is a capturing of an image of the reference object [¶ 0026-0028] using the electronic device that is associated with a user [¶ 0020, ¶ 0025, and ¶ 0038]). Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to modify Osotio as modified by Mathis, Hiranandani, and Ouderkirk to incorporate the user interaction being a capturing of an image of the reference object using the electronic device that is associated with a user (as taught by Mathis), in order to provide improved detection of, and interaction with, products of interest for a user (Mathis; [¶ 0003-0005]).

Regarding claim 19, Osotio in view of Mathis, Hiranandani, and Ouderkirk further discloses the system of claim 16, wherein calculating the relevancy score includes the control circuitry (Osotio; calculating the relevancy score includes the control circuitry [as addressed within the parent claim(s)]).

Hiranandani further teaches wherein calculating the relevancy score includes the control circuitry configured to reward when a depiction of a user of the electronic device is included in a same image as the reference object (Hiranandani; calculating the relevancy score [¶ 0111-0114] includes the control circuitry [¶ 0177-0180, ¶ 0183, and ¶ 0185] configured to (subjectively) reward (i.e., score) [¶ 0116 and ¶ 0123-0125] when a depiction of a user of the electronic device is included in a same image as the reference object [¶ 0044-0047 and ¶ 0049]; wherein, a real object may correspond to a person/user [¶ 0041]; moreover, AR client environment [¶ 0164, ¶ 0166, and ¶ 0222-0223] of a computer device [¶ 0168]).
Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to modify Osotio as modified by Mathis, Hiranandani, and Ouderkirk to incorporate wherein calculating the relevancy score includes the control circuitry configured to reward when a depiction of a user of the electronic device is included in a same image as the reference object (as taught by Hiranandani), in order to provide relevant targeted content that is personalized based on information about a user and/or the user's environment (Hiranandani; [¶ 0001-0004 and ¶ 0006]).

Regarding claim 20, Osotio in view of Mathis, Hiranandani, and Ouderkirk further discloses the system of claim 16, wherein the initial object is a virtual object (Osotio; the initial object [as addressed within the parent claim(s)] is a virtual object (i.e., AR/VR content) [¶ 0019 and ¶ 0035-0036]).

Regarding claim 21, Osotio in view of Mathis, Hiranandani, and Ouderkirk further discloses the system of claim 16, wherein the initial object is displayed by the control circuitry via a live image viewed via a camera of the electronic device (Osotio; the initial object [as addressed within the parent claim(s)] is displayed by the control circuitry (i.e., processor) via an implicit live image viewed via a camera of the electronic device (given the AR/VR headset, camera, and gaze monitoring and response) [¶ 0035, ¶ 0045-0047, ¶ 0063, ¶ 0146, and ¶ 0148], as illustrated within Figs. 1-4; moreover, if a user focuses their gaze on the augmented content, the user has expressed increased interest (from a state of not focusing their gaze on the augmented content) [¶ 0099]).

Mathis further teaches a live image viewed via a camera of the electronic device (Mathis; a live image viewed via a camera of the electronic device [¶ 0017, ¶ 0021, and ¶ 0042]).
Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to modify Osotio as modified by Mathis, Hiranandani, and Ouderkirk to incorporate a live image viewed via a camera of the electronic device (as taught by Mathis), in order to provide improved detection of, and interaction with, products of interest for a user (Mathis; [¶ 0003-0005]).

Regarding claim 22, Osotio in view of Mathis, Hiranandani, and Ouderkirk further discloses the system of claim 16, wherein the user interaction includes any one of 1) a gaze of a user directed at the reference object (Osotio; the user interaction [as addressed within the parent claim(s)] includes a gaze of a user directed at the reference object [¶ 0045-0047 and ¶ 0099]), 2) selection of the reference object on a website, or 3) consumption of the reference object on a social media feed associated with a user.

Regarding claim 28, Osotio in view of Mathis, Hiranandani, and Ouderkirk further discloses the system of claim 16, wherein the user interaction is determined, by the control circuitry, based on the reference object being stored in a photo-gallery of the electronic device associated with a user (Osotio; the user interaction is determined based on the reference object being stored in a bot architecture of the electronic device associated with a user [¶ 0054, ¶ 0060, and ¶ 0068-0077] by the control circuitry [as addressed within the parent claim(s)]; moreover, the bot architecture provides data and/or functionality [¶ 0051-0053]).

Mathis further teaches the reference object being stored in a photo-gallery of the electronic device associated with a user (Mathis; the reference object (i.e., product) being stored in a photo-gallery (i.e., product catalog) of the electronic device associated with a user [¶ 0004 and ¶ 0037]; moreover, data from object detection is stored for later use as the input to the visual search process [¶ 0017 and ¶ 0028]).
Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to modify Osotio as modified by Mathis, Hiranandani, and Ouderkirk to incorporate the reference object being stored in a photo-gallery of the electronic device associated with a user (as taught by Mathis), in order to provide improved detection of, and interaction with, products of interest for a user (Mathis; [¶ 0003-0005]).

Regarding claim 30, Osotio in view of Mathis, Hiranandani, and Ouderkirk further discloses the system of claim 16, wherein calculating the relevancy score comprises the control circuitry (Osotio; calculating the relevancy score [as addressed within the parent claim(s)] comprises the control circuitry [as addressed within the parent claim(s)]) configured to: determine a first weight for a similarity score representing the degree to which the set of characteristics of the initial object are exhibited by the reference object (Osotio; the control circuitry [as addressed above] is configured to determine a first weight [¶ 0127-0131] for a similarity score representing the degree to which the set of characteristics of the initial object are exhibited by the reference object [¶ 0121-0122]); determine a second weight for a user interaction score determined based on the determined user interaction with the reference object (Osotio; the control circuitry [as addressed above] is configured to determine a second weight [¶ 0099 and ¶ 0104-0111] for a user interaction score determined based on the determined user interaction with the reference object [¶ 0045-0048 and ¶ 0084-0085], as illustrated within Fig. 7); and calculate the relevancy score based on the first weight, the similarity score, the second weight, and the user interaction score (Osotio; the control circuitry [as addressed above] is configured to calculate the relevancy score [¶ 0120-0121] based on the first weight, the similarity score, the second weight, and the user interaction score [¶ 0078-0085 and ¶ 0091-0092], as illustrated within Fig. 6; moreover, as illustrated within Figs. 7-10; additionally, calculated scores [as addressed above]).

Regarding claims 1, 2, 4, 5, 6, and 7: these claims are substantially similar to claims 16, 17, 19, 20, 21, and 22, respectively; therefore, refer to the rejections of claims 16, 17, 19, 20, 21, and 22 regarding the rejections of claims 1, 2, 4, 5, 6, and 7.
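The two-weight relevancy computation mapped for claim 30 above (a first weight applied to a similarity score and a second weight applied to a user-interaction score) reduces to a weighted sum followed by selecting the best-scoring reference object. A minimal sketch, where the weight values, candidate objects, and scores are purely illustrative and not drawn from any cited reference:

```python
# Illustrative sketch of the claim-30 style relevancy calculation:
# relevancy = w1 * similarity + w2 * interaction, then pick the best
# reference object. All numbers here are hypothetical.
def relevancy(similarity, interaction, w_similarity=0.6, w_interaction=0.4):
    """Weighted sum of the similarity and user-interaction scores."""
    return w_similarity * similarity + w_interaction * interaction

def select_reference(candidates):
    """Return the reference object with the highest relevancy score."""
    return max(candidates, key=lambda name: relevancy(*candidates[name]))

candidates = {
    "lamp":  (0.90, 0.20),  # (similarity score, user-interaction score)
    "chair": (0.50, 0.95),
    "rug":   (0.30, 0.10),
}
best = select_reference(candidates)  # "chair": 0.6*0.50 + 0.4*0.95 = 0.68
```

Note how the weights let user interaction outvote raw similarity: "lamp" has the higher similarity score, but "chair" wins on the weighted sum.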
Regarding claims 13 and 15: these claims are substantially similar to claims 28 and 30, respectively; therefore, refer to the rejections of claims 28 and 30 regarding the rejections of claims 13 and 15.

Regarding claim 31, Osotio in view of Mathis, Hiranandani, and Ouderkirk further discloses the method of claim 1, wherein the user biomarker includes at least one of: a change in heartrate, pupil dilation, or eye opening size (Ouderkirk; the user biomarker [as addressed within the parent claim(s)] includes a change in heartrate, pupil dilation, or eye opening size (i.e., subtle eye movements and/or blink rate) [Col. 17, line 55 to Col. 18, line 11 and Col. 18, lines 36-59]; moreover, indicating a level of interest using a user's blink rate, pupil dilation, or eye movements [Col. 12, line 64 to Col. 13, line 13]). Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to modify Osotio as modified by Mathis and Hiranandani to incorporate the user biomarker including at least one of a change in heartrate, pupil dilation, or eye opening size (as taught by Ouderkirk), in order to provide an improved user-interface experience based on a user's gaze (Ouderkirk; [Col. 1, lines 49-64 and Col. 2, lines 12-51]).

Regarding claim 32: claim 32 is substantially similar to claim 31; therefore, refer to the rejection of claim 31 regarding the rejection of claim 32.

Claims 3 and 18 are rejected under 35 U.S.C. 103 as being unpatentable over Osotio in view of Mathis, Hiranandani, and Ouderkirk as applied to claims 2 and 17 above, and further in view of Ayush et al., US PGPUB No. 20190378204 A1, hereinafter Ayush.

Regarding claim 18, Osotio in view of Mathis, Hiranandani, and Ouderkirk further discloses the system of claim 17, wherein calculating the relevancy score includes the control circuitry configured to reward the selected reference object in response to determining that the image of the reference object was captured by the electronic device associated with the user (Osotio; calculating the relevancy score includes the control circuitry (i.e., processor) configured to reward the selected reference object [as addressed within the parent claim(s)] in response to determining that the image of the reference object was searchable by the electronic device associated with the user [¶ 0054, ¶ 0060, and ¶ 0068-0077] by the control circuitry [as addressed within the parent claim(s)]; moreover, user search [¶ 0035, ¶ 0037-0040, and ¶ 0080-0081]; additionally, the bot architecture provides data and/or functionality [¶ 0051-0053]).

Mathis further teaches calculating the relevancy score (Mathis; calculating the relevancy score (i.e., interest level) [¶ 0026-0028]), including the control circuitry configured to select the reference object in response to determining that the image of the reference object was captured by the electronic device associated with the user (Mathis; the control circuitry [¶ 0023] is configured to select the reference object in response to determining that the image of the reference object was captured [¶ 0030-0031, ¶ 0036-0037, and ¶ 0042] by the electronic device associated with the user [¶ 0019-0021]).
Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to modify Osotio as modified by Mathis, Hiranandani, and Ouderkirk to incorporate calculating the relevancy score including the control circuitry configured to reward the selected reference object in response to determining that the image of the reference object was captured by the electronic device associated with the user (as taught by Mathis), in order to provide improved detection of, and interaction with, products of interest for a user (Mathis; [¶ 0003-0005]).

Ouderkirk further teaches calculating the relevancy score (Ouderkirk; calculating the relevancy score (i.e., interest ranking) [Col. 3, lines 12-32]; moreover, calculating a ranking [Col. 8, lines 28-48 and Col. 10, line 60 to Col. 11, line 54]). Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to modify Osotio as modified by Mathis, Hiranandani, and Ouderkirk to incorporate calculating the relevancy score (as taught by Ouderkirk), in order to provide an improved user-interface experience based on a user's gaze (Ouderkirk; [Col. 1, lines 49-64 and Col. 2, lines 12-51]).

Osotio as modified by Mathis, Hiranandani, and Ouderkirk fails to explicitly disclose rewarding the selected reference object. However, Ayush teaches the control circuitry configured to reward the selected reference object in response to determining that the image of the reference object was captured by the electronic device associated with the user (Ayush; the control circuitry [¶ 0128 and ¶ 0165] is configured to (subjectively) reward (i.e., score) the selected reference object in response to determining that the image of the reference object was captured by the electronic device associated with the user [¶ 0131-0133, ¶ 0142-0143, and ¶ 0150-0151], as illustrated within Fig. 11 and Fig. 12; moreover, similarity score [¶ 0095-0099 and ¶ 0102-0104]; additionally, the similarity score [as addressed] correlates with overall scores for recommended products [¶ 0117-0120] and related training weights [¶ 0122-0123], as illustrated within Fig. 8).

Osotio in view of Mathis, Hiranandani, and Ouderkirk, and Ayush, are considered to be analogous art because they pertain to generating and/or managing data in relation to providing media data to a user, wherein one or more computerized units are utilized in order to produce an augmented reality effect. Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to modify Osotio as modified by Mathis, Hiranandani, and Ouderkirk to incorporate the control circuitry configured to reward the selected reference object in response to determining that the image of the reference object was captured by the electronic device associated with the user (as taught by Ayush), in order to provide more accurately targeted content that is personalized based on information about a user and/or the user's environment in a less resource-intensive manner (Ayush; [¶ 0001-0004]).

Regarding claim 3: claim 3 is substantially similar to claim 18; therefore, refer to the rejection of claim 18 regarding the rejection of claim 3.

Claims 12 and 27 are rejected under 35 U.S.C. 103 as being unpatentable over Osotio in view of Mathis, Hiranandani, and Ouderkirk as applied to claims 1 and 16 above, and further in view of Mayerle et al., US Patent No. 9449343 B2, hereinafter Mayerle.
Regarding claim 27, Osotio in view of Mathis, Hiranandani, and Ouderkirk further discloses the system of claim 16, wherein the user interaction is an interaction with the reference object by a contact of a user of the electronic device (Osotio; the user interaction is an interaction with the reference object and a user of the electronic device [¶ 0035-0036 and ¶ 0044-0047]).

Osotio as modified by Mathis, Hiranandani, and Ouderkirk fails to disclose a contact of a user. However, Mayerle teaches the user interaction is an interaction with the reference object by a contact of a user of the electronic device (Mayerle; the user interaction is an interaction with the reference object by a contact (i.e. friends or social network) of a user [Col. 14, lines 43-67, Col. 20, line 42 to Col. 21, line 8, and Col. 22, lines 1-15] of the electronic device [Col. 5, lines 12-50]).

Osotio in view of Mathis, Hiranandani, and Ouderkirk and Mayerle are considered to be analogous art because they pertain to generating and/or managing data in relation to providing media data to a user, wherein one or more computerized units are utilized in order to produce an augmented reality effect. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Osotio as modified by Mathis, Hiranandani, and Ouderkirk, to incorporate the user interaction is an interaction with the reference object by a contact of a user of the electronic device (as taught by Mayerle), in order to provide an enhanced shopping environment that better informs a user (Mayerle; [Col. 2, line 43 to Col. 3, line 13 and Col. 5, lines 3-11]).

Regarding claim 12, the rejection of claim 12 is addressed within the rejection of claim 27, due to the similarities claim 12 and claim 27 share; therefore, refer to the rejection of claim 27 regarding the rejection of claim 12.
Conclusion

Prior art to be further considered, regarding the amended subject matter: Sztuk et al. (US PGPUB No. 20200209624 A1), see ¶ 0035; Seo et al. (US PGPUB No. 20200166996 A1), see ¶ 0029 & ¶ 0091; and Asukai et al. (US PGPUB No. 20080129839 A1), see ¶ 0137 & ¶ 0213. The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Refer to PTO-892, Notice of References Cited, for a listing of analogous art.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to Charles Lloyd Beard, whose telephone number is (571) 272-5735. The examiner can normally be reached Monday - Friday, 8:00 AM - 5:00 PM EST, alternate Fridays. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Tammy Goddard, can be reached at (571) 272-7773. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/CHARLES L BEARD/
Primary Examiner, Art Unit 2611

Prosecution Timeline

Dec 20, 2022
Application Filed
Mar 31, 2025
Non-Final Rejection — §103
Jun 24, 2025
Response Filed
Aug 18, 2025
Final Rejection — §103
Nov 20, 2025
Request for Continued Examination
Dec 01, 2025
Response after Non-Final Action
Dec 09, 2025
Non-Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12579729
VOLUMETRIC VIDEO SUPPORTING LIGHT EFFECTS
2y 5m to grant Granted Mar 17, 2026
Patent 12548225
AUDIO OR VISUAL INPUT INTERACTING WITH VIDEO CREATION
2y 5m to grant Granted Feb 10, 2026
Patent 12519924
MULTI-PERSPECTIVE AUGMENTED REALITY EXPERIENCE
2y 5m to grant Granted Jan 06, 2026
Patent 12511801
GENERATING VIDEO STREAMS TO DEPICT BOT PERFORMANCE DURING AN AUTOMATION RUN
2y 5m to grant Granted Dec 30, 2025
Patent 12513279
STEREOSCOPIC VIDEO DISPLAY DEVICE, STEREOSCOPIC VIDEO DISPLAY METHOD, AND COMPUTER-READABLE STORAGE MEDIUM
2y 5m to grant Granted Dec 30, 2025
Study what changed to get past this examiner. Based on 5 most recent grants.

Prosecution Projections

3-4
Expected OA Rounds
67%
Grant Probability
99%
With Interview (+36.1%)
2y 11m
Median Time to Grant
High
PTA Risk
Based on 350 resolved cases by this examiner. Grant probability derived from career allow rate.
