Prosecution Insights
Last updated: April 19, 2026
Application No. 18/111,218

Extended Reality Methods and Systems for Handling Estate Dispositions

Non-Final OA (§101, §103)
Filed: Feb 17, 2023
Examiner: WASAFF, JOHN S.
Art Unit: 3629
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: State Farm Mutual Automobile Insurance Company
OA Round: 3 (Non-Final)
Grant Probability: 33% (At Risk)
Projected OA Rounds: 3-4
Projected Time to Grant: 4y 1m
Grant Probability With Interview: 77%

Examiner Intelligence

Grants only 33% of cases.
Career Allow Rate: 33% (124 granted / 373 resolved; -18.8% vs TC avg)
Interview Lift: +44.2% (resolved cases with interview vs. without)
Typical Timeline: 4y 1m avg prosecution; 37 currently pending
Career History: 410 total applications across all art units

Statute-Specific Performance

§101: 25.4% (-14.6% vs TC avg)
§103: 39.3% (-0.7% vs TC avg)
§102: 11.1% (-28.9% vs TC avg)
§112: 20.4% (-19.6% vs TC avg)

Tech Center averages are estimates. Based on career data from 373 resolved cases.
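The headline figures above can be reproduced directly from the raw counts. A minimal sketch, assuming the "vs TC avg" deltas are simple percentage-point differences against the Tech Center average (the 52.0 TC average and the function names are illustrative assumptions, not values from the report):

```python
def allow_rate(granted: int, resolved: int) -> float:
    """Career allow rate as a percentage of resolved applications."""
    return 100.0 * granted / resolved

def delta_vs_tc(examiner_rate: float, tc_avg: float) -> float:
    """Signed percentage-point gap between the examiner and the TC average."""
    return examiner_rate - tc_avg

# Figures from the card above: 124 allowances out of 373 resolved cases.
rate = allow_rate(124, 373)
print(f"{rate:.1f}%")  # about 33%, matching the career allow rate shown

# A -18.8 gap is consistent with a TC average near 52%.
print(f"{delta_vs_tc(rate, 52.0):+.1f} pts vs TC avg")
```

The same subtraction explains the per-statute rows: each statute-specific allow rate minus the corresponding Tech Center estimate yields the signed delta shown.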

Office Action

DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. Claims 1-2, 4, 6, 8-11, 13, 15, and 17-18 are pending.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows: Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 1-2, 4, 6, 8-11, 13, 15, and 17-18 are rejected under 35 U.S.C. 101 because the claimed invention is directed to a judicial exception without significantly more.

Step 1 (The Statutory Categories): Is the claim to a process, machine, manufacture or composition of matter? MPEP 2106.03. Per Step 1, claim 1 is to a method (i.e., a process), claim 10 to a system (i.e., a machine), and claim 18 to a non-transitory computer-readable medium (i.e., a manufacture or machine). Thus, the claims are directed to statutory categories of invention. However, the claims are rejected under 35 U.S.C. 101 because they are directed to an abstract idea, a judicial exception, without reciting additional elements that integrate the judicial exception into a practical application. The analysis proceeds to Step 2A Prong One.

Step 2A Prong One: Does the claim recite an abstract idea, law of nature, or natural phenomenon? MPEP 2106.04.
The abstract idea of claims 1, 10, and 18 is (claim 1 being representative): obtaining one or more XR preferences for a first person; obtaining a death certificate associated with a deceased person for which the first person is a legal executor or beneficiary, wherein obtaining the death certificate includes receiving data representing the death certificate after the first person has captured the death certificate; authenticating the first person and the death certificate; identifying one or more estate assets associated with the deceased person; determining one or more possible disposition options for the one or more estate assets; [indicating] the one or more possible disposition options; receiving a selection by the first person of a disposition option for an estate asset of the one or more estate assets; updating estate data to include a disposition record for the selected disposition option; and causing the updated estate data to be stored.

The abstract idea above describes managing estate data, which is a process that, under its broadest reasonable interpretation, covers managing personal behavior relationships, interactions between people. This is further supported by [0002]-[0003] of applicant's specification as filed. If a claim limitation, under its broadest reasonable interpretation, covers managing personal behavior relationships, interactions between people, including social activities, teaching, and/or following rules or instructions, then it falls within the Certain Methods of Organizing Human Activity – Managing Personal Behavior Relationships, Interactions Between People grouping of abstract ideas. Accordingly, the claim recites an abstract idea.

Step 2A Prong Two: Does the claim recite additional elements that integrate the judicial exception into a practical application? MPEP 2106.04.
Claims 1, 10, and 18 recite the following additional elements (claim 1 being representative, with slight differences in claim language): generating, using one or more processors, a collaborative XR environment for a virtual meeting with a second person in accordance with the one or more XR preferences; from the first person via another XR environment; digital; using an XR device associated with the first person; using one or more processors; providing, in the collaborative XR environment using the XR device associated with the first person and another XR device associated with the second person, one or more user interfaces; such that at least the first person can use virtual gestures to select disposition options via elements of the one or more user interfaces; via a user interface of the one or more user interfaces; on a distributed ledger.

Claim 1 also recites the following additional elements: computer-implemented. Claim 10 also recites the following additional elements: a communication interface; one or more processors. Claim 18 also recites the following additional elements: non-transitory computer-readable storage medium storing instructions; one or more processors.

The additional elements above are merely instructions to apply the abstract idea to a computer, per MPEP 2106.05(f). Applicant has only described generic computing elements in their specification, as seen in [00143]-[00150] of applicant's specification as filed, for example. The combination of these additional elements is no more than mere instructions to apply the exception using generic computing components, as described in MPEP 2106.05(f). Accordingly, whether viewed alone or in combination, these additional elements do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea.
Therefore, per Step 2A Prong Two, the additional elements, alone and in combination, do not integrate the judicial exception into a practical application. The claim is directed to an abstract idea.

Step 2B (The Inventive Concept): Does the claim recite additional elements that amount to significantly more than the judicial exception? MPEP 2106.05. Step 2B involves evaluating the additional elements to determine whether they amount to significantly more than the judicial exception itself. The examination process involves carrying over the identification of the additional element(s) in the claim from Step 2A Prong Two and carrying over conclusions from Step 2A Prong Two pertaining to MPEP 2106.05(f), (h). The additional elements and their analysis are therefore carried over: the additional elements are no more than mere instructions to apply the exception using generic computing components, as described in MPEP 2106.05(f). When the claim elements above are considered, alone and in combination, they do not amount to significantly more. Therefore, per Step 2B, the additional elements, alone and in combination, are not significantly more. The claims are not patent eligible.

Further, the analysis takes into consideration all dependent claims as well: Claims 2, 4, 11, and 13 further narrow the abstract idea above. This does not integrate the abstract idea into a practical application and is not significantly more. Claims 6, 8-9, 15, and 17 further narrow the abstract idea above and/or recite further additional elements (claims 6 and 15: using one or more XR devices; claims 8 and 17: providing a virtual meeting of avatars of the person and another person via respective XR devices; claim 9: wherein the XR device includes at least one of (i) an augmented reality (AR), mixed reality (MR), or virtual reality (VR) headset, or (ii) AR, MR, or VR smart glasses).
Similar to above, these additional elements are simply being used for the tasks of the abstract idea, as described in MPEP 2106.05(f). Whether viewed alone or in combination, these further additional elements do not integrate the abstract idea into a practical application and are not significantly more. Accordingly, claims 1-2, 4, 6, 8-11, 13, 15, and 17-18 are rejected under 35 USC § 101 as being directed to non-statutory subject matter.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claims 1-2, 4, 6, 8-11, 13, 15, and 17-18 are rejected under 35 U.S.C. 103 as being unpatentable over Bryant (US 20230177777) in view of Racanelli (US 20100063908), Haque (US 20180205546), and Fields (US 10373387).
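The §101 analysis above tracks the MPEP 2106 eligibility flowchart (Step 1, Step 2A Prongs One and Two, Step 2B). As a reading aid only, the decision structure can be sketched as a small function; the inputs are legal conclusions reached by the examiner, not computable values, and the function name is illustrative:

```python
def eligible_under_101(statutory_category: bool,
                       recites_judicial_exception: bool,
                       integrates_practical_application: bool,
                       significantly_more: bool) -> bool:
    """Sketch of the MPEP 2106 (Alice/Mayo) eligibility flow."""
    if not statutory_category:            # Step 1: process, machine, manufacture, or composition?
        return False
    if not recites_judicial_exception:    # Step 2A Prong One: abstract idea recited?
        return True
    if integrates_practical_application:  # Step 2A Prong Two: practical application?
        return True
    return significantly_more             # Step 2B: inventive concept?

# The examiner's conclusions for claims 1, 10, and 18 in this action: statutory
# categories (yes), abstract idea (yes), practical application (no), significantly
# more (no) -- so the claims come out ineligible.
print(eligible_under_101(True, True, False, False))
```

Applicant's "technological solution to a technological problem" argument in the remarks below targets the third input: if Prong Two had gone the other way, the analysis would never reach Step 2B.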
Claims 1, 10, and 18

Bryant discloses:

[A computer-implemented method via extended reality (XR) environments {[0001], [0004], [0031]}, the method comprising:] [A system via extended reality (XR) environments {[0001], [0004], [0031]}, comprising: a communication interface {[0070], [0071]}; and one or more processors {[0070], [0071]} configured to:] [A non-transitory computer-readable storage medium storing instructions via extended reality (XR) environments {[0004], [0031]} that, when executed by one or more processors {[0071]}, cause a system to:]

obtaining one or more XR preferences for a first person {[0045] In this way, the same AR system 120 may be used to load multiple user configurations remotely via the augmented display system 300 such that it can be used by multiple different users (e.g., multiple users may share an AR system 120 and alternate using the AR system 120 in a given time period, but may load their specific preferences and workstation attributes via user configurations stored on the augmented display system 300 or entity system 200).};

generating, using one or more processors, a collaborative XR environment for a virtual meeting with a second person in accordance with the one or more XR preferences {[0038] In other embodiments, the invention may be utilized to replace a user's typical office surroundings rather than emulate them. In some embodiments, this may be dynamically altered according to user preferences, user data history, or situational demands. For instance, a user may prefer a beach setting, or a peaceful wooded porch setting, rather than their actual office space on a given day, depending on their mood.
[0070] In some embodiments, the AR system 120 is considered to be a specialized subset of user device 130, and as such, may contain the same or similar components as described with regard to user device 130, and is used to route information from one or more user devices 130 to the augmented display system 130.};

obtaining, from the first person via another XR environment, [information], wherein obtaining the [information] includes receiving digital data representing the [information] after the first person has captured the [information] using an XR device associated with the first person {[0058] The augmented display system 300 may also contain a machine learning engine 366 and machine learning dataset(s) 368. The machine learning engine 366 may store instructions and/or data that may cause or enable the augmented display system 300 to receive, store, and/or analyze data received by the managing entity system 200, user's device 130, or AR system 120. The machine learning engine 366 and machine learning dataset 368 may store instructions and/or data that cause or enable the augmented display system 300 to determine patterns and correlations within received user data. In some embodiments, the machine learning dataset(s) 368 may contain data relating to user activity or device information, which may be stored in a user account managed by the managing entity system 200. [0070] Furthermore, it should be known that multiple user device(s) 130 may be owned by or accessed by the user 110 within the system environment 100 of FIG. 1, and these separate user device(s) 130 may be in network communication with each other and the other systems and devices of the system environment 100, such as augmented display system 300, managing entity system 200, and AR system 120.
For example, a first user device 130 may comprise a mobile phone of the user 110 that includes an interface for working in concert with a second user device 130 that comprises a personal computer of the user 110 or an AR system 120 of the user 110. For instance, in some embodiments, a first user device 130 may be used for biometric authentication of a specific user, a second user device 130 may act as a desktop or laptop workstation of the specific user, and a AR system 120 may be enabled to augment the details transmitted to be displayed via the first or the second user device 130. In some embodiments, the AR system 120 may be configured to display virtual representations of one or more devices. For example, a user device 130 such as a mobile phone may reside in a user's pocket, and may be displayed via the AR system 120 as a desk phone situated on the user's workstation. In such embodiments, the AR system 120 may interface with the user device 130 via wireless communication, such as a local area network, Bluetooth connection, or the like, in order to receive data from the user device 130 and display status information, incoming call information, messages, or the like, in a visual manner in the user's field of view. As such, any or all of the described components herein with regard to FIG. 4 may exist in the first user device 130, the second user device 130, and so on. In some embodiments, the AR system 120 is considered to be a specialized subset of user device 130, and as such, may contain the same or similar components as described with regard to user device 130, and is used to route information from one or more user devices 130 to the augmented display system 130.}. 
Bryant, which describes the use of AR technology in a financial planning context (e.g., retirement), doesn't explicitly disclose the following. However, Racanelli, in a similar field of endeavor directed to estate financial planning, teaches:

for secure transactions to dispose estate assets {[0030] In yet another embodiment, alternative or additional estate planning strategies may be identified and tested in the model 133. For example, based on the estimated component tax liability of a particular conveyance 131, the system may generate possible alternative conveyances (for example, bequeathing an asset to a different beneficiary or establishing a living trust to hold particular assets).};

identifying, using one or more processors, one or more estate assets associated with the deceased person [for which the person is a legal executor or beneficiary] {[0028] FIG. 1 illustrates a schematic diagram according to a disclosed embodiment. Generally, various disclosed embodiments receive balance sheet 103 and estate document inputs 105 corresponding to an estate planning client, apply conveyance 109, taxation 111 and other 107 logic to the client's inputs to generate one or more customizable disposition flowcharts 121 visually depicting the flow of estate components to beneficiaries and/or an estimate of the estate's future liquidity status 123. Taxation logic includes information associating what assets are taxed, the extent to which these assets are taxed, under what conditions the tax is applied, and how the tax is applied. See also [0030].};

determining, using one or more processors, one or more possible disposition options for the one or more estate assets {[0064] FIG. 13 illustrates an exemplary output feature of the disclosed model related to generating alternative estate planning strategies. In one embodiment, the model generates a base case for the estate disposition tax and liquidity estimates S1301.
Then, by varying one or more estate planning inputs, alternative gifting and trust strategies are input to generate alternative estate disposition and liquidity estimates S1303.}.

It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to modify Bryant to include the features of Racanelli. Given that Bryant is directed to the use of AR technology in a financial planning context, one of ordinary skill in the art would have been motivated to include the features of Racanelli, in order to facilitate reconciling and combining the tax implications of a complex estate plan with a clear visual depiction of the estate's disposition at a future time {[0005] of Racanelli}.

The combination of Bryant and Racanelli doesn't explicitly disclose the following. However, Haque, in a similar field of endeavor directed to inheritance related matters, teaches:

a death certificate associated with a deceased person for which the first person is a legal executor or beneficiary {[0145] 19. When the customer/testator has died, then the unencrypted database will store a data element indicating that the ewill is active. This may include public documents, for example, a data record is updated to include a death indication and a reference to a stored file containing the scanned death certificate or other document, for example, a judge's order that a missing person is dead or an order that the person is incapacitated.};

authenticating, using one or more processors, the first person and the death certificate {[0089] Note that executors are also specified in the digital wills and the host server 300 executes a flow which validates that the appointed executors and witnesses agree to performing their roles and are authenticated via two factor authentication, 2FA biometrics, or other known or new security means. The flows for notifying and authorizing and appointed executor of a will are further illustrated in FIG. 10A-10G.
[0116] The system verifies the prior execution of the will by checking that the time stamp on the will entry in the ledger is the latest one. In this embodiment, the system obtains the time stamp from the latest change stored in the will data record, and then verifies that the data record has not been tampered or revised by using the encryption token stored in the ledger document. The executor fetches the unencrypted electronic will document using their private key, and uses the ledger to confirm that the document is genuine. In order to protect the integrity of the data, the Executor only has read only permission.};

receiving, via a user interface of the one or more user interfaces, a selection by the first person of a disposition option for an estate asset of the one or more estate assets {[0087] In one embodiment, the digitally created will (e.g., living will) can include free text space so the user may add assets manually outside of the host server 300 to include cataloged and non-cataloged assets. Engine 386 allows users to select beneficiaries for subcategories within each category. And to split assets across beneficiaries. In one embodiment, user input and/or third party (e.g., lawyer, advisor) input of the will can be facilitated.};

updating estate data to include a disposition record for the selected disposition option {[0113] The electronic will data record is in essence a set of rules because each asset and its disposition can be considered a rule, conditioned on a logical event, for example, the demise of the testator, and the age of the beneficiary, for example. The rules results are determined and its result may also be the re-allocation of an asset token from the will to the beneficiaries' will data record, an update to the ledger with that change.
[0115] If the will specified beneficiaries that are also users of the system, the system uses the disposition rules encoded in the will to update the asset lists of the beneficiaries with the assets designated in the will that has been triggered.};

causing the updated estate data to be stored on a distributed ledger {[0113]}.

It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to modify the combination of Bryant and Racanelli to include the features of Haque. Given that Bryant is directed to the use of AR technology in a financial planning context, one of ordinary skill in the art would have been motivated to include the features of Haque, in order to intelligently provide predictive actionable data for use in managing, leveraging and/or protecting assets in inheritance-related, i.e., financial planning, matters {[0003] of Haque}.

The combination of Bryant, Racanelli, and Haque doesn't explicitly disclose the following. However, Fields, in a similar field of endeavor directed to scene visualizations shared between users, teaches (note that Racanelli and/or Haque teach disposition options):

providing, in the collaborative XR environment using the XR device associated with the first person and another XR device associated with the second person, [and via the communication interface,] one or more user interfaces that indicate the one or more possible options such that at least the first person can use virtual gestures to select options via elements of the one or more user interfaces {Col. 9, line 60 to col. 10, line 15: In various embodiments, the input controls of the VR devices, such as first VR device 130 and second VR device 154, allow a user to interact with the VR visualization, where the user, wearing a VR device, such as first VR device 130 or second VR device 154, can provide input to analyze, review, augment, annotate, or otherwise interact with the VR visualization as described herein.
In some embodiments, a user may use the input controls to select from a menu or list displayed within the VR visualization. For example, the displayed menu or list may include options to navigate or highlight certain views or features of the VR visualization. In other embodiments, graphics or items may be interactive or selectable with the VR visualization. Still further, in other embodiments, the user may provide textual, graphical, video or other input to the VR visualization in order to augment, or annotate the VR visualization. In some embodiments, augmentation or annotation of the VR visualization will cause the same augmentation or annotations to appear in the immersive multimedia image(s), upon which the VR visualization is based, and/or vice versa.}.

It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to modify the combination of Bryant, Racanelli, and Haque to include the features of Fields. Given that Bryant is directed to the use of AR technology, one of ordinary skill in the art would have been motivated to include the features of Fields, in order to facilitate the visualization of immersive multimedia image(s) at remote locations using a virtual reality (VR) device {Col. 3, lines 30-35 of Fields}.

Claims 2 and 11

Racanelli further teaches: causing selected disposition options to be executed {[0061] Optionally, these components 1111, 1121 may take into account other factors 1197, 1198 such as income taxes payable upon exercise of options and distribution of qualified assets or stock options (depending on whether an executor exercises the options immediately or chooses to delay until expiration). Although the combined liquidity of the estate in FIG. 11B is much larger than that of the estate in FIG. 11A, in both situations, the cash required exceeds the cash available, resulting in an estimated liquidity deficit 1101, 1102.}.
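Haque's ledger-verification teaching cited above ([0116]) describes a standard integrity check: confirm a document is genuine by comparing it against a tamper-evident token anchored in a ledger. A minimal sketch under that reading, using a plain SHA-256 digest as the token (the function names and the digest choice are illustrative assumptions, not Haque's actual implementation):

```python
import hashlib

def anchor(document: bytes) -> str:
    """Record a tamper-evident token for the document (the 'ledger' entry)."""
    return hashlib.sha256(document).hexdigest()

def is_genuine(document: bytes, ledger_token: str) -> bool:
    """An executor's read-only check: does the record match the anchored token?"""
    return hashlib.sha256(document).hexdigest() == ledger_token

will_record = b"asset: house -> beneficiary: first person"
token = anchor(will_record)

print(is_genuine(will_record, token))                  # unmodified record verifies
print(is_genuine(will_record + b" (revised)", token))  # any tampering is detected
```

The read-only permission Haque describes for the executor follows the same logic: verification requires only recomputing and comparing the token, never rewriting the record.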
The motivation and rationale to modify the combination of references to include the additional features of Racanelli is the same as set forth previously.

Claims 4 and 13

Racanelli further teaches: wherein the estate data includes a plurality of asset records for respective ones of a plurality of assets associated with the deceased person, wherein one or more of the assets are dispensable upon death of the deceased person {[0032] FIG. 2 illustrates a flowchart representing an exemplary method for modeling estate disposition. In one embodiment, balance sheet inputs are received S201. Estate document inputs are also received S203. The conveyance logic contained within the estate documents is applied to the balance sheet and estate document inputs S205. Application of the conveyance logic ascertains which assets, or portions thereof, from the client's balance sheet are conveyed to a particular spouse, child or other beneficiary.}. The motivation and rationale to modify the combination of references to include the additional features of Racanelli is the same as set forth previously.

Claims 6 and 15

Bryant further discloses: wherein obtaining the one or more XR preferences includes obtaining the one or more XR preferences from the first person using one or more XR devices, and wherein the one or more XR preferences represent one or more of profile data for the first person, virtual interaction preferences, metaverse preferences, or avatar preferences {[0090] The user may create or be assigned login credentials for logging into the system 300 and configuring their unique user configuration settings, or the like. For instance, the user may be able to manage compatible user device(s) 130, set preferences for environment items, configure notification settings, configure accessibility settings, configure gesture settings, or the like.}.
Claims 8 and 17

Bryant further discloses: wherein providing the one or more user interfaces includes providing a virtual meeting of avatars of the first person and the second person via respective XR devices {[0046] Rather, the user is able to receive additional information via the AR system 120 in a seamless, personalized fashion wherein the information is overlaid on or otherwise augments the user's view or perspective of their existing environment. It is understood that any graphical depictions generated by the augmented display system 300 may be designed to be displayed and interacted with a number of devices, including user device(s) 130 and one or more AR system(s) 120 (e.g., multiple AR systems 120 may be integrated remotely to display common information to multiple users in a conference setting, or the like, and may be oriented with respect to one another in a consistent manner in the augmented environment). [0098] In some embodiments, the users may be represented to one another via avatars, or the like, wherein the avatars may be rendered continually to reflect the users' gaze, speech, or the like.}.

Claim 9

Bryant further discloses: wherein the XR device includes at least one of (i) an augmented reality (AR), mixed reality (MR), or virtual reality (VR) headset, or (ii) AR, MR, or VR smart glasses {[0041] FIG. 1 provides a diagram illustrating a system environment, in accordance with an embodiment of the invention. As illustrated in FIG. 1, the system environment 100 includes a managing entity system 200, an augmented display system 300, one or more user device(s) 130, an augmented reality (AR) System 120, and one or more third party systems 140.}.

Response to Arguments

Applicant's arguments filed 1/2/26 have been fully considered. Examiner's response follows, with applicant's headings and page numbers used for consistency.

I. Status of the Application; II. Interview Summary Record; III.
Amendments to the Claims

Applicant is thanked for their comprehensive summary of the claim amendments and previous interview.

IV. Rejection under 35 U.S.C. § 101

On pages 8-11, applicant offers remarks regarding the rejections under 35 U.S.C. § 101, after restating the present claim amendments and the additional elements (as defined by applicant):

These elements go well beyond merely linking the alleged abstract idea to a particular technological environment or field of use (i.e., beyond merely linking in a general sense to the claimed "XR environment"), and go well beyond merely applying any alleged abstract idea using generic computing components. Instead, the above-noted features of amended claim 1 (and similarly of amended claims 10 and 18) establish a specific process for collaboratively entering and interacting with estate information in a virtual meeting environment. Thus, in addition to the technological problems addressed by the security-enhancing features noted in Applicant's previous response, the above-noted features provide a technological solution to the technological problem of inefficient or ineffective dispositions of assets. See Applicant's specification at [0003] (stating that "[i]n commercial settings, conventional approaches to customer interactions (e.g., for collecting customer information and/or providing information to customers) have various drawbacks, such as inefficient or ineffective relaying of information..."). Applicant respectfully submits that, because the invention of the amended claims provides a technological solution to a technological problem, the invention provides an improvement in the functioning of a technology or technical field, and thus integrates any alleged abstract idea into a practical application. See MPEP 2106.04(d)(I) ("Limitations the courts have found indicative that an additional element (or combination of elements) may have integrated the exception into a practical application include...
[a]n improvement in the functioning of a computer, or an improvement to other technology or technical field"). Accordingly, regardless of whether the claims recite an abstract idea, Applicant respectfully submits that the claims are not directed to an abstract idea. See MPEP 2106.04(II)(A)(2) ("If the additional elements in the claim integrate the recited exception into a practical application of the exception, then the claim is not directed to the judicial exception..."). Each of claims 2, 4, 6, 8, 9, 11, 13, 15, and 17 depends from amended independent claim 1 or 10, and thus is patent-eligible for at least the same reasons as its respective base claim. Accordingly, Applicant respectfully requests that the rejection under 35 U.S.C. § 101 be withdrawn.

While well taken, examiner's position, which was communicated previously, is that any improvement applicant is attempting to claim resides with the abstract idea. The thrust of applicant's claimed invention lies with the management of estate data, which is facilitated via XR technology and a distributed ledger. This is not an improvement to technology or a technical field, as applicant has suggested. Applicant's own specification is supportive of examiner's position, as seen in [0037] of the specification, which describes the XR technology in an off-the-shelf manner:

In some embodiments, described XR devices may be any commercial XR device, such as a Google Glass® device, a Google Cardboard® device, a Google Daydream® device, a Microsoft Hololens® device, a Magic Leap® device, an Oculus® device, an Oculus Rift® device, a Gear VR® device, a PlayStation® VR device, or an HTC Vive® device, to name a few. In general, each of these example XR devices may use one or more processors or graphic processing units (GPUs) capable of visualizing multimedia content in a partial or wholly virtual environment.
As seen in the cited portion of the specification, applicant is applying generic computers and machinery to the tasks of the abstract idea, i.e., estate planning. This is not enough to demonstrate integration into practical application and/or add significantly more, based on MPEP 2106.05(f). Examiner therefore maintains that any potential improvement resides with the abstract idea vs. the technology. Accordingly, the claims are ineligible. VII. Rejections under 35 U.S.C. § 103 With respect to applicant’s remarks concerning the rejections under 35 U.S.C. § 103, examiner notes that they are moot, given that they are predicated on the present amendments, which required an updated search and consideration of an additional reference. Examiner directs applicant to the claim analysis above. In summary, examiner has responded to all of applicant’s arguments. Conclusion The prior art made of record and not relied upon is considered pertinent to applicant's disclosure: US 10510054, which teaches: A system implemented on an augmented reality electronic device includes scanning an item using the augmented reality device. Financial information is obtained from the scanned item. The financial information and an indication of user authentication are sent to a server computer. A confirmation is displayed on the augmented reality device that a financial transaction using the financial information has been completed. US 20200143481, which teaches: Techniques and architectures for providing notifications regarding events, such as hurricanes, tornados, fires, floods, earthquakes, and so on, are discussed herein. For example, a user interface may be displayed with a map of a geographical area, an event visual representation representing an event, and an impact visual representation indicating an impact area where the event is estimated to impact. 
A request may be received to notify users associated with the impact area, and customized notifications may be sent to users associated with the impact area. The customized notifications may be based on policy data for the users.

US 11023977, which teaches: A computer-implemented method, device, and system for communicating an impact of a financial decision or an investment strategy to a user through the use of virtual reality are provided. Providing a context for figuratively 'seeing and feeling' the results for a proposed financial strategy may assist the user in better understanding the impact of the financial strategy. Information provided in this context may educate the user with regard to the amount of income or savings required to achieve a goal over a time period. A goal for a user may be displayed as an image in a virtual reality scene based on gathered user information. The image representing the goal may be displayed according to a percent of clarity based on the financial amount needed to achieve the goal. The image may be changed based on the amount over a perceived virtual reality time period.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to JOHN SAMUEL WASAFF, whose telephone number is (571) 270-5091. The examiner can normally be reached Monday through Friday, 8:00 am to 6:00 pm.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, SARAH MONFELDT, can be reached at (571) 270-1833. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

JOHN SAMUEL WASAFF
Primary Examiner
Art Unit 3629

/JOHN S. WASAFF/
Primary Examiner, Art Unit 3629

Prosecution Timeline

Feb 17, 2023
Application Filed
May 21, 2025
Non-Final Rejection — §101, §103
Aug 13, 2025
Applicant Interview (Telephonic)
Aug 13, 2025
Examiner Interview Summary
Aug 26, 2025
Response Filed
Oct 01, 2025
Final Rejection — §101, §103
Dec 23, 2025
Applicant Interview (Telephonic)
Dec 23, 2025
Examiner Interview Summary
Jan 02, 2026
Request for Continued Examination
Jan 16, 2026
Response after Non-Final Action
Mar 02, 2026
Non-Final Rejection — §101, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602710
ENSEMBLE OF LANGUAGE MODELS FOR IMPROVED USER SUPPORT
2y 5m to grant Granted Apr 14, 2026
Patent 12555122
OMNI-CHANNEL CONTEXT SHARING
2y 5m to grant Granted Feb 17, 2026
Patent 12548095
Artificial Intelligence for Sump Pump Monitoring and Service Provider Notification
2y 5m to grant Granted Feb 10, 2026
Patent 12547996
COMPUTING SYSTEM FOR SHARING NETWORKS PROVIDING SHARED RESERVE FEATURES AND RELATED METHODS
2y 5m to grant Granted Feb 10, 2026
Patent 12541775
UNIQUE METHOD OF PROCESSING API DATA SUPPORTING WIDE VARIETY OF DATA TYPES AND MULTIPLE/SINGULAR FORMATS WITHOUT DATA DUPLICATION
2y 5m to grant Granted Feb 03, 2026
Study what changed in these applications to get past this examiner. Based on the 5 most recent grants.


Prosecution Projections

3-4
Expected OA Rounds
33%
Grant Probability
77%
With Interview (+44.2%)
4y 1m
Median Time to Grant
High
PTA Risk
Based on 373 resolved cases by this examiner. Grant probability derived from career allow rate.
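The projection figures above follow from the examiner statistics shown earlier (124 granted of 373 resolved, +44.2 percentage-point interview lift). A minimal sketch of the presumed arithmetic, assuming the interview lift is simply added to the career allow rate (the page does not state the exact methodology):

```python
# Examiner career statistics as reported on this page.
granted = 124
resolved = 373

# Career allow rate: share of resolved cases that were granted.
career_allow_rate = granted / resolved  # ~0.332, shown as 33%

# Interview lift, assumed here to be additive in percentage points.
interview_lift = 0.442
with_interview = career_allow_rate + interview_lift  # ~0.774, shown as 77%

print(f"Grant probability: {career_allow_rate:.0%}")  # Grant probability: 33%
print(f"With interview: {with_interview:.0%}")        # With interview: 77%
```

Under that additive assumption, 33.2% + 44.2 points reproduces the 77% with-interview figure shown above.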
