Prosecution Insights
Last updated: April 18, 2026
Application No. 18/385,882

APPLICATION STATE MANAGEMENT FOR ELECTRONIC DEVICES

Non-Final Office Action: §101, §102
Filed: Oct 31, 2023
Examiner: HU, SELINA ELISA
Art Unit: 2193
Tech Center: 2100 (Computer Architecture & Software)
Assignee: Apple Inc.
OA Round: 1 (Non-Final)
Grant Probability: 67% (Favorable)
Expected OA Rounds: 1-2
Time to Grant: 3y 3m
Grant Probability With Interview: 99%

Examiner Intelligence

Career Allow Rate: 67% (2 granted / 3 resolved), +11.7% vs. TC avg (above average)
Interview Lift: +100.0% (resolved cases with vs. without interview)
Avg Prosecution: 3y 3m (typical timeline); 32 applications currently pending
Career History: 35 total applications across all art units

Statute-Specific Performance

§101: 24.4% (-15.6% vs. TC avg)
§103: 53.5% (+13.5% vs. TC avg)
§102: 12.0% (-28.0% vs. TC avg)
§112: 10.1% (-29.9% vs. TC avg)

Note: Tech Center averages are estimates. Based on career data from 3 resolved cases.

Office Action

DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows:

Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 1-20 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.

Claims 1, 14, and 19 recite:

A method comprising:
1. providing, by an electronic device to an application running on the electronic device in a first mode, updates to physical environment information that corresponds to a current state of a physical environment of the electronic device while the application is in the first mode;
2. determining, by the electronic device, that the application has entered a second mode;
3. responsive to determining that the application has entered the second mode: ceasing providing, to the application, the updates to the physical environment information; and
4. storing a subset of the updates to the physical environment information that occur while the application is in the second mode;
5. determining, by the electronic device, that the application has entered a third mode; and
6. providing, to the application responsive to determining that the application has entered the third mode, the subset of the updates.

Step 1: Is the claim to a process, machine, manufacture, or composition of matter? Yes. Claim 1 is a process. Claim 14 is a machine. Claim 19 is a manufacture.

Step 2A, Prong I: Does the claim recite an abstract idea, law of nature, or natural phenomenon? Yes: an abstract idea.
The ‘determining’ limitation in #2 above, as claimed and under broadest reasonable interpretation (BRI), is a mental process that covers performance of the limitation in the mind. The limitation “determining” in the context of this claim encompasses a person analyzing, evaluating, or determining whether the application has entered a second mode, including comparison or judgment.

The ‘determining’ limitation in #5 above, as claimed and under BRI, is likewise a mental process that covers performance of the limitation in the mind. The limitation “determining” in the context of this claim encompasses a person analyzing, evaluating, or determining whether the application has entered a third mode, including comparison or judgment.

Step 2A, Prong II: Does the claim recite additional elements that integrate the judicial exception into a practical application? No.

The ‘providing’ limitation in #1 above, as claimed and under BRI, is an additional element that is insignificant extra-solution activity. The limitation “providing” in the context of this claim encompasses mere data gathering. See MPEP 2106.05(g).

The ‘providing’ limitation in #3 above, as claimed and under BRI, is an additional element that is insignificant extra-solution activity. The limitation “providing” in the context of this claim encompasses mere data gathering. See MPEP 2106.05(g).

The ‘storing’ limitation in #4 above, as claimed and under BRI, is an additional element that is insignificant extra-solution activity. The limitation “storing” in the context of this claim encompasses merely storing information in memory. See MPEP 2106.05(g).

The ‘providing’ limitation in #6 above, as claimed and under BRI, is an additional element that is insignificant extra-solution activity.
The limitation “providing” in the context of this claim encompasses mere data gathering. See MPEP 2106.05(g).

Additionally, one or more of the claims recite the following additional elements: a memory (Claim 14); one or more processors (Claims 14 and 19); and instructions (Claim 19). These additional elements are recited at a high level of generality (i.e., as generic computer components) such that they amount to no more than components comprising mere instructions to apply the exception. Accordingly, these additional elements do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea.

Step 2B: Does the claim recite additional elements that amount to significantly more than the judicial exception? No. As discussed above with respect to integration of the abstract idea into a practical application, the aforementioned additional elements amount to no more than components for obtaining or gathering data and comprising mere instructions to apply the exception, as seen in MPEP 2106.05(g). Mere instructions to apply an exception using generic computer components cannot provide an inventive concept.

Additionally, with regard to #4 above, per MPEP 2106.05(d)(II), the courts have recognized the following computer functions as well-understood, routine, and conventional functions when they are claimed in a merely generic manner (e.g., at a high level of generality) or as insignificant extra-solution activity: storing and retrieving information in memory, Versata Dev. Group, Inc. v. SAP Am., Inc., 793 F.3d 1306, 1334, 115 USPQ2d 1681, 1701 (Fed. Cir. 2015); OIP Techs., 788 F.3d at 1363, 115 USPQ2d at 1092-93.

Claim 3 merely further describes the third mode of Claim 1. The claim does not include additional elements that integrate the abstract idea into a practical application or are sufficient to amount to significantly more than the judicial exception.
Claims 4 and 16 merely further describe the first and second modes of Claims 1 and 14, respectively. The claims do not include additional elements that integrate the abstract idea into a practical application or are sufficient to amount to significantly more than the judicial exception.

Claim 8 merely further describes the subset of the updates to the physical environment information of Claim 1. The claim does not include additional elements that integrate the abstract idea into a practical application or are sufficient to amount to significantly more than the judicial exception.

Claim 9 merely further describes the subset of the updates to the physical environment information of Claim 8. The claim does not include additional elements that integrate the abstract idea into a practical application or are sufficient to amount to significantly more than the judicial exception.

Claim 10 merely further describes the first and second portions of the physical environment information of Claim 9. The claim does not include additional elements that integrate the abstract idea into a practical application or are sufficient to amount to significantly more than the judicial exception.

Therefore, Claims 1, 3, 4, 8-10, 14, 16, and 19 are directed to an abstract idea without significantly more.

Claims 2, 15, and 20 recite:

7. providing, for display while the application is in the first mode, a user interface of the application; and
8. providing, for display while the application is in the third mode, the user interface with at least one modification that is based on the subset of the updates to the physical environment information.

Step 1: Is the claim to a process, machine, manufacture, or composition of matter? Yes. Claim 2 is a process. Claim 15 is a machine. Claim 20 is a manufacture.

Step 2A, Prong II: Does the claim recite additional elements that integrate the judicial exception into a practical application? No.
The ‘providing’ limitation in #7 above, as claimed and under broadest reasonable interpretation (BRI), is an additional element that is insignificant extra-solution activity. The limitation “providing” in the context of this claim encompasses merely displaying information to a user. See MPEP 2106.05(g).

The ‘providing’ limitation in #8 above, as claimed and under BRI, is an additional element that is insignificant extra-solution activity. The limitation “providing” in the context of this claim encompasses merely displaying information to a user. See MPEP 2106.05(g).

Step 2B: Does the claim recite additional elements that amount to significantly more than the judicial exception? No. As discussed above with respect to integration of the abstract idea into a practical application, the aforementioned additional elements amount to no more than components for obtaining or gathering data and comprising mere instructions to apply the exception, as seen in MPEP 2106.05(g). Mere instructions to apply an exception using generic computer components cannot provide an inventive concept.

Additionally, with regard to #7 and #8 above, per MPEP 2106.05(d)(II), the courts have recognized the following computer functions as well-understood, routine, and conventional functions when they are claimed in a merely generic manner (e.g., at a high level of generality) or as insignificant extra-solution activity: storing and retrieving information in memory, Versata Dev. Group, Inc. v. SAP Am., Inc., 793 F.3d 1306, 1334, 115 USPQ2d 1681, 1701 (Fed. Cir. 2015); OIP Techs., 788 F.3d at 1363, 115 USPQ2d at 1092-93.

Therefore, Claims 2, 15, and 20 are directed to an abstract idea without significantly more.
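Setting the legal analysis aside, the mechanism recited in independent Claims 1, 14, and 19 (gating environment updates by application mode, buffering while the app is in the second mode, and replaying the stored subset when it enters the third mode) can be sketched in code. This is a minimal illustrative sketch, not the applicant's implementation: all class, method, and mode names are hypothetical, and the claimed "subset" selection is simplified to buffering every update.

```python
from enum import Enum, auto

class Mode(Enum):
    FIRST = auto()    # application receives live environment updates
    SECOND = auto()   # updates are withheld; a subset is stored
    THIRD = auto()    # the stored subset is delivered to the application

class EnvironmentUpdateBroker:
    """Hypothetical sketch of mode-gated environment update delivery."""

    def __init__(self):
        self.mode = Mode.FIRST
        self.delivered = []   # updates provided to the application
        self.stored = []      # subset stored while in the second mode

    def on_environment_update(self, update):
        if self.mode is Mode.FIRST:
            self.delivered.append(update)   # provide updates in the first mode
        elif self.mode is Mode.SECOND:
            self.stored.append(update)      # cease providing; store instead

    def set_mode(self, mode):
        self.mode = mode
        if mode is Mode.THIRD:
            self.delivered.extend(self.stored)  # provide the stored subset
            self.stored = []

broker = EnvironmentUpdateBroker()
broker.on_environment_update("lighting changed")   # delivered live
broker.set_mode(Mode.SECOND)
broker.on_environment_update("object moved")       # stored, not delivered
broker.set_mode(Mode.THIRD)                        # stored subset delivered
```

In a real implementation the buffer would be filtered or compacted (as dependent Claims 11-13 recite) rather than replayed wholesale; the sketch only shows the gating and replay structure.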
Claims 5 and 17 recite:

9. determining that a second application running on the electronic device is in the first mode while the application is in the second mode; and
10. providing, to the second application while storing the subset of the updates to the physical environment information that occur while the application is in the second mode, the updates to the physical environment information that occur while the application is in the second mode to the second application.

Step 1: Is the claim to a process, machine, manufacture, or composition of matter? Yes. Claim 5 is a process. Claim 17 is a machine.

Step 2A, Prong I: Does the claim recite an abstract idea, law of nature, or natural phenomenon? Yes: an abstract idea.

The ‘determining’ limitation in #9 above, as claimed and under broadest reasonable interpretation (BRI), is a mental process that covers performance of the limitation in the mind. The limitation “determining” in the context of this claim encompasses a person analyzing, evaluating, or determining whether the second application is in a first mode while the application is in a second mode, including comparison or judgment.

Step 2A, Prong II: Does the claim recite additional elements that integrate the judicial exception into a practical application? No.

The ‘providing’ limitation in #10 above, as claimed and under BRI, is an additional element that is insignificant extra-solution activity. The limitation “providing” in the context of this claim encompasses mere data gathering. See MPEP 2106.05(g).

Step 2B: Does the claim recite additional elements that amount to significantly more than the judicial exception? No. As discussed above with respect to integration of the abstract idea into a practical application, the aforementioned additional elements amount to no more than components for obtaining or gathering data and comprising mere instructions to apply the exception, as seen in MPEP 2106.05(g).
Mere instructions to apply an exception using generic computer components cannot provide an inventive concept. Additionally, with regard to #10 above, per MPEP 2106.05(d)(II), the courts have recognized the following computer functions as well-understood, routine, and conventional functions when they are claimed in a merely generic manner (e.g., at a high level of generality) or as insignificant extra-solution activity: storing and retrieving information in memory, Versata Dev. Group, Inc. v. SAP Am., Inc., 793 F.3d 1306, 1334, 115 USPQ2d 1681, 1701 (Fed. Cir. 2015); OIP Techs., 788 F.3d at 1363, 115 USPQ2d at 1092-93.

Therefore, Claims 5 and 17 are directed to an abstract idea without significantly more.

Claim 6 recites:

11. determining, by the electronic device, that the second application has entered the second mode; and
12. responsive to determining that the second application has entered the second mode: ceasing providing, to the second application, the updates to the physical environment information; and
13. storing, for the second application, a second subset of the updates to the physical environment information that occur while the second application is in the second mode.

Step 1: Is the claim to a process, machine, manufacture, or composition of matter? Yes. Claim 6 is a process.

Step 2A, Prong I: Does the claim recite an abstract idea, law of nature, or natural phenomenon? Yes: an abstract idea.

The ‘determining’ limitation in #11 above, as claimed and under broadest reasonable interpretation (BRI), is a mental process that covers performance of the limitation in the mind. The limitation “determining” in the context of this claim encompasses a person analyzing, evaluating, or determining whether the second application has entered a second mode, including comparison or judgment.

Step 2A, Prong II: Does the claim recite additional elements that integrate the judicial exception into a practical application? No.
The ‘providing’ limitation in #12 above, as claimed and under broadest reasonable interpretation (BRI), is an additional element that is insignificant extra-solution activity. The limitation “providing” in the context of this claim encompasses mere data gathering. See MPEP 2106.05(g).

The ‘storing’ limitation in #13 above, as claimed and under BRI, is an additional element that is insignificant extra-solution activity. The limitation “storing” in the context of this claim encompasses merely storing information in memory. See MPEP 2106.05(g).

Step 2B: Does the claim recite additional elements that amount to significantly more than the judicial exception? No. As discussed above with respect to integration of the abstract idea into a practical application, the aforementioned additional elements amount to no more than components for obtaining or gathering data and comprising mere instructions to apply the exception, as seen in MPEP 2106.05(g). Mere instructions to apply an exception using generic computer components cannot provide an inventive concept.

Additionally, with regard to #12 and #13 above, per MPEP 2106.05(d)(II), the courts have recognized the following computer functions as well-understood, routine, and conventional functions when they are claimed in a merely generic manner (e.g., at a high level of generality) or as insignificant extra-solution activity: storing and retrieving information in memory, Versata Dev. Group, Inc. v. SAP Am., Inc., 793 F.3d 1306, 1334, 115 USPQ2d 1681, 1701 (Fed. Cir. 2015); OIP Techs., 788 F.3d at 1363, 115 USPQ2d at 1092-93.

Therefore, Claim 6 is directed to an abstract idea without significantly more.
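Claims 5, 6, and 17 extend the recited mechanism to a second application: updates keep flowing to whichever applications remain in the first mode, while each application in the second mode accumulates its own stored subset. The following is a hypothetical sketch of that per-application gating; all names are invented for illustration and the subset selection is again simplified to buffering every update.

```python
class MultiAppBroker:
    """Hypothetical sketch of per-application update gating."""

    def __init__(self):
        self.modes = {}       # app name -> "first" or "second"
        self.delivered = {}   # updates provided to each application
        self.stored = {}      # per-app subset stored while in the second mode

    def register(self, app):
        self.modes[app] = "first"
        self.delivered[app] = []
        self.stored[app] = []

    def set_mode(self, app, mode):
        self.modes[app] = mode

    def on_environment_update(self, update):
        for app, mode in self.modes.items():
            if mode == "first":
                self.delivered[app].append(update)  # still providing live
            else:
                self.stored[app].append(update)     # per-app stored subset

broker = MultiAppBroker()
broker.register("app_a")
broker.register("app_b")
broker.set_mode("app_a", "second")            # app_a enters the second mode
broker.on_environment_update("anchor moved")  # app_b gets it live; app_a stores it
```

The per-app dictionaries mirror the claims' requirement that a second subset be stored "for the second application" while a first-mode application continues to receive updates.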
Claim 7 recites:

14. wherein storing the subset of the updates to the physical environment information that occur while the application is in the second mode comprises storing the subset of the updates to the physical environment information that occur while the application is in the second mode for the application, the method further comprising:
15. while the application and a second application at the electronic device are in the second mode, storing, for the second application, a second subset of the updates to the physical environment information.

Step 1: Is the claim to a process, machine, manufacture, or composition of matter? Yes. Claim 7 is a process.

Step 2A, Prong II: Does the claim recite additional elements that integrate the judicial exception into a practical application? No.

The ‘storing’ limitation in #14 above, as claimed and under broadest reasonable interpretation (BRI), is an additional element that is insignificant extra-solution activity. The limitation “storing” in the context of this claim encompasses merely storing information in memory. See MPEP 2106.05(g).

The ‘storing’ limitation in #15 above, as claimed and under BRI, is an additional element that is insignificant extra-solution activity. The limitation “storing” in the context of this claim encompasses merely storing information in memory. See MPEP 2106.05(g).

Step 2B: Does the claim recite additional elements that amount to significantly more than the judicial exception? No. As discussed above with respect to integration of the abstract idea into a practical application, the aforementioned additional elements amount to no more than components for obtaining or gathering data and comprising mere instructions to apply the exception, as seen in MPEP 2106.05(g). Mere instructions to apply an exception using generic computer components cannot provide an inventive concept.
Additionally, with regard to #14 and #15 above, per MPEP 2106.05(d)(II), the courts have recognized the following computer functions as well-understood, routine, and conventional functions when they are claimed in a merely generic manner (e.g., at a high level of generality) or as insignificant extra-solution activity: storing and retrieving information in memory, Versata Dev. Group, Inc. v. SAP Am., Inc., 793 F.3d 1306, 1334, 115 USPQ2d 1681, 1701 (Fed. Cir. 2015); OIP Techs., 788 F.3d at 1363, 115 USPQ2d at 1092-93.

Therefore, Claim 7 is directed to an abstract idea without significantly more.

Claims 11 and 18 recite:

16. wherein storing the subset of the updates to the physical environment information comprises: storing a set of the updates to the physical environment information that occur while the application is in the second mode; and
17. compacting, while the application is in the second mode, the stored set of the updates to the physical environment information that occur while the application is in the second mode to form the subset.

Step 1: Is the claim to a process, machine, manufacture, or composition of matter? Yes. Claim 11 is a process. Claim 18 is a machine.

Step 2A, Prong II: Does the claim recite additional elements that integrate the judicial exception into a practical application? No.

The ‘storing’ limitation in #16 above, as claimed and under broadest reasonable interpretation (BRI), is an additional element that is insignificant extra-solution activity. The limitation “storing” in the context of this claim encompasses merely storing information in memory. See MPEP 2106.05(g).

The ‘compacting’ limitation in #17 above, as claimed and under BRI, is an additional element that amounts to mere instructions to apply the exception (i.e., “apply it”). The limitation “compacting” in the context of this claim encompasses merely compacting the stored set of updates. See MPEP 2106.05(f).
Step 2B: Does the claim recite additional elements that amount to significantly more than the judicial exception? No. As discussed above with respect to integration of the abstract idea into a practical application, the aforementioned additional elements amount to no more than components for obtaining or gathering data and comprising mere instructions to apply the exception, as seen in MPEP 2106.05(g) and (f). Mere instructions to apply an exception using generic computer components cannot provide an inventive concept.

Additionally, with regard to #16 above, per MPEP 2106.05(d)(II), the courts have recognized the following computer functions as well-understood, routine, and conventional functions when they are claimed in a merely generic manner (e.g., at a high level of generality) or as insignificant extra-solution activity: storing and retrieving information in memory, Versata Dev. Group, Inc. v. SAP Am., Inc., 793 F.3d 1306, 1334, 115 USPQ2d 1681, 1701 (Fed. Cir. 2015); OIP Techs., 788 F.3d at 1363, 115 USPQ2d at 1092-93.

Therefore, Claims 11 and 18 are directed to an abstract idea without significantly more.

Claim 12 recites:

18. wherein compacting the stored set of updates comprises at least one of: removing one of the updates from the stored set of updates or combining two or more of the updates in the set of updates.

Step 1: Is the claim to a process, machine, manufacture, or composition of matter? Yes. Claim 12 is a process.

Step 2A, Prong II: Does the claim recite additional elements that integrate the judicial exception into a practical application? No.

The ‘removing’ limitation in #18 above, as claimed and under broadest reasonable interpretation (BRI), is an additional element that amounts to mere instructions to apply the exception (i.e., “apply it”). The limitation “removing” in the context of this claim encompasses merely removing one of the updates from the stored set of updates. See MPEP 2106.05(f).
Step 2B: Does the claim recite additional elements that amount to significantly more than the judicial exception? No. As discussed above with respect to integration of the abstract idea into a practical application, the aforementioned additional elements amount to no more than components comprising mere instructions to apply the exception, as seen in MPEP 2106.05(f). Mere instructions to apply an exception using generic computer components cannot provide an inventive concept.

Therefore, Claim 12 is directed to an abstract idea without significantly more.

Claim 13 recites:

19. wherein compacting the stored set of the updates to the physical environment information comprises: periodically compacting a first portion of the set of updates; and
20. compacting a second portion of the set of updates during an inactive state of the electronic device.

Step 1: Is the claim to a process, machine, manufacture, or composition of matter? Yes. Claim 13 is a process.

Step 2A, Prong II: Does the claim recite additional elements that integrate the judicial exception into a practical application? No.

The ‘compacting’ limitation in #19 above, as claimed and under broadest reasonable interpretation (BRI), is an additional element that amounts to mere instructions to apply the exception (i.e., “apply it”). The limitation “compacting” in the context of this claim encompasses merely periodically compacting a first portion of the stored set of updates. See MPEP 2106.05(f).

The ‘compacting’ limitation in #20 above, as claimed and under BRI, is an additional element that amounts to mere instructions to apply the exception (i.e., “apply it”). The limitation “compacting” in the context of this claim encompasses merely compacting a second portion of the stored set of updates. See MPEP 2106.05(f).

Step 2B: Does the claim recite additional elements that amount to significantly more than the judicial exception? No.
As discussed above with respect to integration of the abstract idea into a practical application, the aforementioned additional elements amount to no more than components comprising mere instructions to apply the exception, as seen in MPEP 2106.05(f). Mere instructions to apply an exception using generic computer components cannot provide an inventive concept.

Therefore, Claim 13 is directed to an abstract idea without significantly more.

Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless -

(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.

Claims 1-3, 8-10, 14-15, and 19-20 are rejected under 35 U.S.C. 102(a)(2) as being anticipated by Kahan et al. (U.S. Patent Application Publication No. US 2023/0237192 A1), hereinafter "Kahan."

With regard to Claim 1, Kahan teaches:

A method, comprising: providing, by an electronic device to an application running on the electronic device in a first mode, updates to physical environment information that corresponds to a current state of a physical environment of the electronic device while the application is in the first mode (Paragraphs 8, 378, 424, and 426, “These embodiments may involve receiving image data from an image sensor associated with a wearable extended reality appliance, the image data is reflective of a physical environment… In some examples, the received image data of the physical environment may be analyzed to associate the at least two extended reality objects with the composite perspective of the physical environment… In some embodiments, enabling the non-synchronous display includes enabling a viewer to toggle between viewing only a representation of the physical environment, viewing only an extended reality environment including the at least two extended reality objects, or viewing a representation of both the physical environment and the extended reality environment… By using user interface element 3102, the viewer may select a desired view of the extended reality environment and/or the physical environment… Selecting composite view button 3110 may display a combined physical environment with extended reality objects, such as shown in the lower portion of FIG. 31.” The viewer selecting a desired view through a corresponding button correlates to the application being in a first mode.
The wearable extended reality appliance sensors receiving image data reflective of a physical environment, which is analyzed to determine physical and extended reality object positions for a composite view, correlates to providing, by an electronic device to an application running on the electronic device in a first mode, updates to physical environment information that corresponds to a current state of a physical environment of the electronic device while the application is in the first mode);

determining, by the electronic device, that the application has entered a second mode (Fig. 33, paragraphs 424 and 426, “In some embodiments, enabling the non-synchronous display includes enabling a viewer to toggle between viewing only a representation of the physical environment, viewing only an extended reality environment including the at least two extended reality objects, or viewing a representation of both the physical environment and the extended reality environment… By using user interface element 3102, the viewer may select a desired view of the extended reality environment and/or the physical environment… Selecting “Only XR View” button 3106 may display only extended reality object, such as shown and described below in connection with FIG. 33.” The viewer using a particular button to toggle from viewing a representation of both the physical and extended reality environment to displaying only extended reality objects correlates to determining, by the electronic device, that the application has entered a second mode);

responsive to determining that the application has entered the second mode: ceasing providing, to the application, the updates to the physical environment information (Fig. 33, paragraph 428, “For example, the viewer may have selected to view only the extended reality environment (e.g., by using “Only XR View” button 3106 of user interface element 3102). As shown in FIG. 33, the extended reality environment view 3300 includes virtual representation of the wearer 2902 holding first extended reality object 2608 and second extended reality object 2610. Because the extended reality only view does not include objects in the physical environment, in FIG. 33, it appears as if second extended reality object 2610 is “floating” behind virtual representation of the wearer 2902.” The extended reality only view not including objects in the physical environment and causing extended reality objects to appear as if they were floating correlates to ceasing providing, to the application, the updates to the physical environment information);

and storing a subset of the updates to the physical environment information that occur while the application is in the second mode (Paragraphs 396-397, “In some embodiments, the composite perspective of the physical environment is based on image data captured prior to the particular time period. As discussed above, the particular time period refers to a period of time during which the wearer is looking at the first extended reality object. For example, the composite perspective of the physical environment may be based on image data that was captured when the wearer activated the wearable extended reality appliance. As another example, the image data may have been captured at an earlier point in time (i.e., prior to the particular time period) and stored in a storage, such as a memory or other storage in the wearable extended reality appliance, a memory or other storage in a device (e.g., input unit 202) associated with the wearable extended reality appliance, or a remote storage (e.g., a cloud-based storage)… For example, the image data may include time information about when the image data was captured (e.g., a timestamp or other time indicator).
If the particular time period is more than a threshold amount of time (e.g., one hour, one day, one week, or other amount of time) later than the timestamp, the composite perspective may be updated with image data captured during the particular time period (i.e., the period of time during which the wearer is looking at the first extended reality object). The composite perspective may be updated with image data captured during the particular time period to accurately reflect a current state of the physical environment.”

The particular time period where a user is looking at a first extended reality object can occur while a user is in an extended reality only view and therefore correlates to the application being in a second mode. The image data, which is reflective of a physical environment, being captured at various time periods such as during the particular time period and stored in a storage correlates to storing updates of physical environment information that occur while the application is in the second mode. The image data being captured at timestamps which can be greater than a threshold amount of time more than the particular time period would involve the image data not being captured and saved continuously.
Therefore, the image data being stored at non-continuous timestamps correlates to storing a subset of the updates to the physical environment information that occur while the application is in the second mode);

determining, by the electronic device, that the application has entered a third mode (Paragraph 424, “In some embodiments, enabling the non-synchronous display includes enabling a viewer to toggle between viewing only a representation of the physical environment, viewing only an extended reality environment including the at least two extended reality objects, or viewing a representation of both the physical environment and the extended reality environment… By using user interface element 3102, the viewer may select a desired view of the extended reality environment and/or the physical environment… Selecting composite view button 3110 may display a combined physical environment with extended reality objects, such as shown in the lower portion of FIG. 31… Once the view is selected, the viewer may use view rotator element 3112 to navigate around in the selected view (i.e., rotate the selected view to achieve a desired perspective of the selected view or a desired viewing angle within the selected view).” The viewer changing between different views through a toggle or by picking a button, such as from a view representing only extended reality objects to a composite view, with the view being updated after the selection, correlates to determining, by the electronic device, that the application has entered a third mode);

and providing, to the application responsive to determining that the application has entered the third mode, the subset of the updates (Paragraphs 396-397 and 426, “As discussed above, the particular time period refers to a period of time during which the wearer is looking at the first extended reality object… In this example, the stored image data may be retrieved by the wearable extended reality appliance and the composite perspective of the physical environment may be generated based on the stored image data… The composite perspective may be updated with image data captured during the particular time period to accurately reflect a current state of the physical environment… In some embodiments, enabling the non-synchronous display includes enabling a viewer to toggle between viewing only a representation of the physical environment, viewing only an extended reality environment including the at least two extended reality objects, or viewing a representation of both the physical environment and the extended reality environment… Once the view is selected, the viewer may use view rotator element 3112 to navigate around in the selected view (i.e., rotate the selected view to achieve a desired perspective of the selected view or a desired viewing angle within the selected view).” The image data being captured during the particular time period and stored in a storage correlates to the subset of the updates. The stored image data being retrieved to generate the composite perspective after a user selects a composite view button correlates to providing, to the application responsive to determining that the application has entered the third mode, the subset of the updates).

With regard to Claims 14 and 19, the machine and manufacture of Claims 14 and 19 perform the same steps as the method of Claim 1, and Claims 14 and 19 are therefore rejected using the same rationale set forth above in the rejection of Claim 1.

With regard to Claim 2, Kahan teaches the method of Claim 1 above.
Kahan further teaches: providing, for display while the application is in the first mode, a user interface of the application (Paragraph 426, “By using user interface element 3102, the viewer may select a desired view of the extended reality environment and/or the physical environment… Selecting composite view button 3110 may display a combined physical environment with extended reality objects, such as shown in the lower portion of FIG. 31… Once the view is selected, the viewer may use view rotator element 3112 to navigate around in the selected view (i.e., rotate the selected view to achieve a desired perspective of the selected view or a desired viewing angle within the selected view). For example, as shown in FIG. 31, display 3100 includes a composite view (e.g., selected by using composite view button 3110) including virtual representation of the wearer 2902 holding first extended reality object 2608 and second extended reality object 2610 positioned on cabinet 2602.” The user selecting a desired view such as a composite view and being able to navigate around the selected view correlates to providing, for display while the application is in the first mode, a user interface of the application) ; and providing, for display while the application is in the third mode, the user interface with at least one modification that is based on the subset of the updates to the physical environment information (Paragraphs 396-398 and 426 “As discussed above, the particular time period refers to a period of time during which the wearer is looking at the first extended reality object… The composite perspective may be updated with image data captured during the particular time period to accurately reflect a current state of the physical environment… For example, prior to the particular time period, objects in the physical environment may be in a first location and in the particular time period, objects in the physical environment may have been added, moved, or removed from the 
prior point in time… During the particular time period (i.e., a later point in time than that shown in FIG. 26), table 2604 is moved to a different location in physical environment 2600 (e.g., next to cabinet 2602). The composite perspective may then be updated to reflect the current location of table 2604... In some embodiments, enabling the non-synchronous display includes enabling a viewer to toggle between viewing only a representation of the physical environment, viewing only an extended reality environment including the at least two extended reality objects, or viewing a representation of both the physical environment and the extended reality environment… Once the view is selected, the viewer may use view rotator element 3112 to navigate around in the selected view (i.e., rotate the selected view to achieve a desired perspective of the selected view or a desired viewing angle within the selected view).” The particular time period where a user is looking at the first extended reality object may occur during a view which only shows extended reality objects. In the example where a physical object is moved during the particular time period, the user would not be updated on the new location until they switch to a mode where the physical objects are visible again such as a composite view. The composite view then being selected and updated to reflect the new location of the moved table, where the user is able to navigate around the selected view and the table was in a different position than in any of the previous modes, correlates to providing, for display while the application is in the third mode, the user interface with at least one modification that is based on the subset of the updates to the physical environment information) . 
With regards to Claims 15 and 20, the method of Claim 2 performs the same steps as the machine and manufacture of Claims 15 and 20 respectively, and Claims 15 and 20 are therefore rejected using the same rationale set forth above in the rejection of Claim 2. With regards to Claim 3, Kahan teaches the method of Claim 1 above. Kahan further teaches: wherein the third mode is the same as the first mode (Paragraph 424, “In some embodiments, enabling the non-synchronous display includes enabling a viewer to toggle between viewing only a representation of the physical environment, viewing only an extended reality environment including the at least two extended reality objects, or viewing a representation of both the physical environment and the extended reality environment. In some embodiments, the viewer may toggle (i.e., select) the view by using a user interface element.” The user toggling between different views by using a user interface element, such as from a composite mode to an extended reality environment back to a composite mode, correlates to the third mode being the same as the first mode) . With regards to Claim 8, Kahan teaches the method of Claim 1 above. Kahan further teaches: wherein the subset of the updates to the physical environment information that occur while the application is in the second mode comprises: a first subset of the updates corresponding to a first portion of the physical environment information (Paragraphs 396-397, “In some embodiments, the composite perspective of the physical environment is based on image data captured prior to the particular time period. As discussed above, the particular time period refers to a period of time during which the wearer is looking at the first extended reality object. For example, the composite perspective of the physical environment may be based on image data that was captured when the wearer activated the wearable extended reality appliance. 
As another example, the image data may have been captured at an earlier point in time (i.e., prior to the particular time period) and stored in a storage, such as a memory or other storage in the wearable extended reality appliance, a memory or other storage in a device (e.g., input unit 202) associated with the wearable extended reality appliance, or a remote storage (e.g., a cloud-based storage) … For example, the image data may include time information about when the image data was captured (e.g., a timestamp or other time indicator). If the particular time period is more than a threshold amount of time (e.g., one hour, one day, one week, or other amount of time) later than the timestamp, the composite perspective may be updated with image data captured during the particular time period (i.e., the period of time during which the wearer is looking at the first extended reality object). The composite perspective may be updated with image data captured during the particular time period to accurately reflect a current state of the physical environment.” The particular time period where a user is looking at a first extended reality object can occur while a user is in an extended reality only view and therefore correlates to the application being in a second mode. The image data which is reflective of a physical environment being captured at various time periods such as during the particular time period and stored in a storage correlates to storing updates of physical environment information that occur while the application is in the second mode. The image data being captured at time stamps that can differ from the particular time period by more than a threshold amount of time means that the image data is not captured and saved continuously. Each image data snapshot, represented by the time stamp at which it was captured, comprises at least one instance of image data and correlates to a first subset of updates.
Therefore, the image data being stored at non-continuous time stamps correlates to a first subset of the updates corresponding to a first portion of the physical environment information) ; and a second subset of the updates corresponding to a second portion of the physical environment information (Paragraphs 396-398, “In some embodiments, the composite perspective of the physical environment is based on image data captured prior to the particular time period. As discussed above, the particular time period refers to a period of time during which the wearer is looking at the first extended reality object. For example, the composite perspective of the physical environment may be based on image data that was captured when the wearer activated the wearable extended reality appliance. As another example, the image data may have been captured at an earlier point in time (i.e., prior to the particular time period) and stored in a storage, such as a memory or other storage in the wearable extended reality appliance, a memory or other storage in a device (e.g., input unit 202) associated with the wearable extended reality appliance, or a remote storage (e.g., a cloud-based storage) … For example, the image data may include time information about when the image data was captured (e.g., a timestamp or other time indicator). If the particular time period is more than a threshold amount of time (e.g., one hour, one day, one week, or other amount of time) later than the timestamp, the composite perspective may be updated with image data captured during the particular time period (i.e., the period of time during which the wearer is looking at the first extended reality object). The composite perspective may be updated with image data captured during the particular time period to accurately reflect a current state of the physical environment... During the particular time period (i.e., a later point in time than that shown in FIG. 
26), table 2604 is moved to a different location in physical environment 2600 (e.g., next to cabinet 2602).” The particular time period where a user is looking at a first extended reality object can occur while a user is in an extended reality only view and therefore correlates to the application being in a second mode. The image data which is reflective of a physical environment being captured at various time periods such as during the particular time period and stored in a storage correlates to storing updates of physical environment information that occur while the application is in the second mode. The image data being captured at time stamps that can differ from the particular time period by more than a threshold amount of time means that the image data is not captured and saved continuously. In a scenario where two instances of image data are captured during the particular time period, such as a table being moved to a different location, the second instance of image data, which has a different time stamp than the first instance and may account for the new table location, correlates to a second subset of updates. Therefore, the image data being stored at non-continuous time stamps correlates to a second subset of the updates corresponding to a second portion of the physical environment information). With regards to Claim 9, Kahan teaches the method of Claim 8 above.
Kahan further teaches: wherein the updates to the physical environment information include updates corresponding to one or more of: a change in a location of a physical object in the physical environment (Paragraphs 397-398, “In some embodiments, the operations further include updating the composite perspective of the physical environment based on image data captured during the particular time period… For example, prior to the particular time period, objects in the physical environment may be in a first location and in the particular time period, objects in the physical environment may have been added, moved, or removed from the prior point in time. For example, assume that FIG. 26 represents a prior point in time (i.e., before the particular time period). At the prior point in time, wearer 2606 is facing a wall of physical environment 2600 and table 2604 appears to the wearer’s left. Depending on the wearer’s point of focus, table 2604 may be within the wearer’s field of view. During the particular time period (i.e., a later point in time than that shown in FIG. 26), table 2604 is moved to a different location in physical environment 2600 (e.g., next to cabinet 2602). The composite perspective may then be updated to reflect the current location of table 2604.” The objects in the physical environment being added, moved, or removed between a time prior to the particular time period and the particular time period, such as a table being moved to a different location, correlates to a change in a location of a physical object in the physical environment.
The image data being used to update a composite perspective of the physical environment correlates to the updates to the physical environment information including updates corresponding to a change in a location of a physical object in the physical environment), a notification of removal of a physical object from the physical environment, a notification of an addition of a new physical object to the physical environment, or an update to a mesh generated by the electronic device to represent the physical environment. With regards to Claim 10, Kahan teaches the method of Claim 9 above. Kahan further teaches: wherein the first portion of the physical environment information comprises the location of the physical object in the physical environment (Paragraphs 397-398, “In some embodiments, the operations further include updating the composite perspective of the physical environment based on image data captured during the particular time period… For example, prior to the particular time period, objects in the physical environment may be in a first location and in the particular time period, objects in the physical environment may have been added, moved, or removed from the prior point in time. For example, assume that FIG. 26 represents a prior point in time (i.e., before the particular time period). At the prior point in time, wearer 2606 is facing a wall of physical environment 2600 and table 2604 appears to the wearer’s left. Depending on the wearer’s point of focus, table 2604 may be within the wearer’s field of view. During the particular time period (i.e., a later point in time than that shown in FIG. 26), table 2604 is moved to a different location in physical environment 2600 (e.g., next to cabinet 2602).
The composite perspective may then be updated to reflect the current location of table 2604.” The image data being used to update a composite perspective of the physical environment, where the image data may reflect a table being moved to a different location in the physical environment, correlates to the first portion of the physical environment information comprising the location of the physical object in the physical environment) , and wherein the second portion of the physical environment information comprises the mesh generated by the electronic device to represent the physical environment (Paragraph 67-68, 101, and 315, “In some examples, the image data may be read from memory, may be received from an external device, may be generated (for example, using a generative model), and so forth… In another embodiment, the digital signals may include a representation of virtual content, for example, by encoding objects in a three-dimensional array of voxels, in a polygon mesh, or in any other format in which virtual content may be presented… Consistent with the present disclosure, the image data may include pixel data streams, digital images, digital video s
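The timestamp-threshold refresh logic quoted from Kahan's paragraphs 396-397 (update the composite perspective when the stored image data predates the viewing period by more than a threshold such as one hour, one day, or one week) reduces to a simple staleness check. The function name and default threshold below are hypothetical illustrations, not taken from the reference.

```python
from datetime import datetime, timedelta

def needs_refresh(stored_timestamp: datetime,
                  viewing_time: datetime,
                  threshold: timedelta = timedelta(hours=1)) -> bool:
    """Return True when the stored image data predates the viewing
    period by more than the threshold, meaning the composite
    perspective should be rebuilt from freshly captured image data."""
    return viewing_time - stored_timestamp > threshold
```

Under this reading, the stored snapshot's timestamp plays the role of the "time indicator," and the viewing period corresponds to the "particular time period during which the wearer is looking at the first extended reality object."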

Prosecution Timeline

Oct 31, 2023
Application Filed
Mar 18, 2025
Response after Non-Final Action
Mar 27, 2026
Non-Final Rejection — §101, §102 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12585485
Warm migrations for virtual machines in a cloud computing environment
2y 5m to grant Granted Mar 24, 2026
Patent 12563114
CONTENT INITIALIZATION METHOD, ELECTRONIC DEVICE AND STORAGE MEDIUM
2y 5m to grant Granted Feb 24, 2026
Study what changed to get past this examiner. Based on 2 most recent grants.

Prosecution Projections

1-2
Expected OA Rounds
67%
Grant Probability
99%
With Interview (+100.0%)
3y 3m
Median Time to Grant
Low
PTA Risk
Based on 3 resolved cases by this examiner. Grant probability derived from career allow rate.
