Prosecution Insights
Last updated: April 19, 2026
Application No. 18/802,659

METHOD, APPARATUS, DEVICE, AND STORAGE MEDIUM FOR ENVIRONMENT CALIBRATION

Non-Final OA: §102, §103, §112
Filed: Aug 13, 2024
Examiner: BEUTEL, WILLIAM A
Art Unit: 2616
Tech Center: 2600 — Communications
Assignee: BEIJING ZITIAO NETWORK TECHNOLOGY CO., LTD.
OA Round: 1 (Non-Final)

Grant Probability: 70% (Favorable)
Expected OA Rounds: 1-2
Time to Grant: 2y 7m
With Interview: 90%

Examiner Intelligence

Career Allow Rate: 70%, above average (328 granted / 469 resolved; +7.9% vs TC avg)
Interview Lift: +20.4% for resolved cases with interview (strong)
Typical Timeline: 2y 7m average prosecution; 28 applications currently pending
Career History: 497 total applications across all art units
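The headline figures above are ratios over the examiner's resolved cases. A short Python sketch of how the dashboard numbers appear to be derived (variable names are hypothetical, and the interview-lift definition is an assumption inferred from the +20.4% figure and the 90% with-interview projection below):

# How the examiner-intelligence figures above appear to be computed.
# Counts come from the dashboard; variable names are hypothetical.
granted = 328
resolved = 469

career_allow_rate = granted / resolved   # 0.699... -> displayed as 70%
tc_average = career_allow_rate - 0.079   # "+7.9% vs TC avg" implies a ~62% TC baseline

# Assumed reading of "interview lift": allow rate among resolved cases
# with an interview, minus the career baseline. The 90% "with interview"
# projection is consistent with career rate + 20.4 points.
interview_lift = 0.204
with_interview_rate = career_allow_rate + interview_lift   # ~0.90

print(f"allow rate: {career_allow_rate:.1%}, with interview: {with_interview_rate:.1%}")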

Statute-Specific Performance

§101: 9.9% (-30.1% vs TC avg)
§103: 49.8% (+9.8% vs TC avg)
§102: 10.7% (-29.3% vs TC avg)
§112: 22.0% (-18.0% vs TC avg)
Tech Center averages are estimates; figures based on career data from 469 resolved cases.
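Every row's delta implies the same ~40% Tech Center baseline, which suggests a single TC-wide estimate rather than per-statute averages. A sketch reproducing the deltas (rates transcribed from the rows above; the computation itself is an assumption):

# Statute-specific rates for this examiner vs. the Tech Center baseline.
# Rates are transcribed from the dashboard; the delta computation is assumed.
examiner_rates = {"§101": 0.099, "§103": 0.498, "§102": 0.107, "§112": 0.220}
tc_average_estimate = 0.40  # implied by every row's "vs TC avg" delta

for statute, rate in examiner_rates.items():
    delta = rate - tc_average_estimate
    print(f"{statute}: {rate:.1%} ({delta:+.1%} vs TC avg)")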

Office Action

Rejections under §102, §103, and §112
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

Allowable Subject Matter

Claims 3 and 19 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims, and correcting the minor typographical errors as indicated below. (Examiner further notes that claim 11 contains substantially the same subject matter, but is subject to an additional rejection under 35 U.S.C. 112(b) below.)

Claim Objections

Claims 2-7, 10-15 and 18-20 are objected to because of the following informalities:

Claim 2 recites “wherein the determining, in the three-dimensional computer-generated environment, calibration composition data of a target physical object in a computation coordinate system comprises” in lines 1-3, which does not follow through with all the proper antecedent terms (although still understood as referencing the limitation from the parent claim). Examiner suggests amending to recite, “wherein the determining, in the three-dimensional computer-generated environment, the calibration data of [[a]]the target physical object in [[a]]the computation coordinate system comprises”. Claims 3-5 are objected to for incorporating the same language by reference.

Claim 3 recites “wherein the determining a feature point of the target physical object based on an environmental depth map corresponding to the three-dimensional computer-generated environment comprises” in lines 1-3, which does not follow through with all the proper antecedent terms (although still understood as referencing the limitation from the parent claim). Examiner suggests amending the claim to recite, “wherein the determining [[a]]the feature point of the target physical object based on [[an]]the environmental depth map corresponding to the three-dimensional computer-generated environment comprises”.

Claim 4 recites “wherein the determining the calibration composition data of the target physical object in the computation coordinate system, based on three-dimensional pose information of the feature point in the computation coordinate system comprises” in lines 1-3, which does not follow through with all the proper antecedent terms (although still understood as referencing the limitation from the parent claim). Examiner suggests amending the claim to recite, “wherein the determining the calibration composition data of the target physical object in the computation coordinate system, based on the three-dimensional pose information of the feature point in the computation coordinate system comprises”. Claim 5 is objected to for incorporating the same language by reference from claim 4.
Claim 6 recites “wherein the determining a calibration model of the target physical object in a rendering coordinate system used for calibration, based on the calibration composition data and a coordinate offset between the computation coordinate system and the rendering coordinate system comprises” in lines 1-4, which does not follow through with all the proper antecedent terms (although still understood as referencing the limitation from the parent claim). Examiner suggests amending to recite, “wherein the determining [[a]]the calibration model of the target physical object in [[a]]the rendering coordinate system used for calibration, based on the calibration composition data and [[a]]the coordinate offset between the computation coordinate system and the rendering coordinate system comprises”. Claim 7 is objected to for incorporating the same language by reference from claim 6.

Regarding claims 10-15, the claims contain substantially the same objected-to language as claims 2-7 discussed above. Accordingly, claims 10-15 are objected to for the same reasons as claims 2-7, and the suggested language for amendment is substantially the same.

Regarding claims 18-20, the claims contain substantially the same objected-to language as claims 2-4 discussed above. Accordingly, claims 18-20 are objected to for the same reasons as claims 2-4, and the suggested language for amendment is substantially the same.

Appropriate correction is required.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claims 9-16 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or, for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention.

Regarding claim 9, the claim recites “at an electronic device” in line 6 of the claim. The claim is already directed to “An electronic device” in line 1, and introducing an additional “an electronic device” renders the claim indefinite, as it is unclear whether the claim intends to recite two separate electronic devices or whether the second instance is intended to refer back to the initial claimed device.

Claims 10-16 depend from claim 9 and therefore incorporate the indefinite language as recited in claim 9, discussed above. Furthermore, claims 10-16 all begin with “The electronic device of claim …” As a result of introducing two separate instances of “an electronic device” in claim 9, claims 10-16 are indefinite as to which device is referred back to (i.e., the claimed “An electronic device” in line 1 of claim 9 or the second instance of “an electronic device” in claim 9). Accordingly, the claims are rendered indefinite.

Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.

Claims 1, 2, 4, 6, 9, 10, 12, 14, 17, 18 and 20 are rejected under 35 U.S.C. 102(a)(2) as being anticipated by Ahmed et al. (US 2025/0086830 A1).

Regarding claim 9, Ahmed discloses:

An electronic device, (Ahmed, Figs. 2A-2B and ¶49: headset) comprising:

A processor, and a memory configured to store a computer program, wherein the processor is configured to invoke and run the computer program stored in the memory to implement operations (Ahmed, Figs. 2A and 2B and ¶50: processor 208; also ¶56: processor 268 and memory unit 270, storage device 271; ¶59: processor 208 configured to load instructions from storage device 211 into memory 210 for execution, with a similar process performed for processor 268, to perform disclosed methods) comprising:

At an electronic device configured to communicate with a display generation component and one or more input devices: (Ahmed, Fig. 2A and ¶52: augmented reality glasses; ¶53: augmented reality glasses having display device 255a/b for displaying augmented reality media content to user; ¶55: eye tracking and motion sensors for monitoring movement of augmented reality glasses; also ¶56: input/output device 272)

Displaying, via the display generation component, a three-dimensional computer-generated environment (Ahmed, Figs. 1A and 1B and ¶46: user wears headset to view a virtual image of internal partitions defined in Building Information Model (BIM) aligned with part-constructed portions of building; note background ¶2 disclosing use for 3D model BIM);

Determining, in the three-dimensional computer-generated environment, calibration composition data of a target physical object in a computation coordinate system (Ahmed, ¶61: using camera to image 2D markers at construction site; ¶63: control markers 330 located at corners of 2D markers with known distance, having fixed spatial relationship, where 2D markers are affixed to construction site structure; ¶82: the pose of the headset is determined by the headset positioning system within the coordinate system used by the headset positioning system; ¶84: pose of camera 520 modelled as separate camera-centric coordinate system 522, defining dimensions of an image captured by camera; Fig. 5A and ¶85: centre-point 532 is located within a positioning coordinate system 534 used by a positioning system for the headset 530, where the origin of the positioning coordinate system 534 may be defined with reference to an arbitrary starting point for navigation (e.g., a location where the user first switched on the headset or acquired an image of the 2D marker); ¶87: headset captures image of 2D marker and image 570 is then supplied to a camera pose determination function that estimates a pose of the camera with respect to the 2D marker 540, i.e. determines a position and orientation of the camera with respect to the 2D marker); and

Determining a calibration model of the target physical object in a rendering coordinate system used for calibration (Ahmed, ¶70: the locations of the set of control markers surrounding the 2D marker and/or the 2D marker itself may be obtained in a coordinate system that is also used to define the BIM, such as a global coordinate system or a site-specific coordinate system, with location of 2D marker accurately defined in 3D with respect to Building Information Model (BIM); Fig. 5A and ¶83 describes BIM 510 with reference to a coordinate system for the BIM 512), based on the calibration composition data and a coordinate offset between the computation coordinate system and the rendering coordinate system. (Ahmed, ¶60: initialize or configure a transformation between a coordinate system used by at least one positioning system and a coordinate system used by the Building Information Model (BIM); ¶86: a user wearing the headset 530 and viewing an augmented reality display such as augmented reality glasses 250 needs to view the BIM 512 aligned to the current pose of the headset 530, i.e. positions and orientations within the BIM coordinate system 512 need to be mapped to the positioning coordinate system 534, using point-to-point transformation, defining the relative rotation and translation between the origins of the coordinate systems for the BIM and the positioning system; ¶88 further disclosing transformation between coordinate systems)

Regarding claim 1, the claimed method is the same as the implemented operations of claim 9, and as such claim 1 is rejected based on the same rationale as claim 9 set forth above.

Regarding claim 17, Ahmed discloses:

A non-transitory computer-readable storage medium storing a computer program, wherein the computer program causes a computer to perform operations (Ahmed, Figs. 2A and 2B and ¶50: processor 208; also ¶56: processor 268 and memory unit 270, storage device 271; ¶59: processor 208 configured to load instructions from storage device 211 into memory 210 for execution, with a similar process performed for processor 268, to perform disclosed methods; also note claim 20 of Ahmed directed to non-transitory computer-readable storage medium storing instructions for execution)

Further regarding claim 17, the operations implement the method of claim 1, and as such claim 17 is further rejected based on the same rationale as claim 1 set forth above.
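The point-to-point mapping the examiner cites from Ahmed ¶86, in which positions in the BIM coordinate system are mapped into the positioning coordinate system by a relative rotation and translation, is an ordinary rigid-body transform. A minimal Python sketch of that kind of transform (the matrix, offset, and function name are invented for illustration and are not taken from Ahmed):

import numpy as np

# Hypothetical rigid transform from the BIM coordinate system into the
# headset positioning coordinate system: a relative rotation about the
# vertical axis plus a translation between the two origins.
theta = np.radians(30.0)                      # assumed relative yaw
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])
t = np.array([12.5, -3.0, 0.8])               # assumed origin offset (metres)

def bim_to_positioning(p_bim: np.ndarray) -> np.ndarray:
    """Map a 3D point from BIM coordinates to positioning coordinates."""
    return R @ p_bim + t

# A BIM-defined partition corner expressed in the headset's coordinates:
print(bim_to_positioning(np.array([4.0, 7.0, 1.2])))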
Regarding claim 10, Ahmed further discloses:

Wherein the determining, in the three-dimensional computer-generated environment, calibration composition data of a target physical object in a computation coordinate system comprises: determining a feature point of the target physical object based on an environmental depth map corresponding to the three-dimensional computer-generated environment (Ahmed, ¶48: headwear includes camera devices for simultaneous localization and mapping (SLAM) navigation; ¶44 discloses the type of SLAM systems that can be used, including depth stereo camera SLAM; ¶67: 2D marker detected using camera of electronic device, an augmented reality head mounted display (AR HMD), and following detection, a virtual representation of the detected 2D marker is determined with respect to the electronic device within a 3D space, determining the plane or polygon forming the 2D marker in a 3D space defined with respect to the camera (or the electronic device), and a virtual representation may comprise determining 3D coordinates within the 3D space of at least three points on the 2D marker (e.g., the 3D coordinate of each corner of the 2D marker)); and

Determining the calibration composition data of the target physical object in the computation coordinate system, based on three-dimensional pose information of the feature point in the computation coordinate system (Ahmed, ¶67: given the virtual representation, the measured location data for the plurality of locations may be correlated with a plurality of points defined with respect to the virtual representation of the 2D marker to determine a mapping between the three-dimensional space and the construction site, comprising determining the transformation between the BIM space (i.e., BIM coordinate system) and one or more of the camera space (i.e., camera coordinate system) and a positioning system space (i.e., positioning system coordinate system))

Regarding claim 2, the claimed method is the same as the implemented operations of claim 10, and as such claim 2 is rejected based on the same rationale as claim 10 set forth above.

Regarding claim 18, the limitations included from claim 17 are rejected based on the same rationale as claim 17 set forth above. Further regarding claim 18, the operations further implement the method of claim 2, and as such claim 18 is further rejected based on the same rationale as claim 2 set forth above.

Regarding claim 12, Ahmed further discloses:

Wherein the determining the calibration composition data of the target physical object in the computation coordinate system, based on three-dimensional pose information of the feature point in the computation coordinate system comprises: determining a calibration representation of the target physical object based on the three-dimensional pose information of the feature point in the computation coordinate system (Ahmed, ¶44: feature based SLAM systems for marker based positioning system; ¶48: headwear with sensors using active markers and camera devices for SLAM navigation; ¶67: determining a virtual representation may comprise determining 3D coordinates within the 3D space of at least three points on the 2D marker (e.g., the 3D coordinate of each corner of the 2D marker)); and

Determining, based on the calibration representation, at least one spatial anchor point of the target physical object and boundary information associated with the spatial anchor point, to constitute the calibration composition data of the target physical object in the computation coordinate system. (Ahmed, ¶60: 2D markers are mounted in place on structure – see Figs. 3A-3F and ¶63, where the marker has boundary 334; ¶67 discloses determining a virtual representation of the 2D marker with respect to the electronic device (HMD) within a 3D space by determining a plane or polygon forming the 2D marker in 3D space with respect to the camera or electronic device, where the virtual representation comprises determining the 3D coordinates within the 3D space of the at least three points on the 2D marker – i.e. the polygon boundary information associated with the points used as 3D positional data of the 2D marker determined from the identified 2D marker point, which can be corners of the 2D marker, and given the virtual representation, the measured location data for the plurality of locations may be correlated with a plurality of points defined with respect to the virtual representation of the 2D marker to determine a mapping between the three-dimensional space and the construction site – note any feature point used as a corner for determining the polygon is an anchor point for the marker, determined from 3D positional data of identified points in the image)

Regarding claim 4, the claimed method is the same as the implemented operations of claim 12, and as such claim 4 is rejected based on the same rationale as claim 12 set forth above.

Regarding claim 20, the limitations included from claim 18 are rejected based on the same rationale as claim 18 set forth above. Further regarding claim 20, the operations further implement the method of claim 4, and as such claim 20 is further rejected based on the same rationale as claim 4 set forth above.

Regarding claim 14, Ahmed further discloses:

Wherein the determining a calibration model of the target physical object in a rendering coordinate system used for calibration, based on the calibration composition data and a coordinate offset between the computation coordinate system and the rendering coordinate system comprises: performing a coordinate transformation on the calibration composition data by using the coordinate offset between the computation coordinate system and the rendering coordinate system, to obtain a calibration pose of the target physical object in the rendering coordinate system; and determining, based on the calibration pose, the calibration model of the target physical object in the rendering coordinate system (Ahmed, ¶60: initialize or configure a transformation between a coordinate system used by at least one positioning system and a coordinate system used by the Building Information Model (BIM); ¶86: a user wearing the headset 530 and viewing an augmented reality display such as augmented reality glasses 250 needs to view the BIM 512 aligned to the current pose of the headset 530, i.e. positions and orientations within the BIM coordinate system 512 need to be mapped to the positioning coordinate system 534, using point-to-point transformation, defining the relative rotation and translation between the origins of the coordinate systems for the BIM and the positioning system; ¶88 further disclosing transformation between coordinate systems for accurate placement of the 2D marker within the system coordinates)

Regarding claim 6, the claimed method is the same as the implemented operations of claim 14, and as such claim 6 is rejected based on the same rationale as claim 14 set forth above.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 5 and 13 are rejected under 35 U.S.C. 103 as being unpatentable over Ahmed et al. (US 2025/0086830 A1) in view of Wong (US 2025/0111522 A1).

Regarding claim 13, the limitations included from claim 12 are rejected based on the same rationale as claim 12 set forth above. Further regarding claim 13, Ahmed further discloses:

Wherein the calibration representation of the target physical object comprises (Ahmed, ¶60: 2D markers are mounted in place on structure – see Figs. 3A-3F and ¶63, where the marker has boundary 334; ¶67 discloses determining a virtual representation of the 2D marker with respect to the electronic device (HMD) within a 3D space by determining a plane or polygon forming the 2D marker in 3D space with respect to the camera or electronic device, where the virtual representation comprises determining the 3D coordinates within the 3D space of the at least three points on the 2D marker)

Furthermore, Ahmed discloses use of positioning with stereo cameras (Ahmed, ¶35) and a system using a stereoscopic display HMD (Ahmed, ¶53: attached to, or incorporated in, each of the eye regions 253a, 253b is a respective transparent or semi-transparent display device 255a, 255b for displaying augmented reality media content to a user). The only limitation not explicitly taught is that the calibration representation includes a three-dimensional stereoscopic graph as well as a 2D planar graph. Examiner notes that the applicant's specification discloses the 3D graph as a box and the 2D graph as a plane, and that a 3D graph would also include the 2D plane as one side.

Wong discloses:

Wherein the calibration representation of the target physical object comprises a three-dimensional stereoscopic graph and a two-dimensional planar graph for enclosing the target physical object (Wong, Figs. 3-4 and ¶33: the virtual images corresponding to the tracking apparatus 2 are respectively displayed at the positions of the first virtual calibration coordinate VCC1 and the second virtual calibration coordinate VCC2, where the hand HD of the user using the head-mounted device HMD holds the tracking apparatus 2 and prepares to move the tracking apparatus 2 to the positions of the first virtual calibration coordinate VCC1 and the second virtual calibration coordinate VCC2; ¶35: as shown in FIG. 4, in the image window WIN2 of the head-mounted device HMD, the processor 13 determines whether the tracking apparatus 2 has moved to the first virtual calibration coordinate VCC1 (e.g., the tracking apparatus 2 is overlapped with the virtual image); ¶47 discloses three-dimensional space calibration based on VCC1)

Both Ahmed and Wong are directed to calibration of alignment of virtual and physical data for use in augmented reality. It would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention and with a reasonable expectation of success, to modify the system and technique for coordinating positional coordinate data to align virtual objects for proper placement in physical environments for augmented reality as provided by Ahmed, by using a 3D virtual box as an alignment object as provided by Wong, using known electronic interfacing and programming techniques. The modification merely substitutes one known alignment element for calibrating the alignment of coordinate data for proper visualization of virtual elements combined with physical elements in augmented reality for another, yielding predictable results of utilizing a three-dimensional alignment object. Moreover, the modification results in improved augmented reality device alignment by accounting for an additional visual alignment indicator, allowing the user to more easily ensure the augmented reality system is properly calibrated and allowing for an additional axes check, providing more data for performing calibration and potentially ensuring an even better fit between virtual and world coordinate space.

Regarding claim 5, the claimed method is the same as the implemented operations of claim 13, and as such claim 5 is rejected based on the same rationale as claim 13 set forth above.
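The examiner's note that a 3D box includes a 2D plane as one of its sides is easy to make concrete. A hypothetical Python sketch (geometry and names invented for illustration; neither reference discloses this code):

import numpy as np

# Hypothetical axis-aligned bounding box enclosing a target object,
# given as (min corner, max corner) in the computation coordinate system.
box_min = np.array([0.0, 0.0, 0.0])
box_max = np.array([2.0, 1.0, 0.5])

def box_corners(lo, hi):
    """All 8 corners of the box (the 3D 'stereoscopic graph')."""
    return np.array([[x, y, z] for x in (lo[0], hi[0])
                               for y in (lo[1], hi[1])
                               for z in (lo[2], hi[2])])

def bottom_face(lo, hi):
    """One face of the box (z = z_min): a 2D planar graph it contains."""
    return np.array([[lo[0], lo[1], lo[2]],
                     [hi[0], lo[1], lo[2]],
                     [hi[0], hi[1], lo[2]],
                     [lo[0], hi[1], lo[2]]])

print(box_corners(box_min, box_max).shape)  # (8, 3)
print(bottom_face(box_min, box_max))        # 4 coplanar corners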
Claims 7-8 and 15-16 are rejected under 35 U.S.C. 103 as being unpatentable over Ahmed et al. (US 2025/0086830 A1) in view of Gibby et al. (US 2020/0186786 A1).

Regarding claim 15, the limitations included from claim 14 are rejected based on the same rationale as claim 14 set forth above. Further regarding claim 15, Gibby discloses:

Wherein the determining, based on the calibration pose, the calibration model of the target physical object in the rendering coordinate system comprises: presenting, based on the calibration pose, a preview model of the target physical object in the three-dimensional computer-generated environment (Gibby, ¶15: a user can view an optical code (e.g. a real view of an optical code) through an AR headset, where the optical code may be a 2D bar code, a QR code, a linear bar code, an AprilTag, or another optical code that is visible to cameras or sensors of the AR headset, and the user aligns a virtual image with the optical code; ¶23: alignment marker 108 projected onto holographic lenses of AR headset to enable alignment marker 108 to be aligned with optical code 106, where the AR headset projects alignment marker 108 in a position using a default interpupillary distance with which the AR headset is programmed, but due to individual eye location variations of the user, the alignment marker may not actually align with optical code 106 as viewed by the user through the AR headset; ¶34: this technology enables calibration of a user's individual eye positions using a printed optical code that the AR headset detects and then computes the 3D position of the optical code, with a virtual representation of the code displayed in proximity to the physical optical code);

In response to calibration adjustment on the preview model, updating the calibration pose (Gibby, ¶15: user moves or drags graphical marker until aligned with optical code as viewed from eye of user; ¶17: delta values or changes for the eye position settings for modifying where the virtual images or objects are projected for the individual user can be stored as a user setting or a user profile for each individual user in a user preferences database on the AR headset); and

In response to calibration confirmation on the preview model, presenting, based on the updated calibration pose, the calibration model of the target physical object in the three-dimensional computer-generated environment. (Gibby, ¶15: user moves or drags graphical marker until aligned with optical code as viewed from eye of user; Fig. 3 and ¶¶32-33: projected graphical marker 308 which user selects and drags until aligned with optical code 306; ¶37 discloses printed calibration page; note that the use of the code is for calibration of the 3D computer-generated environment, as discussed in ¶40, which discloses the anchoring of the image data to a position in real-world space, within a 3D image data set)

Both Ahmed and Gibby are directed to calibration of alignment of virtual and physical data for use in augmented reality. It would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention and with a reasonable expectation of success, to modify the system and technique for coordinating positional coordinate data to align virtual objects for proper placement in physical environments for augmented reality as provided by Ahmed, by providing additional adjustments by a user to better align the data for more accurate presentation of augmented reality as provided by Gibby, using known electronic interfacing and programming techniques. The modification results in an improved augmented reality experience by allowing proper visualization of data to a specific user's vision (see e.g. ¶3 and ¶18 of Gibby explaining the need and improvement).

Regarding claim 7, the claimed method is the same as the implemented operations of claim 15, and as such claim 7 is rejected based on the same rationale as claim 15 set forth above.
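As the examiner maps Gibby onto the adjust/confirm limitations, the flow is: render a preview, fold user drag deltas into the pose, and persist the result on confirmation. A hedged Python sketch of that pattern (all class and field names are hypothetical, not Gibby's API):

from dataclasses import dataclass, field

@dataclass
class CalibrationPose:
    """Pose of the preview model in the rendering coordinate system."""
    x: float = 0.0
    y: float = 0.0
    z: float = 0.0

@dataclass
class CalibrationSession:
    pose: CalibrationPose = field(default_factory=CalibrationPose)
    user_profile: dict = field(default_factory=dict)

    def adjust(self, dx: float, dy: float, dz: float = 0.0) -> None:
        # "Calibration adjustment": the user drags the preview marker and
        # the deltas update the calibration pose (cf. Gibby ¶15, ¶17).
        self.pose.x += dx
        self.pose.y += dy
        self.pose.z += dz

    def confirm(self, user_id: str) -> CalibrationPose:
        # "Calibration confirmation": persist the deltas as user settings
        # and return the pose used to present the calibration model.
        self.user_profile[user_id] = (self.pose.x, self.pose.y, self.pose.z)
        return self.pose

session = CalibrationSession()
session.adjust(dx=0.004, dy=-0.002)   # user drags until aligned
print(session.confirm("user-1"))      # pose used to render the model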
Regarding claim 16, the limitations included from claim 9 are rejected based on the same rationale as claim 9 set forth above. Further regarding claim 16, Gibby discloses:

Determining a plurality of pre-configured application coordinate systems, and recording a coordinate offset between the computation coordinate system and each of the application coordinate systems, to determine the coordinate offset between the computation coordinate system and the rendering coordinate system (Gibby, ¶27: the eye adjustments may be transformed into an interpupillary distance (IPD) between the right eye and left eye of a user by computing and then modifying a stored interpupillary distance containing the right eye adjustments and left eye adjustments. The updated interpupillary distance (IPD) may then be applied as the eye adjustment in the AR headset. The virtual objects may be projected onto holographic lenses, waveguides, diffraction gratings, or similar optical materials of the AR headset using the revised interpupillary distance (IPD). The revised interpupillary distance (IPD) for a distance between the eyes of the user may be stored in a user profile for each individual user of the AR headset. Alternatively, the right eye adjustments and the left eye adjustments may be stored in a user profile as a specific eye position for each eye (e.g., X and Y delta values); ¶28: the right eye adjustments and left eye adjustments may be created by referencing an adjustment to a position of the wireframe in two axes for both the right eye and left eye of the user – i.e. two coordinate systems, one for each eye, with adjustments performed for each eye to obtain the proper offset of virtual object data for proper display based on the individual user's pupils)

Both Ahmed and Gibby are directed to calibration of alignment of virtual and physical data for use in augmented reality. It would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention and with a reasonable expectation of success, to modify the system and technique for coordinating positional coordinate data to align virtual objects for proper placement in physical environments for augmented reality as provided by Ahmed, by providing additional adjustments by a user to better align the data for more accurate presentation of augmented reality using adjustments for both eyes of a user as provided by Gibby, using known electronic interfacing and programming techniques. The modification results in an improved augmented reality experience by allowing proper visualization of data to a specific user's vision (see e.g. ¶3 and ¶18 of Gibby explaining the need and improvement).

Regarding claim 8, the claimed method is the same as the implemented operations of claim 16, and as such claim 8 is rejected based on the same rationale as claim 16 set forth above.
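The per-eye bookkeeping the examiner reads out of Gibby ¶¶27-28 reduces to simple arithmetic: a stored IPD revised by each eye's horizontal adjustment. A hypothetical Python sketch under that reading (function name, sign convention, and values are all assumptions):

# Hypothetical revision of a stored interpupillary distance (IPD) from
# per-eye alignment adjustments, in the spirit of Gibby ¶¶27-28. Each
# eye's adjustment is its own 2-axis offset; only the horizontal (X)
# components change the distance between the two projection origins.
def revise_ipd(stored_ipd_mm: float,
               right_eye_dx_mm: float,
               left_eye_dx_mm: float) -> float:
    """Return the revised IPD after applying each eye's X adjustment."""
    # Assumed convention: +X is rightward, so moving the right eye
    # rightward (+) or the left eye leftward (-) widens the distance.
    return stored_ipd_mm + right_eye_dx_mm - left_eye_dx_mm

# Default headset IPD of 63 mm, right eye shifted +0.8 mm, left -0.5 mm:
print(revise_ipd(63.0, right_eye_dx_mm=0.8, left_eye_dx_mm=-0.5))  # 64.3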
Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to WILLIAM A BEUTEL, whose telephone number is (571) 272-3132. The examiner can normally be reached Monday-Friday, 9:00 AM - 5:00 PM (EST).

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, DANIEL HAJNIK, can be reached at 571-272-7642. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/WILLIAM A BEUTEL/
Primary Examiner, Art Unit 2616

Prosecution Timeline

Aug 13, 2024: Application Filed
Mar 26, 2026: Non-Final Rejection under §102, §103, and §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12581262: AUGMENTED REALITY INTERACTION METHOD AND ELECTRONIC DEVICE (granted Mar 17, 2026; 2y 5m to grant)
Patent 12572258: APPARATUS AND METHOD WITH IMAGE PROCESSING USER INTERFACE (granted Mar 10, 2026; 2y 5m to grant)
Patent 12566531: CONFIGURING A 3D MODEL WITHIN A VIRTUAL CONFERENCING SYSTEM (granted Mar 03, 2026; 2y 5m to grant)
Patent 12561927: MEDIA RESOURCE DISPLAY METHOD AND APPARATUS, DEVICE, AND STORAGE MEDIUM (granted Feb 24, 2026; 2y 5m to grant)
Patent 12554384: SYSTEMS AND METHODS FOR IMPROVED CONTENT EDITING AT A COMPUTING DEVICE (granted Feb 17, 2026; 2y 5m to grant)

Study what changed to get past this examiner; based on the 5 most recent grants.

Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 70%
With Interview: 90% (+20.4%)
Median Time to Grant: 2y 7m
PTA Risk: Low

Based on 469 resolved cases by this examiner; grant probability is derived from the career allow rate.
