Prosecution Insights
Last updated: April 19, 2026
Application No. 19/185,103

AUGMENT GLASS SYSTEM TO DYNAMICALLY ADJUST AN AUGMENTED REALITY OBJECT IN VIEW OF A USER

Non-Final OA: §102, §103, §112
Filed: Apr 21, 2025
Examiner: LUBIT, RYAN A
Art Unit: 2626
Tech Center: 2600 — Communications
Assignee: Luminary LLC dba Luminary Design Co.
OA Round: 1 (Non-Final)
Grant Probability: 63% (Moderate)
Expected OA Rounds: 1-2
Time to Grant: 2y 4m
Grant Probability With Interview: 99%

Examiner Intelligence

Career Allow Rate: 63% (476 granted / 756 resolved; +1.0% vs TC avg)
Interview Lift: +38.6% in resolved cases with interview (strong)
Avg Prosecution: 2y 4m typical timeline; 18 applications currently pending
Total Applications: 774 across all art units
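The headline figures above reduce to simple ratios. A minimal sketch of how they are likely derived (the rounding convention is an assumption):

```python
# Sketch of how the examiner-intelligence figures are likely derived.
# The granted/resolved counts come from the panel above; rounding to the
# nearest whole percent is an assumption.
granted, resolved = 476, 756

allow_rate = granted / resolved * 100   # career allow rate, in percent
print(round(allow_rate))                # 63

# "+1.0% vs TC avg" implies a Tech Center average of roughly 62%
tc_avg = allow_rate - 1.0
print(round(tc_avg))                    # 62
```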

Statute-Specific Performance

§101: 4.3% (-35.7% vs TC avg)
§103: 45.3% (+5.3% vs TC avg)
§102: 19.9% (-20.1% vs TC avg)
§112: 23.1% (-16.9% vs TC avg)
Based on career data from 756 resolved cases; Tech Center averages shown for comparison.
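Each statute line pairs the examiner's rate with a delta versus the Tech Center average. Assuming delta = examiner rate minus TC average (an interpretation, not documented by the report), the four lines all imply the same TC baseline, which is a quick consistency check on the data:

```python
# Statute-specific rates and their reported deltas vs the Tech Center
# average. Assuming delta = examiner_rate - tc_average, every line
# implies the same baseline of 40.0%.
stats = {
    "§101": (4.3, -35.7),
    "§103": (45.3, +5.3),
    "§102": (19.9, -20.1),
    "§112": (23.1, -16.9),
}
for statute, (rate, delta) in stats.items():
    implied_tc_avg = round(rate - delta, 1)
    print(statute, implied_tc_avg)
```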

Office Action

Rejections: §102, §103, §112
DETAILED ACTION

Status of the Application

1. Claims 1 – 20 are pending and are under examination in this action.

2. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Rejections - 35 USC § 112(b)

3. The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

4. Claims 4 – 5 and 11 – 12 are rejected under 35 U.S.C. 112(b) as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor regards as the invention.

Regarding claims 4 – 5 and 11 – 12, the recitations of “determining by calculating” are indefinite for at least the following reasons. Each of these claims fails to specifically state what is being determined by calculating. These claims are incomplete, as the “determining” step just occurs without any disclosure as to what is being “determined”. Since it is unclear what is being determined by “calculating the distance”, no prior art can be applied to these claims.

Regarding claims 4 – 5 and 11 – 12, the recitations of “the distance” in each of these claims lack proper antecedent basis. Each of these recitations should be amended to recite “a distance”.

Regarding claim 11, the recitations that the “user coordinates: determine” something via calculating, and “determine an average user point coordinate”, make no sense. It is unclear how coordinates, by themselves, determine other information. User coordinates can be utilized to make other determinations via a processing unit, but cannot make determinations in and of themselves.

Regarding claim 12, the recitations that the “object coordinates: determine” something via calculating, and “determine an average object point coordinate”, make no sense.
It is unclear how coordinates, by themselves, determine other information. Object coordinates can be utilized to make other determinations via a processing unit, but cannot make determinations in and of themselves.

Claim Rejections - 35 USC § 102

5. The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

6. Claims 1 – 3 and 7 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Sobolev (U.S. Pub. 2022/0075477).

Regarding claim 1, Sobolev teaches: a method to dynamically adjust a digital content in view of a user (Abstract; paragraph [0120]; displayed content is provided in-perspective for the user viewing the graphics such that when the user changes their angle relative to a display, the content is still viewable. Additionally, a user’s perspective is dynamically determined for detecting which object a user may be interacting with depending on their particular perspective. Regardless of the disclosure of the prior art, this particular recitation is non-limiting as it fails to breathe life and meaning into the body of the claim as there is no tie-in to the recitation “dynamically adjust”), wherein the method comprising: capturing sensor data of the user via a user sensor and an object via an object sensor (FIGS. 5, 18B; paragraphs [0041], [0100]; cameras 506 / 1812 may include a user-facing camera [sensor] which captures images of user 504 and an object-facing camera [sensor] that captures images of object(s) of interest 500), processing the sensor data obtained from the user sensor and the object sensor (FIGS. 5, 7, 18B; paragraph [0045]; in step 706, a user’s viewing position is determined based on sensor data obtained from the user-facing camera [sensor]. In step 712, the sensor data obtained from the user-facing camera [sensor] and the object-facing camera [sensor] is processed to correlate the particular object that aligns with the user’s position and direction of sight), determining user coordinates and object coordinates via one or more processing units (FIGS. 7, 13; paragraphs [0045], [0074], [0084]; determining the correlation between a particular object and the user’s position and direction of sight includes determining the user position and the object [target] position in a coordinate space, and therefore necessarily determines user and object coordinates. The particular operations disclosed therein are processed by processing resources, such as a processor, of computing system 1310), and displaying the digital content on an active transparent display (FIG. 18B; paragraph [0098]; transparent display 1806 displays, for example, swatches 1810 as digital content that corresponds to real-world content of a watch 1808).

Regarding claim 2, Sobolev teaches: wherein the user sensor and the object sensor include an infrared camera or a face mapping module (paragraph [0083]; the cameras may include infrared sensors).

Regarding claim 3, Sobolev teaches: wherein the one or more processing units compile the received sensor data to identify user details and object details (FIG. 7; paragraphs [0042], [0045], [0074]; as set forth above with regard to claim 1, the processor performs the disclosed operations. In step 702, an image of a user is captured. This sensor data is “compiled” in steps 704 and 706 to identify a particular user and viewing position [user details]. In step 708, an image of an object [target] is captured. This sensor data is “compiled” in step 710 to identify a particular object / target [object details].
This captured sensor data is further compiled in step 712 by correlating the user’s position and direction of sight with an identified object. This correlation identifies user details and object details, specifically, that they are correlated and a vector passes through both the user and the object / target).

Regarding claim 7, Sobolev teaches: wherein the active transparent display comprises: a plurality of diodes for projecting the digital content based on user coordinates (FIG. 18B; paragraphs [0043], [0098]; transparent display 1806 may display content using LED lighting), and a touch interface for the user (FIG. 18B; paragraphs [0098], [0100]; storefront 1804, in which transparent display 1806 is installed, may include a user interactive touch interface).

Claim Rejections - 35 USC § 103

7. The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

8. Claims 6, 8 – 10, 13 – 14, and 17 – 18 are rejected under 35 U.S.C. 103 as being unpatentable over Sobolev, as similarly applied to claim 1.
Regarding claim 6, Sobolev teaches: wherein the one or more processing units further include: calculating an angle of tilt or pitch angle at least based on a height of the user coordinates (paragraph [0120]; as set forth above with regard to claim 1, displayed content is provided in-perspective for the user viewing the graphics such that when the user changes their angle relative to a display, the content is still viewable. This accounts for users of different heights and positions to dynamically adjust for different user perspectives), orienting the digital content at least based on the real-time user coordinates (FIG. 18B; paragraphs [0080], [0120]; objects [digital content] are displayed based on a user’s detected perspective by tracking the user and objects in real-time. This allows the content to be viewable by a user as the user changes their angle relative to the display. Based on this disclosure, it is implied that the swatches 1810 would be rotated to orient in the direction of a user as the user moves relative to the transparent display 1806), and displaying the digital content on the active transparent display (FIG. 18B; paragraph [0098]; transparent display 1806 displays, for example, swatches 1810 as digital content that corresponds to real-world content of a watch 1808).

Sobolev fails to explicitly disclose: the angle is based on a height of a hardware platform. However, Sobolev discloses that the disclosed system can be installed in storefronts, display cases, museum exhibits, etc. (FIGS. 18A, 19, 20A; paragraphs [0016], [0097], [0105], [0107]). Each of these exemplified installations has a different configuration. With regard to the display case embodiment of FIG. 19, the display case in and of itself, in which the transparent display 1806 and watch 1808 are provided, is interpreted as a “hardware platform”. It is well-known and conventional for stores to have display cases having different heights.
Accordingly, the particular angle of tilt between an object and a user is based on the height of the user, as set forth above, and the height of the “hardware platform” as well as the location of the user-facing camera. These relative heights and placement are inherent components of the angles formed between the user-facing camera, the object-facing camera, the object, and the user that are required to determine a vector that passes through both a user and an object (paragraph [0042]) to facilitate the disclosed in-perspective display as well as touch sensing. Therefore, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to calculate an angle between a user and an object based on the inherent relationship between the user’s height and the location of the cameras, which includes a relative height of a “hardware platform”, as set forth above.

Regarding claim 8, Sobolev teaches: an augment glass system to dynamically adjust a digital content in view of a user (Abstract; paragraphs [0120], [0132]; an augmented reality system including a transparent touch panel displays content in-perspective for the user viewing the graphics such that when the user changes their angle relative to a display, the content is still viewable. Additionally, a user’s perspective is dynamically determined for detecting which object a user may be interacting with depending on their particular perspective. Regardless of the disclosure of the prior art, this particular recitation is non-limiting as it fails to breathe life and meaning into the body of the claim as there is no tie-in to the recitation “dynamically adjust”), wherein the augment glass system comprises: an active transparent display (FIG. 18B; paragraph [0100]; transparent display 1806); a sensor housing (FIGS. 5, 18B; paragraphs [0041], [0100]; cameras 506 / 1812), wherein the sensor housing includes: a user sensor to capture sensor data of the user (FIGS. 5, 18B; paragraphs [0041], [0100]; cameras 506 / 1812 include a user-facing camera [sensor] which captures images of user 504); and an object sensor to capture sensor data of an object (FIGS. 5, 18B; paragraphs [0041], [0100]; cameras 506 / 1812 include an object-facing camera [sensor] that captures images of object(s) of interest 500); one or more processing units (FIG. 13; paragraph [0074]; computing system 1310 includes processing resources, such as a processor, that are used to perform disclosed operations) configured for: processing sensor data obtained from the user sensor and the object sensor, determining user coordinates and object coordinates, via a computer vision engine (FIGS. 5, 7, 18B; paragraphs [0045], [0047], [0080]; in step 706, a user’s viewing position is determined, using computer vision related software libraries, based on sensor data obtained from the user-facing camera [sensor]. In step 712, the sensor data obtained from the user-facing camera [sensor] and the object-facing camera [sensor] is processed to correlate the particular object that aligns with the user’s position and direction of sight. Location of objects [targets] may be determined using computer vision techniques as well), and displaying the digital content on the active transparent display via a frontend visualization application (FIG. 18B; paragraph [0098]; transparent display 1806 displays, for example, swatches 1810 as digital content that corresponds to real-world content of a watch 1808. It is implicit that there is an application that is responsible for displaying content via transparent display 1806, and this “application” is therefore interpreted as a “frontend visualization application”).

Sobolev fails to explicitly disclose: the sensor housing is a custom sensor housing. However, Sobolev discloses that the disclosed system can be installed in storefronts, display cases, museum exhibits, etc. (FIGS. 18A, 19, 20A; paragraphs [0016], [0097], [0105], [0107]).
Each of these exemplified installations has a different configuration. It would have been obvious to a person of ordinary skill in the art before the effective filing date of Applicant’s claimed invention for the particular camera arrangement for each particular use of Sobolev to be customized to account for different installation environments. Sobolev implies as much by disclosing in-field installation and calibration (paragraphs [0044], [0060]).

Regarding claim 9, Sobolev teaches: wherein the custom sensor housing is a modular attachment mounted on the active transparent display (FIGS. 18A, 19, 20A; paragraph [0127]; the disclosed system may be modular, allowing for different numbers of touch panels and cameras. Accordingly, it is implied that the cameras themselves may be modular and attachably mounted to transparent displays 1806).

Regarding claim 10, Sobolev teaches: wherein the user sensor and the object sensor include a camera to capture data of the user and the object (FIGS. 5, 18B; paragraphs [0041], [0100]; cameras 506 / 1812 include a user-facing camera [sensor] which captures images of user 504 and an object-facing camera [sensor] that captures images of object(s) of interest 500).

Regarding claim 13, Sobolev teaches: wherein the one or more processing units: calculate an angle of tilt or pitch angle at least based on a height of the user coordinates (paragraph [0120]; as set forth above with regard to claim 1, displayed content is provided in-perspective for the user viewing the graphics such that when the user changes their angle relative to a display, the content is still viewable. This accounts for users of different heights and positions to dynamically adjust for different user perspectives), orient the digital content at least based on the real-time user coordinates (FIG. 18B; paragraphs [0080], [0120]; objects [digital content] are displayed based on a user’s detected perspective by tracking the user and objects in real-time. This allows the content to be viewable by a user as the user changes their angle relative to the display. Based on this disclosure, it is implied that the swatches 1810 would be rotated to orient in the direction of a user as the user moves relative to the transparent display 1806), and display the digital content on the active transparent display (FIG. 18B; paragraph [0098]; transparent display 1806 displays, for example, swatches 1810 as digital content that corresponds to real-world content of a watch 1808).

Sobolev fails to explicitly disclose: the angle is based on a height of a hardware platform. However, Sobolev discloses that the disclosed system can be installed in storefronts, display cases, museum exhibits, etc. (FIGS. 18A, 19, 20A; paragraphs [0016], [0097], [0105], [0107]). Each of these exemplified installations has a different configuration. With regard to the display case embodiment of FIG. 19, the display case in and of itself, in which the transparent display 1806 and watch 1808 are provided, is interpreted as a “hardware platform”. It is well-known and conventional for stores to have display cases having different heights. Accordingly, the particular angle of tilt between an object and a user is based on the height of the user, as set forth above, and the height of the “hardware platform” as well as the location of the user-facing camera. These relative heights and placement are inherent components of the angles formed between the user-facing camera, the object-facing camera, the object, and the user that are required to determine a vector that passes through both a user and an object (paragraph [0042]) to facilitate the disclosed in-perspective display as well as touch sensing.
Therefore, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to calculate an angle between a user and an object based on the inherent relationship between the user’s height and the location of the cameras, which includes a relative height of a “hardware platform”, as set forth above.

Regarding claim 14, Sobolev teaches: wherein the active transparent display comprises: a plurality of diodes for projecting the digital content on the active transparent display based on user coordinates (FIG. 18B; paragraphs [0043], [0098]; transparent display 1806 may display content using LED lighting), and a touch interface for the user (FIG. 18B; paragraphs [0098], [0100]; storefront 1804, in which transparent display 1806 is installed, may include a user interactive touch interface).

Regarding claim 17, Sobolev teaches: the one or more processing units configured to: process the sensor data of the user obtained from the user sensor to determine real-time user coordinates using a computer vision engine (FIGS. 5, 7, 18B; paragraphs [0045], [0047], [0080]; in step 706, a user’s viewing position is determined in real-time, using computer vision related software libraries, based on sensor data obtained from the user-facing camera [sensor]); process the sensor data of the object obtained from the object sensor to determine real-time object coordinates using the computer vision engine (FIGS. 5, 7, 18B; paragraphs [0045], [0047], [0080]; in step 712, the sensor data obtained from the user-facing camera [sensor] and the object-facing camera [sensor] is processed to correlate the particular object that aligns with the user’s position and direction of sight. Location [coordinates] of objects [targets] may be determined using computer vision techniques as well, which is implicitly required in order to correlate the object to a user’s position and direction of sight); calculate an angle of tilt or pitch angle based at least on a height of the user coordinates (paragraph [0120]; as set forth above with regard to claim 1, displayed content is provided in-perspective for the user viewing the graphics such that when the user changes their angle relative to a display, the content is still viewable. This accounts for users of different heights and positions to dynamically adjust for different user perspectives); and dynamically adjust and orient the digital content displayed on the active transparent display based on real-time user coordinates in response to changes in the user coordinates (FIG. 18B; paragraphs [0080], [0120]; objects [digital content] are displayed based on a user’s detected perspective by tracking the user and objects in real-time. This allows the content to be viewable by a user as the user changes their angle relative to the display. Based on this disclosure, it is implied that the swatches 1810 would be rotated to orient in the direction of a user as the user moves relative to the transparent display 1806 based on the real-time user coordinates set forth above).

Sobolev fails to explicitly disclose: a hardware platform; and the active transparent display operatively coupled to the hardware platform. However, Sobolev discloses that the disclosed system can be installed in storefronts, display cases, museum exhibits, etc. (FIGS. 18A, 19, 20A; paragraphs [0016], [0097], [0105], [0107]). Each of these exemplified installations has a different configuration. With regard to the display case embodiment of FIG. 19, the display case in and of itself, in which the transparent display 1806 and watch 1808 are provided, is interpreted as a “hardware platform”.
It is well-known and conventional for stores to have display cases having different heights. Accordingly, the particular angle of tilt between an object and a user is based on the height of the user, as set forth above, and the height of the “hardware platform” as well as the location of the user-facing camera. These relative heights and placement are inherent components of the angles formed between the user-facing camera, the object-facing camera, the object, and the user that are required to determine a vector that passes through both a user and an object (paragraph [0042]) to facilitate the disclosed in-perspective display as well as touch sensing. Therefore, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to calculate an angle between a user and an object based on the inherent relationship between the user’s height and the location of the cameras, which includes a relative height of a “hardware platform”, as set forth above.

Additionally, it would have been obvious to try incorporating the hardware components of Sobolev, such as the remote computing system 1310 and the processing resources thereof, within the “hardware platform”. There are a limited number of options for locating the computing system 1310 of Sobolev when implemented in a display case embodiment: within the “hardware platform”, wired to the “hardware platform”, or wirelessly communicating with components within the “hardware platform”. Accordingly, it would have been obvious to try locating the remote computing system 1310 within the “hardware platform” set forth above.

Regarding claim 18, Sobolev teaches: wherein the hardware platform further comprises one or more active transparent displays operatively coupled to the hardware platform (FIG. 19; paragraphs [0100], [0105]; transparent display 1806 is installed in the “hardware platform” of the display case embodiment), and wherein the one or more processing units are configured to display the digital content selected from the group consisting of: (i) displaying the digital content on the one or more active transparent displays (FIG. 19; paragraphs [0098], [0105]; swatches 1810 may be displayed as digital content on the transparent display 1806); and (ii) displaying a plurality of digital content independently on the plurality of active transparent displays (this recitation is recited in the alternative and is therefore not a required element in this claim).

9. Claims 15 – 16 and 19 – 20 are rejected under 35 U.S.C. 103 as being unpatentable over Sobolev, as applied to claims 8 and 17 above, as evidenced by Sinharoy et al. (U.S. Pub. 2025/0022146).

Regarding claim 15, Sobolev fails to explicitly disclose: further comprises a processing circuitry includes one or more processing units communicating via a communication device to a cloud network. However, it was well-known and conventional in the art before the effective filing date of Applicant’s claimed invention to combine local and cloud computing for processing information. For evidence, please see paragraph [0066] of Sinharoy. Accordingly, it would have been obvious to a person of ordinary skill in the art before the effective filing date of Applicant’s claimed invention to include cloud computing, and wireless transmission thereto, as part of the processing resources of Sobolev. Such a modification to Sobolev requires nothing more than a simple substitution of a well-known and conventional processing technique for another.

Regarding claim 16, Sobolev fails to explicitly disclose: wherein the cloud network includes a server for processing the rendering of the digital content to display on the active transparent display.
However, it was well-known and conventional in the art before the effective filing date of Applicant’s claimed invention to combine local and cloud computing for processing information. For evidence, please see paragraph [0066] of Sinharoy. Such an implementation would include a cloud-based server for performing the corresponding processing functions. Accordingly, it would have been obvious to a person of ordinary skill in the art before the effective filing date of Applicant’s claimed invention to include cloud computing, and wireless transmission thereto, as part of the processing resources of Sobolev. Such a modification to Sobolev requires nothing more than a simple substitution of a well-known and conventional processing technique for another to implement the function of displaying content.

Regarding claim 19, Sobolev fails to explicitly disclose: wherein the active transparent display further comprises a processing circuitry including: one or more processing units configured to communicate via a communication device to a cloud network; wherein the cloud network includes a server for processing the rendering of the digital content to display on the active transparent display. However, it was well-known and conventional in the art before the effective filing date of Applicant’s claimed invention to combine local and cloud computing for processing information. For evidence, please see paragraph [0066] of Sinharoy. Such an implementation would include a cloud-based server for performing the corresponding processing functions. Accordingly, it would have been obvious to a person of ordinary skill in the art before the effective filing date of Applicant’s claimed invention to include cloud computing, and wireless transmission thereto, as part of the processing resources of Sobolev.
Such a modification to Sobolev requires nothing more than a simple substitution of a well-known and conventional processing technique for another to implement the function of displaying content.

Regarding claim 20, Sobolev teaches: further comprising: a plurality of augment glass systems for dynamically adjusting the digital content (paragraph [0118]; the disclosed technology may be used in retail storefronts, display cases, etc. It is well-known and conventional for retail stores to have multiple fronts [i.e., sections of glass], multiple display cases, etc. Additionally, multiple stores may incorporate such technology, which therefore includes multiple “augment glass systems”), each augment glass system including: a processing circuitry including one or more processing units (FIG. 13; paragraph [0074]; each “augment glass system” would include a computing system 1310 that includes processing resources, such as a processor, that are used to perform disclosed operations).

Sobolev fails to explicitly disclose: wherein the processing circuitry of each augment glass system is configured to communicate with at least one other processing circuitry of the plurality of augment glass systems via a communication device to a cloud network; wherein the cloud network includes a server for processing the rendering of the digital content to display on the active transparent display of each augment glass system. However, it was well-known and conventional in the art before the effective filing date of Applicant’s claimed invention to combine local and cloud computing for processing information. For evidence, please see paragraph [0066] of Sinharoy. Such an implementation would include a cloud-based server for performing the corresponding processing functions.
Accordingly, it would have been obvious to a person of ordinary skill in the art before the effective filing date of Applicant’s claimed invention to include cloud computing, and wireless transmission thereto, as part of the processing resources of Sobolev. Such a modification to Sobolev requires nothing more than a simple substitution of a well-known and conventional processing technique for another to implement the function of displaying content.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to RYAN A LUBIT whose telephone number is (571)270-3389. The examiner can normally be reached M - F, ~6am - 3pm.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Temesghen Ghebretinsae, can be reached at 571-272-3017. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/RYAN A LUBIT/
Primary Examiner, Art Unit 2626
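The geometry at the heart of the §102/§103 analysis is a sight vector from the user's eye through the transparent display to the object, with a tilt/pitch angle that depends on the user's height relative to the platform. The following sketch illustrates that setup; all coordinates and heights are illustrative assumptions, not values from Sobolev or the application:

```python
import math

# Illustrative sketch of the claimed geometry (all values are assumed,
# not taken from Sobolev or the application): a user's eye in front of a
# transparent display at z = 0, an object behind it, and the pitch angle
# of the sight line between them.
user_eye = (0.0, 1.70, 2.0)    # x, eye height (m), distance in front of display
obj = (0.1, 1.20, -0.3)        # object sits behind the display plane (z < 0)
platform_height = 1.0          # assumed display-case ("hardware platform") height

# Sight vector from the user's eye to the object.
sight = tuple(o - u for o, u in zip(obj, user_eye))

# Pitch angle: vertical drop over horizontal distance along the sight line.
horizontal = math.hypot(sight[0], sight[2])
pitch_deg = math.degrees(math.atan2(-sight[1], horizontal))

# Intersection of the sight line with the display plane z = 0, i.e. where
# digital content would be anchored so it overlays the object for this user.
t = (0.0 - user_eye[2]) / sight[2]
anchor = tuple(u + t * s for u, s in zip(user_eye, sight))
anchor_above_platform = anchor[1] - platform_height

print(f"pitch {pitch_deg:.1f} deg, anchor {anchor}, "
      f"{anchor_above_platform:.2f} m above platform")
```

As the user moves, recomputing the anchor from fresh user coordinates is what "dynamically adjusting" the content amounts to in this setup.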

Prosecution Timeline

Apr 21, 2025
Application Filed
Feb 23, 2026
Non-Final Rejection — §102, §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602119
STYLUS MOVEMENT TRANSLATION SWITCHING
2y 5m to grant; granted Apr 14, 2026
Patent 12578817
DISPLAY PANEL, DRIVING METHOD THEREOF, AND ELECTRONIC TERMINAL
2y 5m to grant; granted Mar 17, 2026
Patent 12566499
EYE CENTER OF ROTATION DETERMINATION WITH ONE OR MORE EYE TRACKING CAMERAS
2y 5m to grant; granted Mar 03, 2026
Patent 12562098
DISPLAY APPARATUS AND DISPLAY PANEL
2y 5m to grant; granted Feb 24, 2026
Patent 12560833
DISPLAY DEVICE
2y 5m to grant; granted Feb 24, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

1-2
Expected OA Rounds
63%
Grant Probability
99%
With Interview (+38.6%)
2y 4m
Median Time to Grant
Low
PTA Risk
Based on 756 resolved cases by this examiner. Grant probability derived from career allow rate.
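The projection figures are consistent with a simple model: start from the career allow rate and add the interview lift in percentage points, capped at 99%. This is a reconstruction of how the displayed numbers fit together, not a documented formula:

```python
# Reconstruction (an assumption, not a documented formula) of how the
# projection panel's numbers relate to the examiner's career statistics.
granted, resolved = 476, 756
base = round(granted / resolved * 100)           # 63% grant probability

interview_lift = 38.6                            # percentage points
with_interview = min(base + interview_lift, 99)  # displayed as 99%
print(base, with_interview)
```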
