Prosecution Insights
Last updated: April 19, 2026
Application No. 18/863,529

SMART GLASSES AND METHOD FOR PROJECTING A PROJECTION IMAGE

Final Rejection §103
Filed: Nov 06, 2024
Examiner: ROSARIO, NELSON M
Art Unit: 2624
Tech Center: 2600 — Communications
Assignee: Robert Bosch GmbH
OA Round: 2 (Final)

Grant Probability: 86% (Favorable)
OA Rounds: 3-4
To Grant: 2y 0m
With Interview: 92%

Examiner Intelligence

Career Allow Rate: 86%, above average (704 granted / 818 resolved; +24.1% vs TC avg)
Interview Lift: +5.8% among resolved cases with interview (moderate, roughly +6%)
Avg Prosecution: 2y 0m, fast prosecutor (27 currently pending)
Career History: 845 total applications across all art units

Statute-Specific Performance

§101: 4.5% (-35.5% vs TC avg)
§103: 70.9% (+30.9% vs TC avg)
§102: 2.3% (-37.7% vs TC avg)
§112: 8.1% (-31.9% vs TC avg)

Black line = Tech Center average estimate • Based on career data from 818 resolved cases
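As a quick consistency check of the dashboard figures above, the headline numbers can be reproduced directly from the reported counts (variable names here are mine, not from any USPTO data feed):

```python
# Sanity-check of the dashboard arithmetic above (illustrative only).
granted, resolved = 704, 818

allow_rate = 100 * granted / resolved           # career allow rate, percent
print(f"Career allow rate: {allow_rate:.1f}%")  # ~86.1%, shown as 86%

delta_vs_tc = 24.1                              # percentage points above TC average
implied_tc_avg = allow_rate - delta_vs_tc
print(f"Implied TC 2600 average: {implied_tc_avg:.1f}%")  # ~62.0%

with_interview, without = 92, 86                # grant probability, percent
print(f"Interview lift: {with_interview - without} points")
```

The displayed 86% is the rounded career rate; the separately reported +5.8% interview lift is measured over resolved cases with an interview, which is why it differs slightly from the 6-point gap between the two grant-probability figures.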

Office Action

§103
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

DETAILED ACTION

Response to Amendment

Applicant's response to the last Office Action, filed on November 6, 2025, has been entered and made of record. Claims 14-25 are currently pending in this application. This action is final.

Response to Arguments

Applicant's arguments against the claim rejections have been fully considered, but they are not deemed persuasive. Applicant argues that Greenberg in view of Du does not explicitly disclose a "to-be-projected image," as it pertains to claims 14, 24 and 25. Greenberg in view of Du does disclose the to-be-projected image (Du, see paragraphs [0034] through [0039], where Du discloses: S110: a location information obtaining step: obtaining current to-be-displayed location information corresponding to a coordination device. S120: a display information obtaining step: obtaining current display information corresponding to the coordination device. S130: a content generating step: generating, according to the display information, current virtual display content corresponding to the coordination device. S140: a projection step: projecting the virtual display content to a location corresponding to the to-be-displayed location information at a fundus of a user. In the method in the embodiment of the present application and the following corresponding apparatus embodiments, the to-be-displayed location information is information related to a to-be-displayed location. The to-be-displayed location is a location at which the virtual display content that the user sees is presented. In the method in the embodiment of the present application, the virtual display content corresponding to the coordination device is projected to the fundus of the user according to the to-be-displayed location, to cause that the virtual display content that the user sees is imaged at the to-be-displayed location, and the user does not need to switch back and forth his or her location of gaze point between a display device and the coordination device substantially when the user interacts with the coordination device by using the virtual display content displayed at the to-be-displayed location, which more conforms to a use habit of the user and improves the user experience). [image omitted]

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102 of this title, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 14-25 are rejected under 35 U.S.C. 103 as being unpatentable over Greenberg et al. (IDS submitted prior art US 20200150428 A1) in view of Du (US 20160259406 A1).
As to Claim 14: Greenberg et al. discloses smart glasses (Greenberg, see figure 5 and Abstract, where Greenberg discloses systems and methods for direct projection of images onto an eye retina including, for example, systems and methods for directing a projection/imaging optical path so as to track a location of the eye in accordance with a gaze direction thereof. This enables projecting images onto specific/fixed locations on the eye retina while the gaze direction changes), comprising: an optical system (Greenberg, see light module 114 and image generator 116 in figure 1) configured to project a projection image (Greenberg, see LB in figure 1) onto an imaging region of an eye (Greenberg, see EP and paragraph [0079], where Greenberg discloses the eye projection optics 130 typically includes an angular beam relay module 134, which is adapted to relay the light beam for directing it to be incident onto a pupil EP of a user's eye with appropriate pupil incidence angle αin corresponding to the respective location of the corresponding pixel in the image, to thereby enable focusing of the light beam by the eye-lens EL onto a proper location at the eye retina ER on which the image pixel associated with projection angle αscn should be projected. This facilitates direct projection of the image 12 onto the eye retina ER), wherein the optical system (Greenberg, see image projection system 110 in figure 4 and paragraph [0076]) includes: a light source configured to output an image (Greenberg, see light source module 114 in figure 4 and paragraph [0076], where Greenberg discloses that the image projection system 110 typically includes a light source/module 114 producing an input light beam ILB, and an image generator 116 including intensity and/or spectral modulator 117 (hereinafter intensity modulator 117) and an image scanner 118 located in the optical path of the light beam LB. The intensity modulator 117 is adapted for modulating the intensity of the light beam in accordance with the intensity of the projected pixel(s) of the image 12), a tracking module (Greenberg, see gaze tracking controller 120 in figure 1) configured to acquire a pupil position of a pupil of the eye (Greenberg, see paragraph [0080], where Greenberg discloses that the eye projection system 100 also includes a gaze tracking controller 120, which is configured and operable for adjusting/controlling the operations of the eye projection optics 130 and/or of the image projection system 110 in accordance with a gaze direction β of the eye, so as to direct the projections of images onto the retina ER in accordance with the pupil's location and its line of sight when at different gaze directions), an image forming module configured to form the output image into a projected image (Greenberg, see paragraph [0035], where Greenberg discloses an image generator and an eye projection optical module. The image generator is configured to obtain data indicative of an image, produce a plurality of light beam portions corresponding to pixels of the image, adjust the intensity of each light beam portion in accordance with a value of a respective pixel of the image corresponding thereto, and direct the light beam portion to propagate along a general optical propagation path towards the eye projection optical module), wherein the image forming module is configured to change an image plane of the projected image as a function of the pupil position acquired by the tracking module (Greenberg, see paragraph [0019], where Greenberg discloses that when the gaze direction of the eye changes, the location of the projection of the image on the eye retina changes accordingly. This is because the actual pupil incidence angle αin depends on the gaze direction. For example, marking the gaze direction by β = {βx, βy}, for a given projection angle αscn the pupil incidence angle αin will be as follows: [equation image omitted]. This will result in dependence between the projected location of the pixels on the retina (which depends on pupil incidence angle αin and gaze direction β)), and a reflection module which is configured to project the projected image as the projection image into the imaging region of the eye (Greenberg, see paragraph [0126], where Greenberg discloses that the eye glasses 500 may be configured and operable for projecting pure virtual reality and/or augmented virtual reality to one or both of the user's eyes. In the latter case, the eyeglass lens may include a beam splitter combiner surface BSC adapted for reflecting light from the eye projection system 100 towards the user eye and transmitting external light from a scenery towards the user's eye. For example, in some embodiments light module 114 of system 110 may be configured for generating input light beams including one or more narrow spectral bands (e.g., narrow RGB spectral bands) having substantially narrow spectrum). [images omitted]

Greenberg differs from the claimed subject matter in that Greenberg does not explicitly disclose the to-be-projected image. However, in an analogous art, Du discloses the to-be-projected image (Du, see paragraphs [0034] through [0039], where Du discloses: S110: a location information obtaining step: obtaining current to-be-displayed location information corresponding to a coordination device. S120: a display information obtaining step: obtaining current display information corresponding to the coordination device. S130: a content generating step: generating, according to the display information, current virtual display content corresponding to the coordination device. S140: a projection step: projecting the virtual display content to a location corresponding to the to-be-displayed location information at a fundus of a user. In the method in the embodiment of the present application and the following corresponding apparatus embodiments, the to-be-displayed location information is information related to a to-be-displayed location. The to-be-displayed location is a location at which the virtual display content that the user sees is presented. In the method in the embodiment of the present application, the virtual display content corresponding to the coordination device is projected to the fundus of the user according to the to-be-displayed location, to cause that the virtual display content that the user sees is imaged at the to-be-displayed location, and the user does not need to switch back and forth his or her location of gaze point between a display device and the coordination device substantially when the user interacts with the coordination device by using the virtual display content displayed at the to-be-displayed location, which more conforms to a use habit of the user and improves the user experience). It would have been obvious to one of ordinary skill in the art to modify the invention of Greenberg with Du. One would be motivated to modify Greenberg by disclosing the to-be-projected image as taught by Du, thereby using the strong projection and display functions of a device near an eye; therefore, the devices all function optimally, and the user experience is improved (Du, see paragraph [0018]).
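Greenberg's formula relating the pupil incidence angle αin to the projection angle αscn and gaze direction β survives in this record only as an omitted equation image. Purely as an editorial illustration, and assuming a hypothetical first-order model (not Greenberg's actual formula), the gaze-compensation logic a tracking controller would perform can be sketched as:

```python
# Illustrative sketch only: the linear per-axis model below is an assumption,
# since the actual relation in Greenberg appears only as an omitted image.

def pupil_incidence(alpha_scn, beta):
    """Assumed model: incidence angle = projection angle - gaze angle, per axis (degrees)."""
    return tuple(a - b for a, b in zip(alpha_scn, beta))

def compensate(alpha_target, beta):
    """Gaze-tracking controller: choose the projection angle that keeps the
    incidence angle (hence the retinal landing spot) fixed as beta changes."""
    return tuple(a + b for a, b in zip(alpha_target, beta))

target = (2.0, -1.0)  # desired incidence angle (ax, ay), fixed w.r.t. line of sight
for beta in [(0.0, 0.0), (5.0, 3.0), (-4.0, 2.5)]:
    alpha_scn = compensate(target, beta)
    assert pupil_incidence(alpha_scn, beta) == target
print("incidence angle invariant to gaze direction under the assumed model")
```

This mirrors the behavior Greenberg's paragraph [0090] attributes to the deflector and field selector: the incidence angles remain fixed with respect to the line of sight even as the gaze direction varies.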
As to Claim 15: Greenberg in view of Du discloses the smart glasses according to claim 14, wherein the image forming module is configured to change the image plane of the to-be-projected image by tilting and/or rotating an image forming module element (Du, see paragraphs [0246] and [0247], where Du discloses a first beam splitter 620, having a same function as the first beam splitting unit recorded in the implementation manner in FIG. 5b, disposed with a certain tilt angle at an intersection point of a gaze direction of an eye A and an incident direction of the camera 610, and transmitting light entering the eye A from an observed object and reflecting light from the eye to the camera 610; and a focal length adjustable lens 630, having a same function as the focal length adjustable lens recorded in the implementation manner in FIG. 5b, located between the first beam splitter 620 and the camera 610, and adjusting a focal length value in real time, to cause the camera 610 to shoot, at a certain focal length value, a clearest image at a fundus).

As to Claim 16: Greenberg in view of Du discloses the smart glasses according to claim 14, wherein the image forming module (Greenberg, see paragraph [0035], where Greenberg discloses an image generator and an eye projection optical module. The image generator is configured to obtain data indicative of an image, produce a plurality of light beam portions corresponding to pixels of the image, adjust the intensity of each light beam portion in accordance with a value of a respective pixel of the image corresponding thereto, and direct the light beam portion to propagate along a general optical propagation path towards the eye projection optical module) is configured to change a position of the imaging region of the projection image (Greenberg, see paragraph [0019], where Greenberg discloses that when the gaze direction of the eye changes, the location of the projection of the image on the eye retina changes accordingly. This is because the actual pupil incidence angle αin depends on the gaze direction. For example, marking the gaze direction by β = {βx, βy}, for a given projection angle αscn the pupil incidence angle αin will be as follows: [equation image omitted]. This will result in dependence between the projected location of the pixels on the retina (which depends on pupil incidence angle αin and gaze direction β)).

As to Claim 17: Greenberg in view of Du discloses the smart glasses according to claim 14, wherein the image forming module is configured to change an extent of the imaging region of the projection image (Greenberg, see paragraph [0034], where Greenberg discloses that light beams with width being in the order of 60% of a typical pupil radius (e.g., which is about 1.5 mm) are used to provide the sufficiently large depth of field/focus of the image on the retina. In this connection, according to the invention, due to the large depth of field obtained when using/projecting such narrow light beams, e.g., narrower than the pupil, onto the eye, a need for adjustable focusing and associated optics may be obviated).
As to Claim 18: Greenberg in view of Du discloses the smart glasses according to claim 14, wherein the image forming module includes at least one movable lens element and/or a micromirror element (Du, see paragraph [0071], where Du discloses that the optical device may be a focal length adjustable lens, configured to adjust the focal length of the optical device by adjusting a refractive index and/or a shape of the optical device. Specifically: 1) the focal length is adjusted by adjusting curvature of at least one surface of the focal length adjustable lens, for example, the curvature of the focal length adjustable lens is adjusted by increasing or decreasing a liquid medium in a cavity formed by a two-layer transparent layer; and 2) the focal length is adjusted by changing the refractive index of the focal length adjustable lens, for example, the focal length adjustable lens is filled with a specific liquid crystal medium, an arrangement of the liquid crystal medium is adjusted by adjusting a voltage of an electrode corresponding to the liquid crystal medium, and therefore, the refractive index of the focal length adjustable lens is changed).

As to Claim 19: Greenberg in view of Du discloses the smart glasses according to claim 14, further comprising: a focusing module configured to focus or defocus: (i) the output image or output image beams, and/or (ii) the to-be-projected image (Du, see paragraph [0086], where Du discloses calibrating a fundus image, to obtain at least one reference image corresponding to the image presented at the fundus. Specifically, comparative calculation is performed on the acquired image and the reference image, to obtain the clearest image. Herein, the clearest image may be an obtained image that is the least different from the reference image. In the method in this implementation manner, a difference between a currently obtained image and the reference image may be calculated by using an existing image processing algorithm, for example, by using a classic automatic phase difference focusing algorithm).

As to Claim 20: Greenberg in view of Du discloses the smart glasses according to claim 14, further comprising: a mirror module configured to redirect the to-be-projected image onto the reflection module (Greenberg, see paragraph [0090], where Greenberg discloses an adjustable/addressable optical deflector 132A (e.g., being an addressable gaze tracking mirror) and a field selector optical module 132B which are configured and operable together for controlling the propagation of light beams LB (e.g., LB1 and LB2 in the figures) of different image pixels to intersect with the respective locations of the pupil (LP0 and LP1 in the figures) when it gazes in different directions, and also for adjusting the pupil incidence angles αin of the light beam LB on the pupil (here αin1 and αin2 of beams LB1 and LB2 respectively) with respect to the lines of sight LOS (here LOS0 and LOS1 correspond to two different gaze directions) such that the incidence angles αin remain fixed with respect to the line of sight LOS of the eye and are invariant to changes in the line of sight LOS direction of the eye/pupil).

As to Claim 21: Greenberg in view of Du discloses the smart glasses according to claim 14, wherein the reflection module is configured with at least one holographic optical element (Du, see 930 in figure 9 and paragraph [0274], where Du discloses that FIG. 9 is a schematic diagram of interacting, by an interactive projection display system 910, with a coordination device 920 by using virtual display content 930 presented at a to-be-displayed location according to an embodiment of the present application).
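The focusing scheme Du describes for claim 19 selects the capture that is least different from a calibrated reference image. A minimal sketch of that selection step, assuming a sum-of-absolute-differences metric (Du names no specific metric beyond "an existing image processing algorithm"):

```python
# Illustrative only: pick the clearest fundus capture by comparing candidates
# (taken at different focal-length settings) against a calibrated reference.
# The SAD metric and the data below are assumptions for demonstration.

def difference(img_a, img_b):
    """Sum of absolute pixel differences between two equally sized images."""
    return sum(abs(a - b) for row_a, row_b in zip(img_a, img_b)
                          for a, b in zip(row_a, row_b))

def clearest(candidates, reference):
    """Return (focal_setting, image) whose image best matches the reference."""
    return min(candidates.items(), key=lambda kv: difference(kv[1], reference))

reference = [[10, 20], [30, 40]]        # calibrated reference image
candidates = {                          # focal-length setting -> capture
    1.0: [[0, 0], [0, 0]],              # badly defocused
    1.5: [[11, 19], [31, 39]],          # near-perfect focus
    2.0: [[5, 25], [20, 50]],
}
best_setting, _ = clearest(candidates, reference)
print(best_setting)  # 1.5
```

In a real system the focal length adjustable lens would sweep its setting in real time and this comparison would drive the closed loop toward the clearest fundus image.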
As to Claim 22: Greenberg in view of Du discloses the smart glasses according to claim 14, wherein the optical system is manufactured using MEMS technology (Greenberg, see paragraph [0086], where Greenberg discloses that the scanning/raster-scanning mirror(s)/deflectors may be implemented utilizing any suitable technique, for example electro optical deflectors and/or using mirrors, such as Micro Electro Mechanical System (MEMS) mirrors mechanically coupled to suitable actuators, such as piezo-electrical actuators or other types of actuators, enabling the mirrors to deflect a light beam from light module 114 to perform an image/raster scan of the light beam across a range of projection angles).

As to Claim 23: Greenberg in view of Du discloses the smart glasses according to claim 14, wherein: (i) the smart glasses are configured such that convergence points behind a pupil plane of the user are shifted (Greenberg, see paragraph [0080], where Greenberg discloses that the eye projection system 100 also includes a gaze tracking controller 120, which is configured and operable for adjusting/controlling the operations of the eye projection optics 130 and/or of the image projection system 110 in accordance with a gaze direction β of the eye, so as to direct the projections of images onto the retina ER in accordance with the pupil's location and its line of sight when at different gaze directions), and/or an intersection point of individual beams in the pupil plane of the user is separated from other intersections, and/or (ii) a tunable focus lens is provided or synchronized to defocus a beam at an angle greater than an angle of incidence and/or to increase a lateral beam size in the pupil plane of the user (Greenberg, see paragraph [0019], where Greenberg discloses that when the gaze direction of the eye changes, the location of the projection of the image on the eye retina changes accordingly. This is because the actual pupil incidence angle αin depends on the gaze direction. For example, marking the gaze direction by β = {βx, βy}, for a given projection angle αscn the pupil incidence angle αin will be as follows: [equation image omitted]. This will result in dependence between the projected location of the pixels on the retina (which depends on pupil incidence angle αin and gaze direction β)).

As to Claim 24: Greenberg et al. discloses a method for projecting an image onto an imaging region of an eye (Greenberg, see figure 5 and Abstract, where Greenberg discloses systems and methods for direct projection of images onto an eye retina including, for example, systems and methods for directing a projection/imaging optical path so as to track a location of the eye in accordance with a gaze direction thereof. This enables projecting images onto specific/fixed locations on the eye retina while the gaze direction changes), the method comprising the following steps: outputting the image (Greenberg, see light source module 114 in figure 4 and paragraph [0076], where Greenberg discloses that the image projection system 110 typically includes a light source/module 114 producing an input light beam ILB, and an image generator 116 including intensity and/or spectral modulator 117 (hereinafter intensity modulator 117) and an image scanner 118 located in the optical path of the light beam LB.
The intensity modulator 117 is adapted for modulating the intensity of the light beam in accordance with the intensity of the projected pixel(s) of the image 12); acquiring a pupil position of the eye (Greenberg, see paragraph [0080], where Greenberg discloses that the eye projection system 100 also includes a gaze tracking controller 120, which is configured and operable for adjusting/controlling the operations of the eye projection optics 130 and/or of the image projection system 110 in accordance with a gaze direction β of the eye, so as to direct the projections of images onto the retina ER in accordance with the pupil's location and its line of sight when at different gaze directions); forming the output image into a projected image (Greenberg, see paragraph [0035], where Greenberg discloses an image generator and an eye projection optical module. The image generator is configured to obtain data indicative of an image, produce a plurality of light beam portions corresponding to pixels of the image, adjust the intensity of each light beam portion in accordance with a value of a respective pixel of the image corresponding thereto, and direct the light beam portion to propagate along a general optical propagation path towards the eye projection optical module), wherein an image plane of the projected image is changed as a function of the acquired pupil position (Greenberg, see paragraph [0019], where Greenberg discloses that when the gaze direction of the eye changes, the location of the projection of the image on the eye retina changes accordingly. This is because the actual pupil incidence angle αin depends on the gaze direction. For example, marking the gaze direction by β = {βx, βy}, for a given projection angle αscn the pupil incidence angle αin will be as follows: [equation image omitted]. This will result in dependence between the projected location of the pixels on the retina (which depends on pupil incidence angle αin and gaze direction β)); and projecting the projected image as a projection image into the imaging region of the eye (Greenberg, see paragraph [0126], where Greenberg discloses that the eye glasses 500 may be configured and operable for projecting pure virtual reality and/or augmented virtual reality to one or both of the user's eyes. In the latter case, the eyeglass lens may include a beam splitter combiner surface BSC adapted for reflecting light from the eye projection system 100 towards the user eye and transmitting external light from a scenery towards the user's eye. For example, in some embodiments light module 114 of system 110 may be configured for generating input light beams including one or more narrow spectral bands (e.g., narrow RGB spectral bands) having substantially narrow spectrum). [images omitted]

Greenberg differs from the claimed subject matter in that Greenberg does not explicitly disclose the to-be-projected image. However, in an analogous art, Du discloses the to-be-projected image (Du, see paragraphs [0034] through [0039], where Du discloses: S110: a location information obtaining step: obtaining current to-be-displayed location information corresponding to a coordination device. S120: a display information obtaining step: obtaining current display information corresponding to the coordination device. S130: a content generating step: generating, according to the display information, current virtual display content corresponding to the coordination device.
S140: a projection step: projecting the virtual display content to a location corresponding to the to-be-displayed location information at a fundus of a user. In the method in the embodiment of the present application and the following corresponding apparatus embodiments, the to-be-displayed location information is information related to a to-be-displayed location. The to-be-displayed location is a location at which the virtual display content that the user sees is presented. In the method in the embodiment of the present application, the virtual display content corresponding to the coordination device is projected to the fundus of the user according to the to-be-displayed location, to cause that the virtual display content that the user sees is imaged at the to-be-displayed location, and the user does not need to switch back and forth his or her location of gaze point between a display device and the coordination device substantially when the user interacts with the coordination device by using the virtual display content displayed at the to-be-displayed location, which more conforms to a use habit of the user and improves the user experience). It would have been obvious to one of ordinary skill in the art to modify the invention of Greenberg with Du. One would be motivated to modify Greenberg by disclosing the to-be-projected image as taught by Du, thereby using the strong projection and display functions of a device near an eye; therefore, the devices all function optimally, and the user experience is improved (Du, see paragraph [0018]).

As to Claim 25: Greenberg et al.
discloses a non-transitory machine-readable storage medium on which is stored a computer program for projecting an image onto an imaging region of an eye (Greenberg, see figure 5 and Abstract, where Greenberg discloses systems and methods for direct projection of images onto an eye retina including, for example, systems and methods for directing a projection/imaging optical path so as to track a location of the eye in accordance with a gaze direction thereof. This enables projecting images onto specific/fixed locations on the eye retina while the gaze direction changes), the computer program, when executed by a computer, causing the computer to perform or control the following steps: outputting the image (Greenberg, see light source module 114 in figure 4 and paragraph [0076], where Greenberg discloses that the image projection system 110 typically includes a light source/module 114 producing an input light beam ILB, and an image generator 116 including intensity and/or spectral modulator 117 (hereinafter intensity modulator 117) and an image scanner 118 located in the optical path of the light beam LB. The intensity modulator 117 is adapted for modulating the intensity of the light beam in accordance with the intensity of the projected pixel(s) of the image 12); acquiring a pupil position of the eye (Greenberg, see paragraph [0080], where Greenberg discloses that the eye projection system 100 also includes a gaze tracking controller 120, which is configured and operable for adjusting/controlling the operations of the eye projection optics 130 and/or of the image projection system 110 in accordance with a gaze direction β of the eye, so as to direct the projections of images onto the retina ER in accordance with the pupil's location and its line of sight when at different gaze directions); forming the output image into a projected image (Greenberg, see paragraph [0035], where Greenberg discloses an image generator and an eye projection optical module. The image generator is configured to obtain data indicative of an image, produce a plurality of light beam portions corresponding to pixels of the image, adjust the intensity of each light beam portion in accordance with a value of a respective pixel of the image corresponding thereto, and direct the light beam portion to propagate along a general optical propagation path towards the eye projection optical module), wherein an image plane of the projected image is changed as a function of the acquired pupil position (Greenberg, see paragraph [0019], where Greenberg discloses that when the gaze direction of the eye changes, the location of the projection of the image on the eye retina changes accordingly. This is because the actual pupil incidence angle αin depends on the gaze direction. For example, marking the gaze direction by β = {βx, βy}, for a given projection angle αscn the pupil incidence angle αin will be as follows: [equation image omitted]. This will result in dependence between the projected location of the pixels on the retina (which depends on pupil incidence angle αin and gaze direction β)); and projecting the projected image as a projection image into the imaging region of the eye (Greenberg, see paragraph [0126], where Greenberg discloses that the eye glasses 500 may be configured and operable for projecting pure virtual reality and/or augmented virtual reality to one or both of the user's eyes. In the latter case, the eyeglass lens may include a beam splitter combiner surface BSC adapted for reflecting light from the eye projection system 100 towards the user eye and transmitting external light from a scenery towards the user's eye. For example, in some embodiments light module 114 of system 110 may be configured for generating input light beams including one or more narrow spectral bands (e.g., narrow RGB spectral bands) having substantially narrow spectrum).
PNG media_image3.png 776 1208 media_image3.png Greyscale PNG media_image4.png 812 668 media_image4.png Greyscale PNG media_image5.png 848 774 media_image5.png Greyscale Greenberg differs from the claimed subject matter in that Greenberg does not explicitly disclose to be- projected image. However in an analogous art, Du discloses to be- projected image (Du, see paragraphs [0034] through [0039], where Du discloses that S110: a location information obtaining step: obtaining current to-be-displayed location information corresponding to a coordination device. S120: A display information obtaining step: obtaining current display information corresponding to the coordination device. S130: A content generating step: generating, according to the display information, current virtual display content corresponding to the coordination device. S140: A projection step: projecting the virtual display content to a location corresponding to the to-be-displayed location information at a fundus of a user. In the method in the embodiment of the present application and the following corresponding apparatus embodiments, the to-be-displayed location information is information related to a to-be-displayed location. The to-be displayed location is a location at which the virtual display content that the user sees is presented. 
In the method in the embodiment of the present application, the virtual display content corresponding to the coordination device is projected to the fundus of the user according to the to-be-displayed location, so that the virtual display content that the user sees is imaged at the to-be-displayed location, and the user does not need to switch his or her gaze point back and forth between a display device and the coordination device when interacting with the coordination device by using the virtual display content displayed at the to-be-displayed location, which better conforms to the use habits of the user and improves the user experience).

It would have been obvious to one of ordinary skill in the art to modify the invention of Greenberg with Du. One would be motivated to modify Greenberg by disclosing a to-be-projected image as taught by Du, thereby using the strong projection and display functions of a device near an eye; therefore, the devices all function optimally, and the user experience is improved (Du, see paragraph [0018]).

Conclusion

THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any extension fee pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the date of this final action.
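The four-step flow quoted from Du (S110 through S140) can be sketched as a simple pipeline. Everything below is an illustrative sketch of those steps as described in the quotation; the class and function names are hypothetical, as the patent defines no API.

```python
# Illustrative sketch of Du's four-step method (S110-S140) as quoted above.
# All names are hypothetical; the patent text defines no code interface.

from dataclasses import dataclass

@dataclass
class CoordinationDevice:
    device_id: str
    position: tuple       # where its content should appear (to-be-displayed location)
    display_state: str    # what the device currently wants displayed

def obtain_location(device):         # S110: to-be-displayed location information
    return device.position

def obtain_display_info(device):     # S120: current display information
    return device.display_state

def generate_content(display_info):  # S130: generate virtual display content
    return f"virtual:{display_info}"

def project(content, location):      # S140: project content to the location at the fundus
    return {"content": content, "location": location}

def render_for(device):
    location = obtain_location(device)
    info = obtain_display_info(device)
    content = generate_content(info)
    return project(content, location)

frame = render_for(CoordinationDevice("wand-1", (0.2, -0.1), "menu"))
# The content is imaged at the device's own location, so the user's gaze
# need not jump between a separate display and the coordination device.
```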
Contact Information

Any inquiry concerning this communication or earlier communications from the examiner should be directed to NELSON ROSARIO, whose telephone number is (571) 270-1866. The examiner can normally be reached Monday through Friday, 7:30am-5:00pm EST. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Matthew Eason, can be reached at (571) 270-7230. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/NELSON M ROSARIO/
Primary Examiner, Art Unit 2624

Prosecution Timeline

Nov 06, 2024: Application Filed
Aug 06, 2025: Non-Final Rejection (§103)
Nov 06, 2025: Response Filed
Nov 29, 2025: Final Rejection (§103, current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12599503: Goggle lens (granted Apr 14, 2026; 2y 5m to grant)
Patent 12601932: COLOR-CHANGING EYEGLASS (granted Apr 14, 2026; 2y 5m to grant)
Patent 12602123: ELECTRONIC PEN (granted Apr 14, 2026; 2y 5m to grant)
Patent 12601912: AUGMENTED REALITY GAMING USING VIRTUAL EYEWEAR BEAMS (granted Apr 14, 2026; 2y 5m to grant)
Patent 12593977: Vision Screening Device Including Color Imaging (granted Apr 07, 2026; 2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 86%
With Interview: 92% (+5.8%)
Median Time to Grant: 2y 0m
PTA Risk: Moderate
Based on 818 resolved cases by this examiner. Grant probability derived from career allow rate.
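The headline figures follow from simple arithmetic on the examiner's career counts shown in the profile (704 granted of 818 resolved, with a +5.8 percentage-point interview lift). A quick check, assuming the lift is applied additively, which appears to be this page's convention:

```python
# Reproduce the headline metrics from the examiner's career counts.
granted, resolved = 704, 818   # career totals from the examiner profile
interview_lift = 5.8           # percentage-point lift for cases with interview

grant_prob = round(100 * granted / resolved)          # 704/818 = 86.06% -> 86
with_interview = round(grant_prob + interview_lift)   # 86 + 5.8 = 91.8 -> 92

print(grant_prob, with_interview)  # prints "86 92"
```

This matches the 86% grant probability and 92% with-interview figures shown above.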
