Prosecution Insights
Last updated: April 19, 2026
Application No. 18/199,650

LASER PROJECTOR SIMULATION

Status: Non-Final OA (§103)
Filed: May 19, 2023
Examiner: OWENS, DANELL L
Art Unit: 2882
Tech Center: 2800 — Semiconductors & Electrical Systems
Assignee: Faro Technologies Inc.
OA Round: 1 (Non-Final)
Grant Probability: 76% (Favorable)
Expected OA Rounds: 1-2
Time to Grant: 2y 8m
Grant Probability With Interview: 87%

Examiner Intelligence

Career Allow Rate: 76%, above average (566 granted / 743 resolved; +8.2% vs TC avg)
Interview Lift: +10.7% across resolved cases with interview (moderate)
Typical Timeline: 2y 8m average prosecution; 33 applications currently pending
Career History: 776 total applications across all art units

Statute-Specific Performance

§101: 0.6% (-39.4% vs TC avg)
§103: 54.5% (+14.5% vs TC avg)
§102: 33.8% (-6.2% vs TC avg)
§112: 8.9% (-31.1% vs TC avg)
Tech Center average comparisons are estimates. Based on career data from 743 resolved cases.

Office Action (§103)
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Rejections - 35 U.S.C. § 103

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1-3, 5, 8-13, 15, and 18-20 are rejected under 35 U.S.C. 103 as being unpatentable over Yuan et al. (CN113160421A) in view of Sanjeev et al. (US PG Pub. 20210255328).

Regarding claim 1, Yuan discloses a method comprising: receiving a point cloud representative of a real-world environment (pg. 7 para. 5 and 6; (2) after the depth camera acquires the depth data of an actual experimental scene, converting the depth data into point cloud data in real time according to internal parameters of the depth camera; (3) the data processor divides the point cloud data of the physical object from the point cloud data, determines the position and orientation data of the physical object based on the point cloud data of the physical object, and corrects the position and orientation of the digital model corresponding to the physical object in the virtual experimental scene based on the position and orientation data of the physical object…); simulating a projection of a projector into a virtual environment based at least in part on the point cloud, the virtual environment representative of the real-world environment (pg. 7 7th para.; projecting the virtual experiment scene subjected to the pose correction of the digital model into an actual experiment scene by using the projector.); evaluating the projection to determine whether at least one projector preference is satisfied (pg. 9 9th para.; In the step (3), the performing of pose correction on the digital model corresponding to the real object in the virtual experimental scene based on the pose data of the real object means that the data processor adjusts parameters and states of the real object in the virtual experimental scene according to the pose data of the real object in the actual experimental scene, and registers the pose data of the actual experimental scene and the pose data of the virtual experimental scene.); and responsive to determining that the at least one projector preference is not satisfied, adjusting at least one of a position of the projector, an orientation of the laser projector, or a property of the projector (pg. 9 7th para.; The external parameters of the projector are relative pose relations between the projector and a world coordinate system, and comprise a rotation vector R (a vector with the size of 1x3 or a rotation matrix 3x3) and a translation vector T (Tx, Ty, Tz) and pg. 9 10th para.; And projecting the adjusted virtual experiment scene into an actual experiment scene through a projector to realize virtual-real fusion presentation).

Yuan fails to explicitly teach a laser projector. Sanjeev discloses a laser projector (para. 0038; Laser projector 102 can be a device that projects laser beams on a specified object to create a moving image for entertainment or professional use). It would have been obvious to one of ordinary skill in the art prior to the filing date of the application to modify the display of Yuan with the laser projector of Sanjeev because laser projectors offer the ability to produce brighter, more vibrant images, thus improving the user experience.
Regarding claim 2, Yuan discloses wherein receiving the point cloud representative of the real-world environment comprises scanning the real-world environment with a laser scanner to obtain a plurality of three-dimensional coordinates in the real-world environment (pg. 7 para. 5 and 6; (2) after the depth camera acquires the depth data of an actual experimental scene, converting the depth data into point cloud data in real time according to internal parameters of the depth camera). Yuan fails to explicitly teach a depth camera with a laser scanner to obtain a plurality of three-dimensional coordinates in the real-world environment; however, it is well known that depth cameras include 3D laser scanners.

Regarding claim 3, Yuan discloses wherein the at least one projector preference comprises a field of view preference (pg. 10; And the projector 5 projects the adjusted virtual experiment scene onto the corresponding real object 2 and the experiment table 1 in real time, so that the virtual and real fusion presentation is realized. The projection area of the projector 5 covers the movable range of the real object, and the field of view of the depth camera 3 covers a larger range than the projection area of the projector.).

Regarding claim 5, Yuan discloses wherein the at least one projector preference comprises an incident angle preference (pg. 9; The external parameters of the projector are relative pose relations between the projector and a world coordinate system, and comprise a rotation vector R (a vector with the size of 1x3 or a rotation matrix 3x3) and a translation vector T (Tx, Ty, Tz) and In the step (3), the performing of pose correction on the digital model corresponding to the real object in the virtual experimental scene based on the pose data of the real object means that the data processor adjusts parameters and states of the real object in the virtual experimental scene according to the pose data of the real object in the actual experimental scene, and registers the pose data of the actual experimental scene and the pose data of the virtual experimental scene.).

Regarding claim 8, Yuan discloses further comprising, subsequent to adjusting the at least one of the position or the orientation of the projector (pg. 9 9th para.; In the step (3), the performing of pose correction on the digital model corresponding to the real object in the virtual experimental scene based on the pose data of the real object means that the data processor adjusts parameters and states of the real object in the virtual experimental scene according to the pose data of the real object in the actual experimental scene): re-simulating the projection of the laser projector into the virtual environment that is based at least in part on the point cloud; and re-evaluating the projection to determine whether the at least one projector preference is satisfied (pg. 9; In the step (3), the performing of pose correction on the digital model corresponding to the real object in the virtual experimental scene based on the pose data of the real object means that the data processor adjusts parameters and states of the real object in the virtual experimental scene according to the pose data of the real object in the actual experimental scene, and registers the pose data of the actual experimental scene and the pose data of the virtual experimental scene. And projecting the adjusted virtual experiment scene into an actual experiment scene through a projector to realize virtual-real fusion presentation.).

Regarding claim 9, Yuan discloses further comprising, prior to simulating the projection, aligning the point cloud to the real-world environment (pg. 7; (3) the data processor divides the point cloud data of the physical object from the point cloud data, determines the position and orientation data of the physical object based on the point cloud data of the physical object, and corrects the position and orientation of the digital model corresponding to the physical object in the virtual experimental scene based on the position and orientation data of the physical object).

Regarding claim 10, Yuan discloses further comprising disposing the laser projector in the real-world environment based at least in part on the simulation (illustrated in fig. 1).

Regarding claim 11, Yuan discloses a system comprising: a memory comprising computer readable instructions (computer 4 of fig. 1); and a processing device (processor located within the computer 4) for executing the computer readable instructions, the computer readable instructions controlling the processing device to perform operations comprising: receiving a point cloud representative of a real-world environment (pg. 7 para. 5 and 6; (2) after the depth camera acquires the depth data of an actual experimental scene, converting the depth data into point cloud data in real time according to internal parameters of the depth camera; (3) the data processor divides the point cloud data of the physical object from the point cloud data, determines the position and orientation data of the physical object based on the point cloud data of the physical object, and corrects the position and orientation of the digital model corresponding to the physical object in the virtual experimental scene based on the position and orientation data of the physical object…); simulating a projection of a projector into a virtual environment based at least in part on the point cloud, the virtual environment representative of the real-world environment (pg. 7 7th para.; projecting the virtual experiment scene subjected to the pose correction of the digital model into an actual experiment scene by using the projector.); evaluating the projection to determine whether at least one projector preference is satisfied (pg. 9 9th para.; In the step (3), the performing of pose correction on the digital model corresponding to the real object in the virtual experimental scene based on the pose data of the real object means that the data processor adjusts parameters and states of the real object in the virtual experimental scene according to the pose data of the real object in the actual experimental scene, and registers the pose data of the actual experimental scene and the pose data of the virtual experimental scene.); and responsive to determining that the at least one projector preference is not satisfied, adjusting at least one of a position of the projector, an orientation of the laser projector, or a property of the projector (pg. 9 7th para.; The external parameters of the projector are relative pose relations between the projector and a world coordinate system, and comprise a rotation vector R (a vector with the size of 1x3 or a rotation matrix 3x3) and a translation vector T (Tx, Ty, Tz) and pg. 9 10th para.; And projecting the adjusted virtual experiment scene into an actual experiment scene through a projector to realize virtual-real fusion presentation).

Yuan fails to explicitly teach a laser projector. Sanjeev discloses a laser projector (para. 0038; Laser projector 102 can be a device that projects laser beams on a specified object to create a moving image for entertainment or professional use). It would have been obvious to one of ordinary skill in the art prior to the filing date of the application to modify the display of Yuan with the laser projector of Sanjeev because laser projectors offer the ability to produce brighter, more vibrant images, thus improving the user experience.

Regarding claim 12, Yuan discloses wherein receiving the point cloud representative of the real-world environment comprises scanning the real-world environment with a laser scanner to obtain a plurality of three-dimensional coordinates in the real-world environment (pg. 7 para. 5 and 6; (2) after the depth camera acquires the depth data of an actual experimental scene, converting the depth data into point cloud data in real time according to internal parameters of the depth camera). Yuan fails to explicitly teach a depth camera with a laser scanner to obtain a plurality of three-dimensional coordinates in the real-world environment; however, it is well known that depth cameras include 3D laser scanners.

Regarding claim 13, Yuan discloses wherein the at least one projector preference comprises a field of view preference (pg. 10; And the projector 5 projects the adjusted virtual experiment scene onto the corresponding real object 2 and the experiment table 1 in real time, so that the virtual and real fusion presentation is realized. The projection area of the projector 5 covers the movable range of the real object, and the field of view of the depth camera 3 covers a larger range than the projection area of the projector.).

Regarding claim 15, Yuan discloses wherein the at least one projector preference comprises an incident angle preference (pg. 9; The external parameters of the projector are relative pose relations between the projector and a world coordinate system, and comprise a rotation vector R (a vector with the size of 1x3 or a rotation matrix 3x3) and a translation vector T (Tx, Ty, Tz) and In the step (3), the performing of pose correction on the digital model corresponding to the real object in the virtual experimental scene based on the pose data of the real object means that the data processor adjusts parameters and states of the real object in the virtual experimental scene according to the pose data of the real object in the actual experimental scene, and registers the pose data of the actual experimental scene and the pose data of the virtual experimental scene.).

Regarding claim 18, Yuan discloses further comprising, subsequent to adjusting the at least one of the position or the orientation of the projector (pg. 9 9th para.; In the step (3), the performing of pose correction on the digital model corresponding to the real object in the virtual experimental scene based on the pose data of the real object means that the data processor adjusts parameters and states of the real object in the virtual experimental scene according to the pose data of the real object in the actual experimental scene): re-simulating the projection of the laser projector into the virtual environment that is based at least in part on the point cloud; and re-evaluating the projection to determine whether the at least one projector preference is satisfied (pg. 9; In the step (3), the performing of pose correction on the digital model corresponding to the real object in the virtual experimental scene based on the pose data of the real object means that the data processor adjusts parameters and states of the real object in the virtual experimental scene according to the pose data of the real object in the actual experimental scene, and registers the pose data of the actual experimental scene and the pose data of the virtual experimental scene. And projecting the adjusted virtual experiment scene into an actual experiment scene through a projector to realize virtual-real fusion presentation.).

Regarding claim 19, Yuan discloses further comprising, prior to simulating the projection, aligning the point cloud to the real-world environment (pg. 7; (3) the data processor divides the point cloud data of the physical object from the point cloud data, determines the position and orientation data of the physical object based on the point cloud data of the physical object, and corrects the position and orientation of the digital model corresponding to the physical object in the virtual experimental scene based on the position and orientation data of the physical object).
Regarding claim 20, Yuan discloses further comprising disposing the laser projector in the real-world environment based at least in part on the simulation (illustrated in fig. 1).

Claims 4 and 14 are rejected under 35 U.S.C. 103 as being unpatentable over Yuan et al. (CN113160421A) and Sanjeev et al. (US PG Pub. 20210255328) as applied to claims 1 and 11 above, and further in view of Xiao (CN 112650018 A).

Regarding claim 4, Yuan as modified by Sanjeev discloses a real object interaction virtual projection device (illustrated in fig. 1). Yuan as modified by Sanjeev fails to teach wherein the at least one projector preference comprises an overlap preference. Xiao discloses wherein the at least one projector preference comprises an overlap preference (pg. 3 3rd para.; through the combination processing of the soft hardware, eliminating the redundant brightness of the light overlapping part; so as to ensure that there is no seam on the whole picture; the brightness is uniform, perfect visual impact to the audience). It would have been obvious to one of ordinary skill in the art prior to the filing date of the application to modify the real object interaction virtual projection device of Yuan and Sanjeev with the overlap preference of Xiao in order to ensure that there is no seam on the whole picture, that the brightness is uniform, and that the visual impact to the audience is perfect (Xiao; pg. 3 3rd para.).

Regarding claim 14, Yuan as modified by Sanjeev discloses a real object interaction virtual projection device (illustrated in fig. 1). Yuan as modified by Sanjeev fails to teach wherein the at least one projector preference comprises an overlap preference. Xiao discloses wherein the at least one projector preference comprises an overlap preference (pg. 3 3rd para.; through the combination processing of the soft hardware, eliminating the redundant brightness of the light overlapping part; so as to ensure that there is no seam on the whole picture; the brightness is uniform, perfect visual impact to the audience). It would have been obvious to one of ordinary skill in the art prior to the filing date of the application to modify the real object interaction virtual projection device of Yuan and Sanjeev with the overlap preference of Xiao in order to ensure that there is no seam on the whole picture, that the brightness is uniform, and that the visual impact to the audience is perfect (Xiao; pg. 3 3rd para.).

Claims 6 and 16 are rejected under 35 U.S.C. 103 as being unpatentable over Yuan et al. (CN113160421A) and Sanjeev et al. (US PG Pub. 20210255328) as applied to claims 1 and 11 above, and further in view of Laurino (US Pat. 10,685,478).

Regarding claim 6, Yuan as modified by Sanjeev discloses a real object interaction virtual projection device (illustrated in fig. 1). Yuan as modified by Sanjeev fails to teach wherein the at least one projector preference comprises an obstruction preference. Laurino discloses wherein the at least one projector preference comprises an obstruction preference (col. 2, lines 29-35; determining a location in the virtual reality environment that corresponds to the viewing participant in the physical space and determining a shadow in the virtual environment that would be cast by a virtual object in the location blocking light emitted from a virtual light source in the virtual environment that corresponds to the projector in the physical space).
It would have been obvious to one of ordinary skill in the art prior to the filing date of the application to modify the real object interaction virtual projection device of Yuan and Sanjeev with the obstruction preference of Laurino in order to overcome the problem of viewer discomfort caused by glare of light shining from a lens of the projector into the eyes of the viewer in the path of projection.

Regarding claim 16, Yuan as modified by Sanjeev discloses a real object interaction virtual projection device (illustrated in fig. 1). Yuan as modified by Sanjeev fails to teach wherein the at least one projector preference comprises an obstruction preference. Laurino discloses wherein the at least one projector preference comprises an obstruction preference (col. 2, lines 29-35; determining a location in the virtual reality environment that corresponds to the viewing participant in the physical space and determining a shadow in the virtual environment that would be cast by a virtual object in the location blocking light emitted from a virtual light source in the virtual environment that corresponds to the projector in the physical space). It would have been obvious to one of ordinary skill in the art prior to the filing date of the application to modify the real object interaction virtual projection device of Yuan and Sanjeev with the obstruction preference of Laurino in order to overcome the problem of viewer discomfort caused by glare of light shining from a lens of the projector into the eyes of the viewer in the path of projection.

Claims 7 and 17 are rejected under 35 U.S.C. 103 as being unpatentable over Yuan et al. (CN113160421A) and Sanjeev et al. (US PG Pub. 20210255328) as applied to claims 1 and 11 above, and further in view of Brard (FR 3069692 A).

Regarding claim 7, Yuan as modified by Sanjeev discloses a real object interaction virtual projection device (illustrated in fig. 1).
Yuan as modified by Sanjeev fails to teach wherein the laser projector is a first laser projector, and wherein simulating the projection comprises simulating the projection for the first laser projector and a second laser projector. Brard discloses wherein the laser projector is a first laser projector, and wherein simulating the projection comprises simulating the projection for the first laser projector and a second laser projector (pg. 5; the virtual reality scene on a projection surface of any shape, simple or complex, using one or more projectors). It would have been obvious to one of ordinary skill in the art prior to the filing date of the application to modify the real object interaction virtual projection device of Yuan and Sanjeev with the multi-projection system of Brard in order to display the data of the virtual reality scene on a surface having the shape of an airplane or car cockpit, or on a surface having the shape of a bubble; with such surface shapes, the user is completely surrounded by the projection surface (Brard; pg. 5, General Principle 5.1).

Regarding claim 17, Yuan as modified by Sanjeev discloses a real object interaction virtual projection device (illustrated in fig. 1). Yuan as modified by Sanjeev fails to teach wherein the laser projector is a first laser projector, and wherein simulating the projection comprises simulating the projection for the first laser projector and a second laser projector. Brard discloses wherein the laser projector is a first laser projector, and wherein simulating the projection comprises simulating the projection for the first laser projector and a second laser projector (pg. 5; the virtual reality scene on a projection surface of any shape, simple or complex, using one or more projectors). It would have been obvious to one of ordinary skill in the art prior to the filing date of the application to modify the real object interaction virtual projection device of Yuan and Sanjeev with the multi-projection system of Brard in order to display the data of the virtual reality scene on a surface having the shape of an airplane or car cockpit, or on a surface having the shape of a bubble; with such surface shapes, the user is completely surrounded by the projection surface (Brard; pg. 5, General Principle 5.1).

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to DANELL L OWENS, whose telephone number is (571) 270-5365. The examiner can normally be reached 9:00am-5:00pm M-F.

Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Minh-Toan Ton, can be reached at 571-272-2303. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/DANELL L OWENS/
Examiner, Art Unit 2882
9 January 2026

/BAO-LUAN Q LE/
Primary Examiner, Art Unit 2882
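For orientation, the method of claim 1 as characterized in this rejection is a simulate/evaluate/adjust loop over a scanned point cloud. The sketch below is illustrative only: every name, threshold, and the coverage heuristic are hypothetical stand-ins, not taken from the application or the cited art.

```python
from dataclasses import dataclass

@dataclass
class ProjectorPose:
    position: tuple       # (x, y, z) in the real-world frame (hypothetical)
    orientation: tuple    # e.g. Euler angles (rx, ry, rz) (hypothetical)
    fov_deg: float        # a projector "property" that may be adjusted

def simulate_projection(point_cloud, pose):
    # Placeholder: a real implementation would ray-cast the projector
    # frustum against the point cloud of the environment.
    return {"coverage": 0.8 if pose.fov_deg >= 60 else 0.5}

def preferences_satisfied(projection, min_coverage=0.75):
    # "Projector preference" stands in for the field-of-view, incident-angle,
    # overlap, or obstruction preferences discussed in the claims.
    return projection["coverage"] >= min_coverage

def place_projector(point_cloud, pose, max_iters=10):
    """Iteratively simulate and evaluate, adjusting the pose or a
    projector property until the preference is satisfied."""
    for _ in range(max_iters):
        projection = simulate_projection(point_cloud, pose)
        if preferences_satisfied(projection):
            return pose
        # Adjust at least one of position, orientation, or a property.
        pose.fov_deg += 10.0
    return pose

cloud = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]  # toy point cloud
final = place_projector(cloud, ProjectorPose((0, 0, 2), (0, 0, 0), 50.0))
print(final.fov_deg)  # → 60.0, widened until the coverage preference is met
```

The loop structure, not the placeholder math, is what maps onto the claim language (receive, simulate, evaluate, adjust, re-simulate).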
Read full office action

Prosecution Timeline

May 19, 2023: Application Filed
Jan 09, 2026: Non-Final Rejection, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12601962: PROJECTOR (granted Apr 14, 2026; 2y 5m to grant)
Patent 12597099: Presentation System Having Static And Dynamic Components (granted Apr 07, 2026; 2y 5m to grant)
Patent 12584999: OPTICAL APERTURE DIVISION FOR CUSTOMIZATION OF FAR FIELD PATTERN (granted Mar 24, 2026; 2y 5m to grant)
Patent 12585127: INTERPUPILLARY DISTANCE ADJUSTING DEVICE (granted Mar 24, 2026; 2y 5m to grant)
Patent 12578632: EFFICIENT LIGHT ENGINE SYSTEMS (granted Mar 17, 2026; 2y 5m to grant)
Study what changed to get past this examiner. Based on the 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 76%
With Interview: 87% (+10.7%)
Median Time to Grant: 2y 8m
PTA Risk: Low
Based on 743 resolved cases by this examiner. Grant probability derived from career allow rate.
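The headline figures are simple ratios of the examiner's career record. A minimal sketch of the presumable arithmetic follows; the dashboard's actual model is not disclosed, so this only reproduces the stated numbers from the stated inputs.

```python
# Career record from the report: 566 grants out of 743 resolved cases.
granted, resolved = 566, 743
allow_rate = granted / resolved
print(f"Career allow rate: {allow_rate:.1%}")  # 76.2%, shown as 76%

# Interview lift is reported as +10.7 percentage points.
interview_lift = 0.107
with_interview = allow_rate + interview_lift
print(f"With interview: {with_interview:.0%}")  # 87%
```

Treating the lift as a flat additive adjustment is an assumption; the underlying tool may condition on statute mix or art unit instead.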
