Prosecution Insights
Last updated: April 19, 2026
Application No. 19/071,658

PROJECTION SYSTEM AND METHOD OF FORMING STEREOSCOPIC IMAGE

Non-Final OA: §102, §103
Filed: Mar 05, 2025
Examiner: ZHENG, XUEMEI
Art Unit: 2629
Tech Center: 2600 — Communications
Assignee: Optoma Corporation
OA Round: 1 (Non-Final)

Grant Probability: 85% (Favorable)
OA Rounds: 1-2
To Grant: 2y 1m
With Interview: 99%

Examiner Intelligence

Career Allow Rate: 85% (above average; 598 granted / 707 resolved; +22.6% vs TC avg)
Interview Lift: +14.0% (moderate lift, measured across resolved cases with interview)
Avg Prosecution: 2y 1m (fast prosecutor; 23 currently pending)
Total Applications: 730 (career history, across all art units)
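The headline allow-rate card is consistent with the raw counts shown above. A quick sketch of the arithmetic (the vendor's exact rounding rule is an assumption):

```python
# Reproduce the examiner-intelligence headline from the raw counts
# shown above: 598 granted out of 707 resolved cases.
granted = 598
resolved = 707

career_allow_rate = granted / resolved * 100  # percent

# The card shows a whole number; assumes standard rounding.
print(f"Career allow rate: {career_allow_rate:.1f}% "
      f"(card shows {round(career_allow_rate)}%)")
```

598/707 works out to about 84.6%, which the dashboard rounds to the displayed 85%.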

Statute-Specific Performance

§101: 1.0% (-39.0% vs TC avg)
§103: 41.4% (+1.4% vs TC avg)
§102: 23.0% (-17.0% vs TC avg)
§112: 25.8% (-14.2% vs TC avg)
Tech Center average is an estimate • Based on career data from 707 resolved cases
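The "vs TC avg" deltas can be sanity-checked against the per-statute rates: backing out the implied Tech Center average (avg = rate − delta) gives the same 40.0% baseline for every statute. A back-of-the-envelope check, not the vendor's stated method:

```python
# Back out the implied Tech Center average from each statute's
# examiner rate and its "vs TC avg" delta: avg = rate - delta.
stats = {
    "§101": (1.0, -39.0),
    "§103": (41.4, +1.4),
    "§102": (23.0, -17.0),
    "§112": (25.8, -14.2),
}
for statute, (rate, delta) in stats.items():
    implied_avg = rate - delta
    print(f"{statute}: examiner {rate}% vs implied TC avg {implied_avg:.1f}%")
```

All four rows yield an implied TC average of 40.0%, suggesting the dashboard measures each delta against a single baseline.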

Office Action

DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Priority

Receipt is acknowledged of certified copies of papers required by 37 CFR 1.55.

Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless – (a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.

Claims 1-4, 6-7, 10-13 and 15-16 are rejected under 35 U.S.C. 102(a)(2) as being anticipated by You (CN 117812241 A, machine translation of which is used for this examination).

Regarding claim 1, You teaches a projection system (Abstract; Fig. 1: folding angle display device 10), comprising: a non-planar projection surface (Figs. 2-3: first display surface 110 and second display surface 111 collectively form a non-planar projection surface); a projection apparatus (Fig. 1: inherent projection apparatus folded angle display 11; Examiner’s Note: a projection apparatus necessarily exists in folded angle display 11 to achieve projected image on projection surface(s) of folded angle display 11); and a controller (Fig. 1: inherent controller in folded angle display 11 to implement control of projection of compensated image to projection surfaces of folded angle display 11; Examiner’s Note: a projection controller inherently communicates with processor 15 that is responsible for calculation of compensation of an original image to derive an output image and implements projection of light beam accordingly to projection surfaces to generate the output image), communicatively connected to the projection apparatus, wherein the controller is configured to provide stereoscopic image information (Page 5, 2nd paragraph: “The processor 15 may be a built-in or external output/input device […] to obtain a first image signal (hereinafter referred to as an original image, […] The second image signal (hereinafter referred to as the display image 115) is output to the break-angle display 11 so that the break-angle display 11 presents the display image 115”; Examiner’s Note: as shown in Fig. 11, display image 115 is a floating/stereoscopic image necessarily based on stereoscopic image information, which results from implementation of compensation of the original image) and transmit the stereoscopic image information to the projection apparatus (Fig. 1: processor 15 transmits all necessary information, including stereoscopic image information, to folded angle display 11), the projection apparatus is configured to project an image beam onto the non-planar projection surface according to the stereoscopic image information (Fig. 4: step S609; Examiner’s Note: images projected to first display surface 110 and second display surface 111 to cause output image 115 in Fig. 11 necessarily result from projecting an image beam to both surfaces), wherein a stereoscopic image (Fig. 11: output image 115) is presented within a field of view of the image beam (Page 7, 3rd paragraph: “In some embodiments, the parallel projection includes orthographic projection and oblique projection.
In some embodiments of step S607, depending on the different viewing positions 40 and the distribution positions of the first projection image 30 on the projection surface 24, the first projection image 30 and the second projection image 30, the second projection image 30 and the third projection image 30 are arranged on the projection surface 24”; Fig. 4: step S607; Fig. 5: generation of output image is based on field of view associated with viewing position 40).

Regarding claim 2, You further teaches the projection system according to claim 1, further comprising: a terminal apparatus (Page 5, 2nd paragraph: “The processor 15 may be a built-in or external output/input device (e.g., a camera, a scanner, a scanner, or the like) of the self-folding angle display device 10 or a universal serial bus (USB) device or the like) (not shown) to obtain a first image signal (hereinafter referred to as an original image), and performing an image compensation method on the original image”; Fig. 1: processor 15 optionally associated with a terminal apparatus), wherein the controller is communicatively connected to the terminal apparatus (Fig. 1), the terminal apparatus is configured to provide a projection position, projection parameters of the projection position, and a projection range to the controller (Page 6, 3rd paragraph: “the processor 15 establishes a projection plane 24 in the three-dimensional space 100 (step S603). For example, as shown in FIG. 5 and FIG. 7, the projection surface 24 is located at the rear side of the backplane model 23 relative to the first simulation surface 20 and the second simulation surface 21. However, the present invention is not limited to this, for example, as shown in FIG. 8 and FIG. 9, the projection surface 24, the first simulation surface 20 and the second simulation surface 21 are all located on the same side (front side) of the backplane model 23. In some embodiments, the projection surface 24 is a plane or a curved surface”; Page 8, 3rd paragraph: “The size of the different projection surface 24 allows the second projection image 31 as the display image 115 to have a different degree of stereoscopic impression”; Fig. 4: step S603; Examiner’s Note: establishment of projection surface of various position, shape and size is necessarily based on “a projection position, projection parameters of the projection position, and a projection range” being provided), the controller is configured to convert a 3D content image into the stereoscopic image information according to the projection position, the projection parameters of the projection position, the projection range, and a 3D projection plane (Figs. 4-6, 11: 3D projection plane read by plane on which output image 115 is located), the controller is configured to provide the stereoscopic image information to the projection apparatus (Fig. 1; also see interpretation of the controller in claim 1), the projection apparatus is configured to generate the image beam according to the stereoscopic image information, so that the stereoscopic image is formed on the 3D projection plane within the field of view of the image beam (Figs. 4-6, 11).

Regarding claim 3, You further teaches the projection system according to claim 2, wherein the 3D content image at least comprises a 3D object (Page 6, 3rd paragraph: “the projection surface 24 is a plane or a curved surface”; Fig. 11: output image 115; Examiner’s Note: a projection surface 24 in form of a curved surface would result in a 3D object as an output image).

Regarding claim 4, You further teaches the projection system according to claim 2, wherein the controller is configured to determine a position of the stereoscopic image on the 3D projection plane according to a preset eye position (Page 7, 3rd paragraph: “the parallel projection includes orthographic projection and oblique projection.
In some embodiments of step S607, depending on the different viewing positions 40 and the distribution positions of the first projection image 30 on the projection surface 24, the first projection image 30 and the second projection image 30, the second projection image 30 and the third projection image 30 are arranged on the projection surface 24”; Page 7, 4th paragraph: “the second projection image 31 as the display image 115 does not visually deform due to different viewing positions 40 (e.g., the viewer viewing the break-angle display 11 in a plan view or viewing the break-angle display 11 in a bottom view”; Fig. 5: viewing position 40).

Regarding claim 6, You further teaches the projection system according to claim 2, wherein the controller is configured to generate a transformation matrix (last three paragraphs of page 6 continued to first three paragraphs of page 7: second conversion function Ps reads on “transformation matrix”) according to an angle of projection of the projection position (Page 7, first three paragraphs: Mob is a parallel projection conversion matrix and the parallel projection includes orthographic projection and oblique projection, i.e., “an angle of projection of the projection position” is taken into consideration), a 3D range adjustment matrix (Page 6, last two paragraphs from the bottom: “The Mvtran is a conversion matrix that converts pixel coordinate parameters into projection plane coordinate parameters of the projection plane 24”; Page 8, 3rd paragraph: “The size of the different projection surface 24 allows the second projection image 31 as the display image 115 to have a different degree of stereoscopic impression”; Examiner’s Note: size adjustment results in a different conversion matrix Mvtran that converts pixel coordinate parameters into projection plane coordinate parameters of the projection plane 24, which justifies reading the conversion matrix Mvtran on “3D range adjustment matrix”), and a preset eye position (Page 7, 3rd paragraph: a preset eye position is associated with the different viewing positions 40), the controller is configured to convert coordinate values of each pixel of an original image in the projection range into coordinate values corresponding to the 3D projection plane by using the transformation matrix to convert the 3D content image into the stereoscopic image information (Fig. 4; last three paragraphs of page 6 continued to first three paragraphs of page 7).

Regarding claim 7, You further teaches the projection system according to claim 6, wherein the controller is configured to generate the 3D range adjustment matrix according to the projection range (Figs. 7, 14: break angle 212) and the 3D projection plane (Fig. 11: plane on which display image 115 is located), wherein the 3D range adjustment matrix is a matrix that converts a matrix formed by coordinate values of four endpoints of the projection range into a matrix formed by coordinate values of four endpoints of the 3D projection plane (Figs. 4, 6).

Claim 10 is rejected for substantially the same rationale as applied to claim 1. Claim 11 is rejected for substantially the same rationale as applied to claim 2. Claim 12 is rejected for substantially the same rationale as applied to claim 3. Claim 13 is rejected for substantially the same rationale as applied to claim 4. Claim 15 is rejected for substantially the same rationale as applied to claim 6. Claim 16 is rejected for substantially the same rationale as applied to claim 7.

Claim Rejections - 35 USC § 103

Claims 5 and 14 are rejected under 35 U.S.C. 103 as being unpatentable over You (CN 117812241 A, machine translation of which is used for this examination) in view of Shintani et al. (US 2024/0253465).
Regarding claim 5, You further teaches the projection system according to claim 2, further comprising: the controller is configured to determine a position of the stereoscopic image on the 3D projection plane according to the preset eye position (Page 7, 4th paragraph: “the second projection image 31 as the display image 115 does not visually deform due to different viewing positions 40 (e.g., the viewer viewing the break-angle display 11 in a plan view or viewing the break-angle display 11 in a bottom view”; Fig. 5: viewing position 40).

You does not further teach the projection system according to claim 2, further comprising: a camera, communicatively connected to the controller, configured to sense a position of at least one eye within the field of view of the image beam, wherein the controller is configured to obtain a preset eye position according to the position of the at least one eye.

The differentiating limitation indicates an eye tracking camera is used to determine a preset eye position to adjust a field of view associated with a stereoscopic/floating image to be displayed. However, the technique is not new in the related art. Shintani, for instance, teaches in Fig. 1 and [0037] displaying an aerial image M1 that corresponds to the position of a predetermined gaze point of the observer, and in Figs. 3-4 and [0047], [0061] using an interior camera 62 and a gaze point detector (part of a controller) to detect the gaze point. Before the effective filing date of the invention, it would have been obvious for one of ordinary skill in the art to combine Shintani’s technique with You’s technique, adding an eye tracking camera that communicates with a controller to obtain a preset eye position. The motivation/suggestion would have been to enhance flexibility of the projection system regardless of a viewing position.

Claim 14 is rejected for substantially the same rationale as applied to claim 5.

Claims 8 and 17 are rejected under 35 U.S.C. 103 as being unpatentable over You (CN 117812241 A, machine translation of which is used for this examination) in view of Takahashi et al. (US 2022/0172474).

Regarding claim 8, You does not further teach the projection system according to claim 2, wherein the controller is configured to control the image beam generated by the projection apparatus to form a mask region in a region of the 3D projection plane that does not display the stereoscopic image. The feature is not new, however. Takahashi, for instance, teaches in [0077] and Fig. 5 controlling the image beam generated by the projection apparatus to form a mask region (i.e., region mask image 52) in a region of the 3D projection plane (i.e., projection plane on which projection 3D region 53 is located) that does not display the stereoscopic image. Before the effective filing date of the invention, it would have been obvious for one of ordinary skill in the art to combine Takahashi’s technique with You’s technique to enhance the contrast between the stereoscopic image and its surroundings.

Claim 17 is rejected for substantially the same rationale as applied to claim 8.

Claims 9 and 18 are rejected under 35 U.S.C. 103 as being unpatentable over You (CN 117812241 A, machine translation of which is used for this examination) in view of Joseph (US 2016/0266479).

Regarding claim 9, You does not further teach the projection system according to claim 2, wherein the controller is communicatively connected to an ambient light sensor and an ambient light source, the ambient light sensor is configured to provide an ambient light brightness signal, the controller is configured to provide an ambient light source adjustment signal, the controller is configured to adjust brightness of an output beam of the ambient light source according to the ambient light brightness signal, thereby generating a high contrast between brightness of a floating light of the image beam and the brightness of the output beam of the ambient light source.
The features indicate use of an ambient light source and an ambient light sensor to adjust environmental lighting for the projection system. The technique is not new, however. Joseph, for instance, teaches in Fig. 5 and [0059] an ambient light sensor ([0059]: “an ambient light sensor may be used to adjust the light intensity of the light source 158”) and an ambient light source (Fig. 5: light source 158; [0059]: “the light source 158, which may be one or more LEDs or other light producing elements, may be modified to adjust to environmental factors, such as the ambient lighting of the projection surface 102”), the ambient light sensor is configured to provide an ambient light brightness signal ([0059]: “an environmental sensor 165 that detects changes in the environment”, “an ambient light sensor may be used to adjust the light intensity of the light source 158”), the controller is configured to provide an ambient light source adjustment signal ([0059]: “an ambient light sensor may be used to adjust the light intensity of the light source 158”; Examiner’s Note: the light intensity adjustment of light source 158 necessarily results from an ambient light source adjustment signal), the controller is configured to adjust brightness of an output beam of the ambient light source according to the ambient light brightness signal ([0059]: “the intensity may be changed to account for ambient lighting”, “an ambient light sensor may be used to adjust the light intensity of the light source 158”). Before the effective filing date of the invention, it would have been obvious for one of ordinary skill in the art to modify the technique of Joseph with the technique of You to improve the viewing effect of the projection system.

Claim 18 is rejected for substantially the same rationale as applied to claim 9.

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure: US 2014/0347724 by Schultz teaches in Fig. 1 a projector configured to project an image onto the projection screen 14 with the aid of a virtual mask 16. CN 109521564 A by Ma et al. teaches in Figs. 7-9 atmosphere light system 80 comprising a sensing module 81 and ambient light 84 to control ambient lighting for a projector.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to XUEMEI ZHENG whose telephone number is (571)272-1434. The examiner can normally be reached Monday-Friday: 9:30 am-6:00 pm. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Benjamin Lee, can be reached at 571-272-2963. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/XUEMEI ZHENG/
Primary Examiner, Art Unit 2629
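The rejection of claims 6-7 turns on a "3D range adjustment matrix" that maps four endpoints of the projection range onto four endpoints of the 3D projection plane. The claims do not fix the math; one simple illustrative reading (an assumption, not the applicant's or examiner's formulation) is an affine transform solved from the four corner correspondences, with all corner coordinates below chosen purely for demonstration:

```python
import numpy as np

# Four corners of a 2D projection range (unit square, homogeneous coords).
src = np.array([[0, 0, 1],
                [1, 0, 1],
                [0, 1, 1],
                [1, 1, 1]], dtype=float)

# Hypothetical corners of a tilted 3D projection plane: p0 + x*u + y*v.
p0 = np.array([2.0, 0.0, 1.0])
u = np.array([1.0, 0.0, 0.5])   # plane basis vector along the x direction
v = np.array([0.0, 1.0, 0.25])  # plane basis vector along the y direction
dst = np.array([p0 + x * u + y * v for x, y, _ in src])

# Least-squares solve for the matrix M with src @ M ~= dst, i.e. an
# affine map from range coordinates to plane coordinates.
M, *_ = np.linalg.lstsq(src, dst, rcond=None)

# Every pixel in the range can then be mapped the same way.
mapped = src @ M
print(np.allclose(mapped, dst))  # True: the four endpoints land exactly
```

Because the target corners here are exactly affine in the source corners, the least-squares solution reproduces them exactly; a real system might instead use a projective (homography-style) mapping.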

Prosecution Timeline

Mar 05, 2025
Application Filed
Jan 09, 2026
Non-Final Rejection — §102, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12596441
Chinese Character Input Method, System and Keyboard
Granted Apr 07, 2026 (2y 5m to grant)
Patent 12572318
SYSTEMS AND METHODS FOR DYNAMICALLY SHARING MEDIA BASED ON CONTACT PROXIMITY, GROUP PARTICIPATION, OR EVENT
Granted Mar 10, 2026 (2y 5m to grant)
Patent 12563939
DISPLAY SUBSTRATE AND DISPLAY DEVICE
Granted Feb 24, 2026 (2y 5m to grant)
Patent 12554140
POSITIONING, STABILISING, AND INTERFACING STRUCTURES AND SYSTEM INCORPORATING SAME
Granted Feb 17, 2026 (2y 5m to grant)
Patent 12554136
COLOR CORRECTION FOR XR DISPLAY
Granted Feb 17, 2026 (2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 85%
With Interview: 99% (+14.0%)
Median Time to Grant: 2y 1m
PTA Risk: Low
Based on 707 resolved cases by this examiner. Grant probability derived from career allow rate.
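The 99% with-interview figure is consistent with simply adding the +14.0% interview lift to the 85% base grant probability. A sketch of that reading (whether the dashboard adds, compounds, or caps the lift is an assumption):

```python
# Combine the base grant probability with the interview lift as a
# straight addition; the dashboard's 99% figure matches this reading.
base_grant_probability = 85.0   # percent, from the career allow rate
interview_lift = 14.0           # percent, observed lift with interview

with_interview = base_grant_probability + interview_lift
print(f"With interview: {with_interview:.0f}%")  # → With interview: 99%
```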
