Prosecution Insights
Last updated: April 19, 2026
Application No. 18/775,893

BIT PLANE DITHERING APPARATUS

Non-Final OA: §103, §DP
Filed: Jul 17, 2024
Examiner: LIU, GORDON G
Art Unit: 2618
Tech Center: 2600 (Communications)
Assignee: Texas Instruments Incorporated
OA Round: 1 (Non-Final)
Grant Probability: 83% (Favorable)
OA Rounds: 1-2
To Grant: 2y 4m
With Interview: 98%

Examiner Intelligence

Career Allow Rate: 83% (556 granted / 673 resolved; +20.6% vs TC avg; above average)
Interview Lift: +15.1% higher allow rate among resolved cases with an interview
Typical Timeline: 2y 4m avg prosecution; 29 applications currently pending
Career History: 702 total applications across all art units
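As a quick sanity check, the headline figures above are internally consistent; a minimal Python verification, using only the numbers reported in this summary:

```python
# Sanity-check the reported examiner statistics (numbers from this report).
granted, resolved, pending = 556, 673, 29

allow_rate = granted / resolved * 100
total_apps = resolved + pending

print(f"Career allow rate: {allow_rate:.1f}%")  # 82.6%, displayed as 83%
print(f"Total applications: {total_apps}")      # 702, matching the report
```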

Statute-Specific Performance

§101: 6.7% (-33.3% vs TC avg)
§103: 73.3% (+33.3% vs TC avg)
§102: 3.0% (-37.0% vs TC avg)
§112: 5.7% (-34.3% vs TC avg)
Tech Center averages are estimates • Based on career data from 673 resolved cases

Office Action

§103 §DP
DETAILED ACTION

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. Claims 1-20 are pending in this Office action.

Double Patenting

The nonstatutory double patenting rejection is based on a judicially created doctrine grounded in public policy (a policy reflected in the statute) so as to prevent the unjustified or improper timewise extension of the "right to exclude" granted by a patent and to prevent possible harassment by multiple assignees. A nonstatutory double patenting rejection is appropriate where the conflicting claims are not identical, but at least one examined application claim is not patentably distinct from the reference claim(s) because the examined application claim is either anticipated by, or would have been obvious over, the reference claim(s). See, e.g., In re Berg, 140 F.3d 1428, 46 USPQ2d 1226 (Fed. Cir. 1998); In re Goodman, 11 F.3d 1046, 29 USPQ2d 2010 (Fed. Cir. 1993); In re Longi, 759 F.2d 887, 225 USPQ 645 (Fed. Cir. 1985); In re Van Ornum, 686 F.2d 937, 214 USPQ 761 (CCPA 1982); In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970); In re Thorington, 418 F.2d 528, 163 USPQ 644 (CCPA 1969).

A timely filed terminal disclaimer in compliance with 37 CFR 1.321(c) or 1.321(d) may be used to overcome an actual or provisional rejection based on nonstatutory double patenting provided the reference application or patent either is shown to be commonly owned with the examined application, or claims an invention made as a result of activities undertaken within the scope of a joint research agreement. See MPEP § 717.02 for applications subject to examination under the first inventor to file provisions of the AIA as explained in MPEP § 2159. See MPEP §§ 706.02(l)(1) - 706.02(l)(3) for applications not subject to examination under the first inventor to file provisions of the AIA. A terminal disclaimer must be signed in compliance with 37 CFR 1.321(b).
The USPTO Internet website contains terminal disclaimer forms which may be used. Please visit www.uspto.gov/patent/patents-forms. The filing date of the application in which the form is filed determines which form (e.g., PTO/SB/25, PTO/SB/26, PTO/AIA/25, or PTO/AIA/26) should be used. A web-based eTerminal Disclaimer may be filled out completely online using web-screens. An eTerminal Disclaimer that meets all requirements is auto-processed and approved immediately upon submission. For more information about eTerminal Disclaimers, refer to www.uspto.gov/patents/process/file/efs/guidance/eTD-info-I.jsp.

Claims 1-20 are rejected on the ground of nonstatutory double patenting as being unpatentable over claims 1-20 of U.S. Patent No. 12,067,956. Although the claims at issue are not identical, they are not patentably distinct from each other because the claims of the instant application read on the claims of the '956 patent, as shown in the following mapping table.

Application No. 18/775,893 (instant application) mapped against U.S. Patent No. 12,067,956:

Instant claim 1: A system, comprising: a controller configured to: obtain, for an image frame stored in a frame memory image, data associated with a color component; and generate dithered bit planes having a dither noise pattern, the dithered bit planes including time repeated bit plane sequences in the image frame; a spatial light modulator (SLM) coupled to the controller, the SLM configured to project an image of the image frame according to the dithered bit planes; and a light source optically coupled to the SLM, the light source configured to provide light for projecting the image.
'956 claim 10: A system, comprising: a controller configured to: obtain, for an image frame stored in a frame memory image, data associated with a color component; and generate dithered bit planes having a dither noise pattern, the dithered bit planes including time repeated bit plane sequences in the image frame; and a display device coupled to the controller, the display device configured to display the image frame according to the dithered bit planes.
'956 claim 11: The system of claim 10, wherein the display device comprises: a spatial light modulator (SLM) configured to project an image of the image frame according to the dithered bit planes; and a light source optically coupled to the SLM, the light source configured to provide light for projecting the image, the light having multiple wavelengths that provide color modes to display the image frame.

Instant claim 2: The system of claim 1, wherein the SLM is a digital mirror device (DMD), a phase light modulator (PLM), a liquid crystal display (LCD), or a microscopic light emitting diode (microLED).
'956 claim 12: The system of claim 10, wherein the display device includes a digital mirror device (DMD), a phase light modulator (PLM), or a microscopic light emitting diode (microLED).

Instant claim 3: The system of claim 1, wherein the controller comprises: a frame memory configured to store an image frame; a frame memory controller coupled to the frame memory, the frame memory controller configured to obtain image data from the image frame, the image data associated with a color component of the image frame; a dither noise mask generator configured to provide a dither noise mask according to a dither noise level for the image data; a dither percentage calculator coupled to the frame memory controller, the dither percentage calculator configured to produce a dither noise level based on the image data; and a bit plane generator coupled to the frame memory controller, to the dither percentage calculator, and to the dither noise mask generator, the bit plane generator configured to produce dithered bit planes based on the dither noise mask and the dither noise level.
'956 claim 1: A controller, comprising: a frame memory configured to store an image frame; a frame memory controller coupled to the frame memory, the frame memory controller configured to obtain image data from the image frame, the image data associated with a color component of the image frame; a dither noise mask generator configured to provide a dither noise mask according to a dither noise level for the image data; a dither percentage calculator coupled to the frame memory controller, the dither percentage calculator configured to produce a dither noise level based on the image data; and a bit plane generator coupled to the frame memory controller, to the dither percentage calculator, and to the dither noise mask generator, the bit plane generator configured to produce dithered bit planes based on the dither noise mask and the dither noise level.

Instant claim 4: The system of claim 1, wherein generating dithered bit planes comprises: determining a dither noise level for the data based on the data and a transfer function; determining, based on the dither noise level, a dither noise pattern for the data; and generating bit plane data based on the dither noise pattern.
'956 claim 2: The controller of claim 1, wherein the dither percentage calculator is further configured to provide the dither noise level for the image data according to a transfer function.
'956 claim 3: The controller of claim 1, wherein the bit plane generator is configured to provide, to a display device, the dithered bit planes including a dither noise pattern according to the dither noise mask.

Instant claim 5: The system of claim 4, wherein the dither noise level is represented by a first number of bits, the data is represented by a second number of bits, and the second number of bits is different than the first number of bits.
'956 claim 13: A method, comprising: obtaining, by a frame memory controller, image data associated with one or more color components of an image frame, from a frame memory; determining a dither noise level for the image data based on the image data and a transfer function, wherein the dither noise level is represented by a first number of bits, the image data is represented by a second number of bits, and the second number of bits is different than the first number of bits; determining, based on the dither noise level, a dither noise pattern for the image data; generating, by a bit plane generator, bit plane data based on the dither noise pattern; and loading, by a display formatter, the bit plane data onto a buffer to send to a display device.

Instant claim 6: The system of claim 5, wherein the data is divided into blocks of bits, wherein the dither noise level and one or more dither noise masks are obtained for each block, and wherein the bit plane data is determined by combining the dither noise masks for the blocks.
'956 claim 14: The method of claim 13, wherein the image data is divided into blocks of bits, wherein the dither noise level and one or more dither noise masks are obtained for each block, and wherein the bit plane data is determined by combining the dither noise masks for the blocks.

Instant claim 7: The system of claim 4, wherein the transfer function is provided by a look-up table (LUT), and wherein the dither noise level is obtained from the LUT according to a pixel value in the data.
'956 claim 17: The method of claim 13, wherein the transfer function is provided by a look-up table (LUT), and wherein the dither noise level is obtained from the LUT according to a pixel value in the image data.

Instant claim 8: A method comprising: obtaining, by a controller, for an image frame stored in a frame memory image, data associated with a color component; and generating, by the controller, dithered bit planes having a dither noise pattern, the dithered bit planes including time repeated bit plane sequences in the image frame; and transmitting, by the controller to a display device, the dithered bit planes.
'956 claim 13: A method, comprising: obtaining, by a frame memory controller, image data associated with one or more color components of an image frame, from a frame memory; determining a dither noise level for the image data based on the image data and a transfer function, wherein the dither noise level is represented by a first number of bits, the image data is represented by a second number of bits, and the second number of bits is different than the first number of bits; determining, based on the dither noise level, a dither noise pattern for the image data; generating, by a bit plane generator, bit plane data based on the dither noise pattern; and loading, by a display formatter, the bit plane data onto a buffer to send to a display device.

Instant claim 9: The method of claim 8, wherein generating dithered bit planes comprises: determining a dither noise level for the data based on the data and a transfer function; determining, based on the dither noise level, a dither noise pattern for the data; and generating bit plane data based on the dither noise pattern.
'956 claim 13 (in part): determining a dither noise level for the image data based on the image data and a transfer function, wherein the dither noise level is represented by a first number of bits, the image data is represented by a second number of bits, and the second number of bits is different than the first number of bits; determining, based on the dither noise level, a dither noise pattern for the image data; generating, by a bit plane generator, bit plane data based on the dither noise pattern;

Instant claim 10: The method of claim 9, wherein the dither noise level is represented by a first number of bits, the data is represented by a second number of bits, and the second number of bits is different than the first number of bits.
'956 claim 13 (in part): determining a dither noise level for the image data based on the image data and a transfer function, wherein the dither noise level is represented by a first number of bits, the image data is represented by a second number of bits, and the second number of bits is different than the first number of bits;

Instant claim 11: The method of claim 10, wherein the data is divided into blocks of bits, wherein the dither noise level and one or more dither noise masks are obtained for each block, and wherein the bit plane data is determined by combining the dither noise masks for the blocks.
'956 claim 14: The method of claim 13, wherein the image data is divided into blocks of bits, wherein the dither noise level and one or more dither noise masks are obtained for each block, and wherein the bit plane data is determined by combining the dither noise masks for the blocks.

Instant claim 12: The method of claim 9, wherein the transfer function is provided by a look-up table (LUT), and wherein the dither noise level is obtained from the LUT according to a pixel value in the data.
'956 claim 17: The method of claim 13, wherein the transfer function is provided by a look-up table (LUT), and wherein the dither noise level is obtained from the LUT according to a pixel value in the image data.

Instant claim 13: The method of claim 8, further comprising displaying, by a display device, a display image based on the dithered bit planes.
'956 claim 10 (in part): a display device coupled to the controller, the display device configured to display the image frame according to the dithered bit planes.

Instant claim 14: The method of claim 13, wherein displaying the display image comprises: producing, by a light source, light; and projecting, by a spatial light modulator (SLM), the display image based on the dithered bit planes and the light from the light source.
'956 claim 11: The system of claim 10, wherein the display device comprises: a spatial light modulator (SLM) configured to project an image of the image frame according to the dithered bit planes; and a light source optically coupled to the SLM, the light source configured to provide light for projecting the image, the light having multiple wavelengths that provide color modes to display the image frame.

Instant claim 15: A method comprising: obtaining, by a frame memory controller, image data from an image frame, the image data associated with a color component of the image frame; providing, by a dither noise mask generator, a dither noise mask according to a dither noise level for the image data; producing, by a dither percentage calculator, a dither noise level based on the image data; and producing, by a bit plane generator, dithered bit planes based on the dither noise mask and the dither noise level.
'956 claim 1: A controller, comprising: a frame memory configured to store an image frame; a frame memory controller coupled to the frame memory, the frame memory controller configured to obtain image data from the image frame, the image data associated with a color component of the image frame; a dither noise mask generator configured to provide a dither noise mask according to a dither noise level for the image data; a dither percentage calculator coupled to the frame memory controller, the dither percentage calculator configured to produce a dither noise level based on the image data; and a bit plane generator coupled to the frame memory controller, to the dither percentage calculator, and to the dither noise mask generator, the bit plane generator configured to produce dithered bit planes based on the dither noise mask and the dither noise level.

Instant claim 16: The method of claim 15, further comprising: loading the dithered bit planes into a buffer; and sending the dithered bit planes from the buffer to a display device.
'956 claim 13 (in part): loading, by a display formatter, the bit plane data onto a buffer to send to a display device.

Instant claim 17: The method of claim 16, further comprising displaying, by a display device, a display image based on the dithered bit planes.
'956 claim 10 (in part): a display device coupled to the controller, the display device configured to display the image frame according to the dithered bit planes.

Instant claim 18: The method of claim 15, wherein the dither noise level is represented by a first number of bits, the image data is represented by a second number of bits, and the second number of bits is different than the first number of bits.
'956 claim 13 (in part): determining a dither noise level for the image data based on the image data and a transfer function, wherein the dither noise level is represented by a first number of bits, the image data is represented by a second number of bits, and the second number of bits is different than the first number of bits;

Instant claim 19: The method of claim 15, wherein the image data is divided into blocks of bits, wherein the dither noise level and one or more dither noise masks are obtained for each block, and wherein the dithered bit planes are determined by combining the dither noise masks for the blocks.
'956 claim 14: The method of claim 13, wherein the image data is divided into blocks of bits, wherein the dither noise level and one or more dither noise masks are obtained for each block, and wherein the bit plane data is determined by combining the dither noise masks for the blocks.

Instant claim 20: The method of claim 15, wherein determining the dither noise level for the image data is further performed based on a transfer function.
'956 claim 2: The controller of claim 1, wherein the dither percentage calculator is further configured to provide the dither noise level for the image data according to a transfer function.

Claim 1 of the instant application is drawn to a system, comprising: a controller configured to: obtain, for an image frame stored in a frame memory image, data associated with a color component; and generate dithered bit planes having a dither noise pattern, the dithered bit planes including time repeated bit plane sequences in the image frame; a spatial light modulator (SLM) coupled to the controller, the SLM configured to project an image of the image frame according to the dithered bit planes; and a light source optically coupled to the SLM, the light source configured to provide light for projecting the image. While the exact wording of claim 10 of the '956 patent is not the same as that of claim 1 of the instant application, there is no significant difference in scope between claim 1 of the instant application and claim 10 of the '956 patent. Therefore, claim 1 of the instant application cannot be considered patentably distinct over claim 10 of the '956 patent.

Claim Interpretation - 35 USC 112(f)

The following is a quotation of 35 U.S.C. 112(f):

(f) Element in Claim for a Combination.
– An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.

The following is a quotation of pre-AIA 35 U.S.C. 112, sixth paragraph:

An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.

Claim Limitations Interpreted under 35 USC 112(f)

Use of the word "means" (or "step for") in a claim with functional language creates a rebuttable presumption that the claim element is to be treated in accordance with 35 U.S.C. 112(f) (pre-AIA 35 U.S.C. 112, sixth paragraph). The presumption that 35 U.S.C. 112(f) (pre-AIA 35 U.S.C. 112, sixth paragraph) is invoked is rebutted when the function is recited with sufficient structure, material, or acts within the claim itself to entirely perform the recited function. Absence of the word "means" (or "step for") in a claim creates a rebuttable presumption that the claim element is not to be treated in accordance with 35 U.S.C. 112(f) (pre-AIA 35 U.S.C. 112, sixth paragraph). The presumption that 35 U.S.C. 112(f) (pre-AIA 35 U.S.C. 112, sixth paragraph) is not invoked is rebutted when the claim element recites function but fails to recite sufficiently definite structure, material, or acts to perform that function. Claim elements in this application that use the word "means" (or "step for") are presumed to invoke 35 U.S.C. 112(f) except as otherwise indicated in an Office action. Similarly, claim elements that do not use the word "means" (or "step for") are presumed not to invoke 35 U.S.C. 112(f) except as otherwise indicated in an Office action.

The claim limitation "a controller configured to" has been interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, because it uses a generic placeholder "a controller" coupled with functional language "configured to obtain/generate" without reciting sufficient structure to achieve the function. Furthermore, the generic placeholder is not preceded by a structural modifier. Because this claim limitation invokes 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, claim 1 and its dependent claims 2-7 have been interpreted to cover the corresponding structure described in the specification that achieves the claimed function, and equivalents thereof.

A review of the specification shows that the following appears to be the corresponding structure described in the specification for the 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, limitation: "a controller configured to obtain/generate" is interpreted in view of the disclosure that "The image processing system 200 includes an image display controller 201 configured to receive image data from an image or video application 202 in the form of image or video signals 203 and process the image data to provide processed image data. The image display controller 201 sends the processed image data in the form of voltage signals 204 to the display device 290 for displaying respective images. For example, the image display controller 201 may be part of the controller 112 and the display device 290 is an example of the display device 110."

If applicant wishes to provide further explanation or dispute the examiner's interpretation of the corresponding structure, applicant must identify the corresponding structure with reference to the specification by page and line number, and to the drawing, if any, by reference characters in response to this Office action. If applicant does not intend to have the claim limitations treated under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, applicant may amend the claim(s) so that they will clearly not invoke 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, or present a sufficient showing that the claims recite sufficient structure, material, or acts for performing the claimed function to preclude application of 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. For more information, see MPEP § 2173 et seq. and Supplementary Examination Guidelines for Determining Compliance With 35 U.S.C. 112 and for Treatment of Related Issues in Patent Applications, 76 FR 7162, 7167 (Feb. 9, 2011).

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1-2, 8, and 13-14 are rejected under 35 U.S.C. 103 as being unpatentable over Morgan et al. (US 20020005913 A1) in view of Kempf (US 20170339383 A1), further in view of Bhattacharjee et al. (US 20210304365 A1) and Nicholson (US 20170280119 A1).

Regarding claim 1, Morgan teaches a system (See Morgan: Figs. 4-5, and [0058], "FIG. 5 is a schematic view of a micromirror-based image projection system 500 according to the present invention. In FIG. 5, light from light source 504 is focused on the micromirror 502 by lens 506.
Although shown as a single lens, lens 506 is typically a group of lenses and mirrors which together focus and direct light from the light source 504 onto the surface of the micromirror device 502. Image data and control signals from controller 514, which include the integer and fractional bit planes described above, cause some mirrors to rotate to an on position and others to rotate to an off position. Mirrors on the micromirror device that are rotated to an off position reflect light to a light trap 508 while mirrors rotated to an on position reflect light to projection lens 510, which is shown as a single lens for simplicity. Projection lens 510 focuses the light modulated by the micromirror device 502 onto an image plane or screen 512"), comprising: a controller configured to (See Morgan: Figs. 4-5, and [0058], as quoted above): obtain, for an image frame stored in a frame memory image, data associated with a color component (See Morgan: Figs. 4-5, and [0029], "A single-modulator display system sequentially produces three single color images to provide the perception of a full color image. A three-modulator display system delivers three single color images to the display screen simultaneously to allow the viewer's eye to integrate the images and perceive a full-color image. In a parallel color display system, each single-color intensity word is used during the entire frame period. In a sequential color system, each single-color intensity word is used during roughly one-third of the frame period. Furthermore, to reduce color artifacts, sequential color systems may produce multiple single-color images in a single frame time. For example, a sequential color display system may create red, green, blue, white, red, green, and blue images in a single frame period". Note that the three color components are mapped to data associated with a color component); and generate dithered bit planes having a dither noise pattern, the dithered bit planes including time repeated bit plane sequences in the image frame; and a spatial light modulator (SLM) coupled to the controller, the SLM configured to project an image of the image frame according to the dithered bit planes; and a light source optically coupled to the SLM, the light source configured to provide light for projecting the image.

However, Morgan fails to explicitly disclose: obtain, for an image frame stored in a frame memory image, data; generate dithered bit planes having a dither noise pattern, the dithered bit planes including time repeated bit plane sequences in the image frame; a spatial light modulator (SLM) coupled to the controller, the SLM configured to project an image of the image frame according to the dithered bit planes; and a light source optically coupled to the SLM, the light source configured to provide light for projecting the image.

However, Kempf teaches: obtain, for an image frame stored in a frame memory image, data (See Kempf: Fig.
1, and [0035], "In FIG. 3, the DMD controller 301 receives pixel data. A block 311 converts the incoming pixel data, which has multiple bits for each color for each pixel, these are mapped into bit planes. In an example, a frame for display can include colors red, green and blue and can include, in one example, 8 bits per pixel for each color or 24 bits for each pixel. The conversion block 311 converts the image frame data received by the ASIC into bit planes and stores the bit planes in a frame buffer 305. The data can be formatted before writing the data to the frame buffer 305. In a second frame buffer 307, the data is read from the frame buffer 305. By switching between the two frame buffers, the frame being read for transmission to the SLM is separated from the frame being written with bit planes corresponding to the incoming frame image data. Employing two frame buffers enables the system to operate continuously, receiving data, converting and writing bit planes to a first frame buffer while simultaneously reading bit plane data from a previously loaded frame buffer. After the data is read from the frame buffer 307 in FIG. 3, additional data formatting can be performed to ready the data for transmission on the high speed interface I/F. For example, in a packet data interface, the data packets can be formed". Note that the frame buffers 305 and 307 are mapped to the frame memory, and the image data stored in the frame buffers is mapped to the image data); generate dithered bit planes (See Kempf: Fig. 9, and [0018], "In described examples, an optical projection system has reduced or eliminated visible artifacts due to diffraction induced non-linearity in gray scale shade rendition. Example embodiments are especially advantageous when dithering techniques are used to produce gray scale shades between the gray scale shades that the system can directly achieve. In example embodiments, dithering is applied only to low intensity bit planes, reducing the artifacts to levels below the perceivable levels for the human visual system"; and [0063], "At step 913, the translation look up table is used to translate to the entry in the DAR look up table. After step 913, the method transitions to step 915, and the necessary dithering is applied to achieve the desired pixel intensity. Finally the method then flows to step 905 and the gray shade is displayed". Note that the low intensity bit planes with dithering applied, per Fig. 9, are mapped to generating dithered bit planes) having a dither noise pattern, the dithered bit planes including time repeated bit plane sequences in the image frame; and a spatial light modulator (SLM) coupled to the controller, the SLM configured to project an image of the image frame according to the dithered bit planes (See Kempf: Fig. 2, and [0003], "Spatial light modulators (SLMs) such as digital micro-mirror devices (DMDs), liquid crystal display (LCD) and liquid-crystal-on-silicon (LCoS) devices are often used to project images in projection systems"; and [0025], "The microprocessor 211 is coupled to a digital DMD controller circuit 203. DMD controller 203 is another digital video processing integrated circuit. Sometimes this controller 203 can be implemented using a customized integrated circuit or an application specific integrated circuit (ASIC). An analog circuit configured to manage power and LED illumination referred to as a power management integrated circuit (PMIC) and numbered 215 is also provided. The PMIC 215 controls the intensity and power to the LEDs 209. DMD controller 203 provides digital data to the DMD 201 for modulating the illumination light that strikes the DMD surface, and the PMIC DMD controller 215 also provides power and analog signals to the DMD 201.
Light rays from the illumination sources LEDs 209 are input to illumination components in block 215 such as the cover prism and wedge described above, and strike the reflective mirrors inside the package of DMD 201. The reflected light for projection leaves the surface of the DMD 201 and travels into the projection optics 207 which operate to project the image as described above. Together the integrated circuits 203, 215 cause the DMD 201 and the optical components 215, 207 to operate to project the digital video signals as an image”. Note that the SLM 201 is coupled to the controller 203 and projects the received DVI images dithered according to the dithering requirements per Fig. 9, which is mapped to this cited limitation of “a spatial light modulator (SLM) coupled to the controller, the SLM configured to project an image of the image frame according to the dithered bit planes”); and a light source optically coupled to the SLM, the light source configured to provide light for projecting the image (See Kempf: Fig. 1, and [0021], “In system 100, illumination is provided by the red, green and blue (RGB) LEDs 102, 103 and 106. However, alternative illumination sources can also be used, such as incandescent lamps with reflectors, single lamps with color wheels, laser and laser-phosphor illumination. The LEDs can include an optical coating or collimating optics 41 which operate to collect and collimate the light output by the LEDs. Other color schemes can be used, such as white, cyan, etc. As illustrated in FIG. 1, two LEDs 102 and 106 are shown on a single integrated device, and they be the red and green LED devices in this example, while the blue LED 103 is a separate discrete component. In alternative systems, three individual LEDs are used, and two dichroic plates in the form of an X box shape can be used to combine the three colors (RGB) into an illumination source. In the particular example shown in FIG. 
1, dichroic plate 108 reflects the light from red LED 106 at one surface, reflects the light from green LED 102 at a second surface, and passes the light from blue LED 103 through and to the illumination path. In alternative arrangements, many LEDs can be used or multiple LEDs can be used, instead of one LED for each color. Color wheels can also be used with white illumination light, such as to create the colors”; and [0024], “As shown in FIG. 1, wedge prism 115 and TIR prism 116 together form a coupling prism that accomplishes the needed separation of the illumination light rays directed onto the spatial light modulator from the image light rays coming from the spatial light modulator. The image light rays exit prism 116 and are coupled into a projection system that includes optical elements 124, 126, and 129”. Note that the LED light sources 102, 103, and 106 provide illumination light through a path with mirror 113, the image light (from the SLM) is separated from the illumination light using the RTIR prism pair 115 and 116, and the images are projected for display using the optical system 124, 126, and 129; this whole process is mapped to the instant cited limitation of “a light source optically coupled to the SLM, the light source configured to provide light for projecting the image”). 
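As an editorial illustration, the bit-plane conversion and dual frame buffer ("ping-pong") scheme described in the Kempf paragraph [0035] quoted above (conversion block 311 writing into frame buffer 305 while frame buffer 307 is read out to the SLM) can be sketched as follows. This is a minimal sketch assuming 8-bit color channels; the function and class names are the editor's, not Kempf's.

```python
def to_bit_planes(channel, bits=8):
    """Split one color channel (rows of 0-255 values) into `bits` binary
    planes, least-significant plane first (cf. conversion block 311)."""
    return [[[(px >> b) & 1 for px in row] for row in channel]
            for b in range(bits)]

def from_bit_planes(planes):
    """Reassemble a channel from its bit planes (weights 1, 2, 4, ...)."""
    h, w = len(planes[0]), len(planes[0][0])
    return [[sum(planes[b][y][x] << b for b in range(len(planes)))
             for x in range(w)] for y in range(h)]

class PingPongFrameBuffers:
    """Two frame buffers (cf. 305/307): bit planes for the incoming frame
    are written into one buffer while the previously loaded buffer is read
    out for transmission to the SLM, so the system operates continuously."""
    def __init__(self):
        self._bufs = [None, None]
        self._write = 0

    def load(self, channel):
        self._bufs[self._write] = to_bit_planes(channel)
        self._write ^= 1  # swap read/write roles each frame

    def read(self):
        return self._bufs[self._write ^ 1]  # most recently completed frame
```

Splitting the channel into planes before buffering is what lets a binary modulator display the frame one bit weight at a time.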
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Morgan to obtain, for an image frame stored in a frame memory, image data; generate dithered bit planes having a dither noise pattern, the dithered bit planes including time repeated bit sequences in the image frame; and a spatial light modulator (SLM) coupled to the controller, the SLM configured to project an image of the image frame according to the dithered bit planes; and a light source optically coupled to the SLM, the light source configured to provide light for projecting the image, as taught by Kempf, in order to improve the images for display to a human visual system (HVS) (See Kempf: Figs. 1-2, and [0030], "Further “bit planes” can be defined to format the images for the spatial light modulator and also to further improve the images for display to the HVS. Because the pixel elements for a binary spatial light modulator are either “on” or “off”, the intensity observed for a particular pixel is determined by the amount of time that pixel is on during the frame display time"). Morgan teaches a method and system that may generate fractional display bits with blue noise masks for the micromirror-based projection system, while Kempf teaches a projection system and method that may reduce the diffraction artifacts by projecting the dithered images using LED light sources to illuminate the SLM-projected images. Therefore, it would have been obvious to one of ordinary skill in the art to modify Morgan in view of Kempf to dither the received image and use the light source to illuminate the SLM-projected images. The motivation to modify Morgan in view of Kempf is the use of a known technique to improve similar devices (methods, or products) in the same way. 
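The principle in the Kempf [0030] passage just quoted, that a binary SLM pixel's perceived intensity is set by the fraction of the frame time the pixel is on across the bit planes, can be illustrated with a short sketch. The binary display-time weights here are assumed for illustration only.

```python
def displayed_intensity(bit_values, plane_weights):
    """Perceived intensity of one binary-SLM pixel: the sum of the
    display-time weights of the bit planes in which the pixel is on,
    as a fraction of the total frame display time (cf. Kempf [0030])."""
    total = sum(plane_weights)
    on_time = sum(w for bit, w in zip(bit_values, plane_weights) if bit)
    return on_time / total
```

With eight binary-weighted planes, a pixel value of 200 (bits set at weights 8, 64, and 128) is on for 200/255 of the frame time.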
However, Morgan, modified by Kempf, fails to explicitly disclose generating dithered bit planes having a dither noise pattern, the dithered bit planes including time repeated bit sequences in the image frame. However, Bhattacharjee teaches generating dithered bit planes having a dither noise pattern (See Bhattacharjee: Fig. 14, and [0191], "According to various embodiments, display engine 1430 can invoke or activate pixel processing engine 1434 to apply dither to pixels of a frame and inject a pseudo-random noise level. An applied noise level can be selected based on brightness of a local pixel region and/or characteristics of one or more temporally neighboring frames. A local pixel region can be a pixel region that surrounds a pixel to which a determination is made of whether to apply pseudo-random noise or a level of pseudo-random noise to apply. A temporally neighboring frame can be one or more frames before and/or after the frame to which dither and noise are to be applied. In addition, a configurable noise level from display configurations 1414 can be used to adjust a level of noise applied by pixel processing engine 1434. In some examples, a change of a scene can influence whether to consider a level of brightness of noise of a frame prior to the frame to which dither and noise are to be applied. Various embodiments described herein provide additional examples of noise application techniques. Note that in some examples, a determination can be made not to apply dither or noise. A processed image frame can be output to display 1440 for display". Note that applying dither to pixels of a frame and injecting a pseudo-random noise level (pseudo-random noise with an adjustable noise level is a type/pattern of noise) is mapped to the instant limitation of “generate dithered bit planes having a dither noise pattern”). 
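A hedged sketch of the kind of dithering the cited Bhattacharjee paragraph [0191] describes, injecting a pseudo-random noise level selected from local brightness, might look as follows. The three-band brightness mapping and the noise amplitudes are the editor's assumptions, not Bhattacharjee's disclosure, and the pixel itself stands in for the local-region brightness estimate.

```python
import random

def apply_dither_noise(channel, noise_levels, seed=0):
    """Inject a pseudo-random noise level chosen per pixel from local
    brightness, then clamp and requantize (sketch after Bhattacharjee
    [0191]; band boundaries and amplitudes are illustrative)."""
    rng = random.Random(seed)  # seeded for a reproducible noise pattern
    out = []
    for row in channel:
        out_row = []
        for px in row:
            # Select a noise amplitude from the brightness band.
            level = noise_levels[min(px // 86, len(noise_levels) - 1)]
            noisy = px + rng.uniform(-level, level)
            out_row.append(max(0, min(255, round(noisy))))
        out.append(out_row)
    return out
```

Using larger amplitudes in dark bands mirrors the idea of adapting the applied noise to local brightness rather than dithering uniformly.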
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Morgan to generate dithered bit planes having a dither noise pattern, as taught by Bhattacharjee, in order to enable applying pixel processing techniques to improve viewability or visual appeal of displayed images given device display properties or other conditions after pixels are generated and before display of pixels (See Bhattacharjee: Fig. 15, and [0002], "After pixels are generated and before display of pixels, pixel processing techniques can be applied to improve the viewability or visual appeal of displayed images given device display properties or other conditions. Color banding (also called posterization) is a common pixel processing technique for displays that use eight (8) or fewer bits per color channel (e.g., Red, Green, Blue color). Color banding involves conversion of a continuous gradation of color to multiples regions with fewer resulting colors. However, color banding can introduce visual artifacts that negatively impact image quality"). Morgan teaches a method and system that may generate fractional display bits with blue noise masks for the micromirror-based projection system, while Bhattacharjee teaches a method and system that may dither the image pixels with noise levels and patterns based on image brightness estimations. Therefore, it would have been obvious to one of ordinary skill in the art to modify Morgan in view of Bhattacharjee to dither the image data based on the image data characteristics. The motivation to modify Morgan in view of Bhattacharjee is the use of a known technique to improve similar devices (methods, or products) in the same way. However, Morgan, modified by Kempf and Bhattacharjee, fails to explicitly disclose the dithered bit planes including time repeated bit sequences in the image frame. 
However, Nicholson teaches the dithered bit planes including time repeated bit sequences in the image frame (See Nicholson: Fig. 5, and [0086], "A blend coefficient module 505 at control device 210-C can produce a mask index K (K being a counter for pixels across blend zone 205) that is representative of the geometry between projectors 201, and for each of 2.sup.n bitplanes; bitplane coefficient module 505 can also produce dithering data for dithering between 2.sup.m-number of frames F, which is conditionally combined with the mask index, depending on whether mask index K is less than 2.sup.n-1 and whether a frame count, as determined from a logical component 507, is greater than F, as determined at a logical component 509. If true, bitplane mask K+I is selected from table 513, thus the bitplane mask is dithered between mask K and mask K+I: for F frames out of 2.sup.m frames (e.g. "D" number of frames referred to above), the bitplane mask K+1 is used whereas in the remaining frames bitplane mask K is used. The dithering is used to smooth brightness steps between pixels in blend zone 205"); and [0087], "Bitplane mask table 513 output is combined with the respective 2.sup.n bitplanes at a logical component 515 to mask the appropriate bitplanes for the blend coefficient, which produces data that enables a bitplane manager 517 to control a DMD 519 at a projector 201 (or other digitally controllable light modulator)". Note that the bitplane masks used for dithering are selected across a series of frames according to the table, based on the frame sequence positions, thus generating the dithered bitplanes in time repeated sequences). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Morgan such that the dithered bit planes include time repeated bit sequences in the image frame, as taught by Nicholson, in order to reduce and/or eliminate banding in the blend zone (See Nicholson: Figs. 
1-2, and [0068], "Hence, there can be reduced and/or eliminated banding in blend zone 205, as compared to FIG. 1. In other words, the regions of high brightness and low brightness have been reduced and/or eliminated, and the total brightness is uniform across the frame period; for example, compare the total "A+B" brightness of this region in FIG. 4 and FIG. 1"). Morgan teaches a method and system that may generate fractional display bit with blue noise masks for the micromirror-based projection system; while Nicholson teaches a multiple digital projection system and method that may dither mask the image data and generate bit plane arranged in a sequence to reduce the blend artifacts. Therefore, it is obvious to one of ordinary skill in the art to modify Morgan by Nicholson to generate bit planes based on the dithering masks. The motivation to modify Morgan by Nicholson is "Use of known technique to improve similar devices (methods, or products) in the same way". Regarding claim 2, Morgan, Kempf, Bhattacharjee, and Nicholson teach all the features with respect to claim 1 as outlined above. Further, Kempf teaches that the system of claim 1, wherein the SLM is a digital mirror device (DMD), a phase light modulator (PLM), a liquid crystal display (LCD), or a microscopic light emitting diode (microLED) (See Kempf: Fig. 2, and [0003], "Spatial light modulators (SLMs) such as digital micro-mirror devices (DMDs), liquid crystal display (LCD) and liquid-crystal-on-silicon (LCoS) devices are often used to project images in projection systems". Note that the Spatial light modulators (SLMs) such as digital micro-mirror devices (DMDs) is mapped to this cited limitation of “the SLM is a digital mirror device (DMD), a phase light modulator (PLM), a liquid crystal display (LCD), or a microscopic light emitting diode (microLED)”). Regarding claim 8, Morgan, Kempf, Bhattacharjee, and Nicholson teach all the features with respect to claim 1 as outlined above. 
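The temporal mask dithering in the Nicholson paragraph [0086] quoted above can be sketched roughly as follows. The source is ambiguous about the direction of the frame-count comparison, so this sketch simply uses mask K+1 for F frames of each 2^m-frame cycle and mask K for the rest; the function name and the cycle phase are the editor's assumptions.

```python
def select_mask(k, frame_index, f, m, n):
    """Dither between bitplane masks K and K+1 over a 2^m-frame cycle:
    mask K+1 is used for F frames of the cycle and mask K for the rest
    (sketch after Nicholson Fig. 5). The guard K < 2^n - 1 keeps K+1
    inside the 2^n-entry mask table."""
    in_dither_frames = (frame_index % (1 << m)) < f
    if k < (1 << n) - 1 and in_dither_frames:
        return k + 1
    return k
```

Averaged over the cycle, the effective mask index becomes K + F/2^m, which is how the scheme smooths brightness steps between adjacent blend-zone pixels.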
Further, Morgan, Kempf, Bhattacharjee, and Nicholson teach a method (See Morgan: Figs. 4-5, and [0058], "FIG. 5 is a schematic view of a micromirror-based image projection system 500 according to the present invention. In FIG. 5, light from light source 504 is focused on the micromirror 502 by lens 506. Although shown as a single lens, lens 506 is typically a group of lenses and mirrors which together focus and direct light from the light source 504 onto the surface of the micromirror device 502. Image data and control signals from controller 514, which include the integer and fractional bit planes described above, cause some mirrors to rotate to an on position and others to rotate to an off position. Mirrors on the micromirror device that are rotated to an off position reflect light to a light trap 508 while mirrors rotated to an on position reflect light to projection lens 510, which is shown as a single lens for simplicity. Projection lens 510 focuses the light modulated by the micromirror device 502 onto an image plane or screen 512") comprising: obtaining, by a controller, for an image frame stored in a frame memory, image (See Kempf: Fig. 3, and [0035], "In FIG. 3, the DMD controller 301 receives pixel data. A block 311 converts the incoming pixel data, which has multiple bits for each color for each pixel, these are mapped into bit planes. In an example, a frame for display can include colors red, green and blue and can include, in one example, 8 bits per pixel for each color or 24 bits for each pixel. The conversion block 311 converts the image frame data received by the ASIC into bit planes and stores the bit planes in a frame buffer 305. The data can be formatted before writing the data to the frame buffer 305. In a second frame buffer 307, the data is read from the frame buffer 305. 
By switching between the two frame buffers, the frame being read for transmission to the SLM is separated from the frame being written with bit planes corresponding to the incoming frame image data. Employing two frame buffers enables the system to operate continuously, receiving data, converting and writing bit planes to a first frame buffer while simultaneously reading bit plane data from a previously loaded frame buffer. After the data is read from the frame buffer 307 in FIG. 3, additional data formatting can be performed to ready the data for transmission on the high speed interface I/F. For example, in a packet data interface, the data packets can be formed". Note that the frame buffers 305 and 307 are mapped to the frame memory, image data stored in the frame buffer are mapped to the image data), data associated with a color component (See Morgan: Figs. 4-5, and [0029], "A single-modulator display system sequentially produces three single color images to provide the perception of a full color image. A three-modulator display system delivers three single color images to the display screen simultaneously to allow the viewer's eye to integrate the images and perceive a full-color image. In a parallel color display system, each single-color intensity work is used during the entire frame period. In a sequential color system, each single-color intensity word is used during roughly one-third of the frame period. Furthermore, to reduce color artifacts, sequential color systems may produce multiple single-color images in a single frame time. For example, a sequential color display system may create red, green, blue, white, red, green, and blue images in a single frame period". Note that the three color components are mapped to data associated with a color component); and generating, by the controller, dithered bit planes (See Kempf: Fig. 
9, and [0018], "In described examples, an optical projection system has reduced or eliminated visible artifacts due to diffraction induced non-linearity in gray scale shade rendition. Example embodiments are especially advantageous when dithering techniques are used to produce gray scale shades between the gray scale shades that the system can directly achieve. In example embodiments, dithering is applied only to low intensity bit planes, reducing the artifacts to levels below the perceivable levels for the human visual system"; and [0063], “At step 913, the translation look up table is used to translate to the entry in the DAR look up table. After step 913, the method transitions to step 915, and the necessary dithering is applied to achieve the desired pixel intensity. Finally the method then flows to step 905 and the gray shade is displayed”. Note that the low intensity bit plane with dithering applied are mapped to the frame memory, image data stored in the frame buffer are mapped to generate dithered bit planes) having a dither noise pattern (See Bhattacharjee: Fig. 14, and [0191], " According to various embodiments, display engine 1430 can invoke or activate pixel processing engine 1434 to apply dither to pixels of a frame and inject a pseudo-random noise level. An applied noise level can be selected based on brightness of a local pixel region and/or characteristics of one or more temporally neighboring frames. A local pixel region can be a pixel region that surrounds a pixel to which a determination is made of whether to apply pseudo-random noise or a level of pseudo-random noise to apply. A temporally neighboring frame can be one or more frames before and/or after the frame to which dither and noise are to be applied. In addition, a configurable noise level from display configurations 1414 can be used to adjust a level of noise applied by pixel processing engine 1434. 
In some examples, a change of a scene can influence whether to consider a level of brightness of noise of a frame prior to the frame to which dither and noise are to be applied. Various embodiments described herein provide additional examples of noise application techniques. Note that in some examples, a determination can be made not to apply dither or noise. A processed image frame can be output to display 1440 for display". Note that apply dither to pixels of a frame and inject a pseudo-random noise (pseudo-random noise with adjustable noise level is a type/pattern of noise) level is mapped to the instant limitation of “generate dithered bit planes having a dither noise pattern”), the dithered bit planes including time repeated bit plane sequences in the image frame (See Nicholson: Fig. 5, and [0086], "A blend coefficient module 505 at control device 210-C can produce a mask index K (K being a counter for pixels across blend zone 205) that is representative of the geometry between projectors 201, and for each of 2.sup.n bitplanes; bitplane coefficient module 505 can also produce dithering data for dithering between 2.sup.m-number of frames F, which is conditionally combined with the mask index, depending on whether mask index K is less than 2.sup.n-1 and whether a frame count, as determined from a logical component 507, is greater than F, as determined at a logical component 509. If true, bitplane mask K+I is selected from table 513, thus the bitplane mask is dithered between mask K and mask K+I: for F frames out of 2.sup.m frames (e.g. "D" number of frames referred to above), the bitplane mask K+1 is used whereas in the remaining frames bitplane mask K is used. 
The dithering is used to smooth brightness steps between pixels in blend zone 205"); and [0087], "Bitplane mask table 513 output is combined with the respective 2.sup.n bitplanes at a logical component 515 to mask the appropriate bitplanes for the blend coefficient, which produces data that enables a bitplane manager 517 to control a DMD 519 at a projector 201 (or other digitally controllable light modulator)". Note that the selected bitplanes for dithering are selected a series of frames according to the table based on the frame sequence positions, thus, generating the dithered bitplanes in time repeated sequences); and transmitting, by the controller to a display device, the dithered bit planes (See Kempf: Fig. 3, and [0036], “Multiple bit planes are displayed during each frame display time period. The bit planes are arranged so as to create the desired pixel intensity. To divide the changes in the image for better displayed image quality with few or no visible artifacts, the number of bit planes transmitted to the SLM can be increased from 24 (8 bits per pixel for 3 colors) to 60 or more, with some repeating. In a conventional application, some of the bit planes are repeated, and because conventional solutions do not make any correlation between the data transmitted for the various ones of the bit planes, all of the pixel data for each bit plane is transmitted over the data interface I/F in FIG. 3 to the SLM device 303. Blocks 311 and 315 in the controller circuit 301 provide formatting that creates the bit planes from the read frame buffer and then transmits these bit planes to the SLM 303 on the data interface labeled I/F”. Note that all of the pixel data for each bit plane is transmitted over the data interface I/F in FIG. 3 to the SLM device 303 is mapped to the current limitation “transmitting … the dithered bit planes”). Regarding claim 13, Morgan, Kempf, Bhattacharjee, and Nicholson teach all the features with respect to claim 8 as outlined above. 
Further, Morgan teaches that the method of claim 8, further comprising displaying, by a display device, a display image based on the dithered bit planes (See Morgan: Figs. 4-5, and [0058], "FIG. 5 is a schematic view of a micromirror-based image projection system 500 according to the present invention. In FIG. 5, light from light source 504 is focused on the micromirror 502 by lens 506. Although shown as a single lens, lens 506 is typically a group of lenses and mirrors which together focus and direct light from the light source 504 onto the surface of the micromirror device 502. Image data and control signals from controller 514, which include the integer and fractional bit planes described above, cause some mirrors to rotate to an on position and others to rotate to an off position. Mirrors on the micromirror device that are rotated to an off position reflect light to a light trap 508 while mirrors rotated to an on position reflect light to projection lens 510, which is shown as a single lens for simplicity. Projection lens 510 focuses the light modulated by the micromirror device 502 onto an image plane or screen 512"; and [0055], “The mask is not only inverted periodically, it is also shifted from time to time. When displaying scenes without motion, shifting the mask every other frame is sufficient. When displaying scenes with motion, the mask typically is shifted each frame. The continual shifting and inverting of the mask reduces the visible artifacts created by the fractional bits”. Note that the image data with dithered bit planes are projected using SLM and displayed, which is mapped to “a display image based on the dithered bit planes”). Regarding claim 14, Morgan, Kempf, Bhattacharjee, and Nicholson teach all the features with respect to claim 13 as outlined above. Further, Morgan and Kempf teach that the method of claim 13, wherein displaying the display image comprises: producing, by a light source, light (See Kempf: Fig. 
1, and [0021], “In system 100, illumination is provided by the red, green and blue (RGB) LEDs 102, 103 and 106. However, alternative illumination sources can also be used, such as incandescent lamps with reflectors, single lamps with color wheels, laser and laser-phosphor illumination. The LEDs can include an optical coating or collimating optics 41 which operate to collect and collimate the light output by the LEDs. Other color schemes can be used, such as white, cyan, etc. As illustrated in FIG. 1, two LEDs 102 and 106 are shown on a single integrated device, and they be the red and green LED devices in this example, while the blue LED 103 is a separate discrete component. In alternative systems, three individual LEDs are used, and two dichroic plates in the form of an X box shape can be used to combine the three colors (RGB) into an illumination source. In the particular example shown in FIG. 1, dichroic plate 108 reflects the light from red LED 106 at one surface, reflects the light from green LED 102 at a second surface, and passes the light from blue LED 103 through and to the illumination path. In alternative arrangements, many LEDs can be used or multiple LEDs can be used, instead of one LED for each color. Color wheels can also be used with white illumination light, such as to create the colors”; and [0024], “As shown in FIG. 1, wedge prism 115 and TIR prism 116 together form a coupling prism that accomplishes the needed separation of the illumination light rays directed onto the spatial light modulator from the image light rays coming from the spatial light modulator. The image light rays exit prism 116 and are coupled into a projection system that includes optical elements 124, 126, and 129”. 
Note that the LED light source 102, 103, and 106 provides illumination light through path with mirror 113, separated the image light (from the SLM) from the illumination light using the prisms RTIR 115 and 116 pair, and project the images to display using the optical system 124, 126, and 129; this whole process is mapped to the instant cited limitation of “producing, by a light source, light”); and projecting, by a spatial light modulator (SLM), the display image based on the dithered bit planes and the light from the light source (See Morgan: Figs. 4-5, and [0058], "FIG. 5 is a schematic view of a micromirror-based image projection system 500 according to the present invention. In FIG. 5, light from light source 504 is focused on the micromirror 502 by lens 506. Although shown as a single lens, lens 506 is typically a group of lenses and mirrors which together focus and direct light from the light source 504 onto the surface of the micromirror device 502. Image data and control signals from controller 514, which include the integer and fractional bit planes described above, cause some mirrors to rotate to an on position and others to rotate to an off position. Mirrors on the micromirror device that are rotated to an off position reflect light to a light trap 508 while mirrors rotated to an on position reflect light to projection lens 510, which is shown as a single lens for simplicity. Projection lens 510 focuses the light modulated by the micromirror device 502 onto an image plane or screen 512"; and [0055], “The mask is not only inverted periodically, it is also shifted from time to time. When displaying scenes without motion, shifting the mask every other frame is sufficient. When displaying scenes with motion, the mask typically is shifted each frame. The continual shifting and inverting of the mask reduces the visible artifacts created by the fractional bits”. 
Note that the image data with dithered bit planes are projected using the SLM and displayed, in combination with the light source generating the illumination light and the prisms separating the image light from the illumination light; this is mapped to “projecting, by a spatial light modulator (SLM), the display image based on the dithered bit planes and the light from the light source”). Allowable Subject Matter Claim 3 is objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims. The best prior art searched does not teach the claimed limitations of “the system of claim 1, wherein the controller comprises: a frame memory configured to store an image frame; a frame memory controller coupled to the frame memory, the frame memory controller configured to obtain image data from the image frame, the image data associated with a color component of the image frame; a dither noise mask generator configured to provide a dither noise mask according to a dither noise level for the image data; a dither percentage calculator coupled to the frame memory controller, the dither percentage calculator configured to produce a dither noise level based on the image data; and a bit plane generator coupled to the frame memory controller, to the dither percentage calculator, and to the dither noise mask generator, the bit plane generator configured to produce dithered bit planes based on the dither noise mask and the dither noise level.” Claims 4-7 and 9-12 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims. 
The best prior art searched does not teach the claimed limitations of “the system of claim 1, wherein generating dithered bit planes comprises: determining a dither noise level for the data based on the data and a transfer function; determining, based on the dither noise level, a dither noise pattern for the data; and generating bit plane data based on the dither noise pattern.” Claims 15-20 would otherwise be allowable but stand rejected on the ground of nonstatutory double patenting. Claim 15 is an independent claim including the allowable limitations cited in dependent claim 3, and claims 16-20 depend from independent claim 15. Conclusion Any inquiry concerning this communication or earlier communications from the examiner should be directed to GORDON G LIU whose telephone number is (571)270-0382. The examiner can normally be reached Monday - Friday 8:00-5:00. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Devona E Faulk, can be reached at 571-272-7515. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). 
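For orientation only, the three-step structure of the allowable claim language discussed above (a dither noise level determined from the data via a transfer function, a dither noise pattern determined from that level, and bit plane data generated from the pattern) could be sketched as below. Every helper here is hypothetical; it stands in for elements the cited art was not shown to teach and is not the applicant's actual implementation.

```python
def generate_dithered_bit_plane(data, transfer_function, mask_for_level, bit=0):
    """Hypothetical sketch of the claimed pipeline:
    (1) determine a dither noise level from the data via a transfer function;
    (2) determine a dither noise pattern (mask) for that level;
    (3) generate binary bit plane data from the data combined with the pattern."""
    level = transfer_function(data)                        # step 1: noise level
    mask = mask_for_level(level, len(data), len(data[0]))  # step 2: noise pattern
    return [[(min(255, px + m) >> bit) & 1                 # step 3: bit plane data
             for px, m in zip(drow, mrow)]
            for drow, mrow in zip(data, mask)]
```

The point of the sketch is the data flow: the noise level depends on the image data itself, and the pattern depends on the level, before any bit plane is formed.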
If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000. /GORDON G LIU/ Primary Examiner, Art Unit 2618

Prosecution Timeline

Jul 17, 2024
Application Filed
Jan 20, 2026
Non-Final Rejection — §103, §DP (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602846
GENERATING REALISTIC MACHINE LEARNING-BASED PRODUCT IMAGES FOR ONLINE CATALOGS
2y 5m to grant Granted Apr 14, 2026
Patent 12602840
IMAGE PROCESSING SYSTEM, IMAGE PROCESSING METHOD, AND STORAGE MEDIUM
2y 5m to grant Granted Apr 14, 2026
Patent 12602871
MESH TOPOLOGY GENERATION USING PARALLEL PROCESSING
2y 5m to grant Granted Apr 14, 2026
Patent 12592022
INTEGRATION CACHE FOR THREE-DIMENSIONAL (3D) RECONSTRUCTION
2y 5m to grant Granted Mar 31, 2026
Patent 12586330
DISPLAYING A VIRTUAL OBJECT IN A REAL-LIFE SCENE
2y 5m to grant Granted Mar 24, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

1-2
Expected OA Rounds
83%
Grant Probability
98%
With Interview (+15.1%)
2y 4m
Median Time to Grant
Low
PTA Risk
Based on 673 resolved cases by this examiner. Grant probability derived from career allow rate.
