Prosecution Insights
Last updated: April 19, 2026
Application No. 18/252,718

CONTROL BLURRING METHOD AND APPARATUS, TERMINAL DEVICE, AND READABLE STORAGE MEDIUM

Status: Non-Final OA (§103)
Filed: May 11, 2023
Examiner: TRUONG, KARL DUC
Art Unit: 2614
Tech Center: 2600 — Communications
Assignee: Huawei Technologies Co., Ltd.
OA Round: 3 (Non-Final)

Grant Probability: 52% (Moderate)
Expected OA Rounds: 3-4
Time to Grant: 2y 7m
Grant Probability With Interview: 83%

Examiner Intelligence

Grants 52% of resolved cases.

Career Allow Rate: 52% (15 granted / 29 resolved; -10.3% vs TC avg)
Interview Lift: +31.0% (strong lift in resolved cases with interview)
Typical Timeline: 2y 7m avg prosecution; 45 currently pending
Career History: 74 total applications across all art units

Statute-Specific Performance

§101: 3.2% (-36.8% vs TC avg)
§103: 85.3% (+45.3% vs TC avg)
§102: 9.5% (-30.5% vs TC avg)
§112: 2.1% (-37.9% vs TC avg)
Black line = Tech Center average estimate • Based on career data from 29 resolved cases

Office Action

§103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Continued Examination Under 37 CFR 1.114

A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 11th November, 2025 has been entered.

Response to Amendment

This action is in response to the amendment filed on 11th November, 2025. Claims 65, 67, 73, 75, and 81 have been amended. Claims 68, 70-72, 76, 78-80, 84, 86, and 88 have been cancelled. Claims 65-67, 69, 73-75, 77, 81-83, 85, and 87 remain pending in the application. Applicant's amendments to the claims have overcome each and every objection previously set forth in the final Office action mailed 12th June, 2025.

Response to Arguments

Applicant's arguments filed on 11th November, 2025 with respect to Claims 65, 73, and 81 and the rejection under 35 U.S.C.
§ 103, asserting that the prior art does not teach the limitation(s): "obtaining, in a drawing and rendering phase and from a drawing instruction of a first frame image of a plurality of frame images, a dynamic parameter of a first control in the first frame image and a static parameter of the first control, wherein the first control includes a control animation with the plurality of frame images, wherein the dynamic parameter of the first control and the static parameter of the first control are blurring parameters of the first control and included in the drawing instruction of the first frame image, the static parameter indicate a blur effect parameter when blurring processing is performed on all frames of the plurality of frame images, and the dynamic parameter changes over time and is used when blurring processing is performed on a current frame of the first frame image other than other frames of other frame images of the plurality of frame images", "sending the static parameter of the control to a window manager", and "performing, in a composition phase, composition process to obtain a composition result of the first frame image in accordance with the static parameter of the first control from the window manager and the encapsulation result obtained from the frame buffer, wherein blurring processing is based on the drawing result of the first frame image and the dynamic parameter that are encapsulated in the encapsulation result and the static parameter of the control from the window manager" have been fully considered, but they are moot in view of the new grounds of rejection: these limitations are now taught by the combination of Darsa and Conn. Regarding the arguments directed to Claims 66-67, 69, 74-75, 77, 82-83, 85, and 87, those claims directly or indirectly depend on independent Claims 65, 73, and 81, respectively, and Applicant does not argue them separately from independent Claims 65, 73, and 81.
The limitations in those claims, in conjunction with the combination, were previously established as explained.

Claim Objections

Claim 81 is objected to because of the following informalities: Claim 81 recites the limitation(s): "cause the control blurring apparatus to perform:..." on PG(s). 5, Line(s) 15-16; the examiner suggests amending this to: "cause a control blurring apparatus to perform:..." Appropriate correction is required.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 65-67, 69, 73-75, 77, 81-83, 85, and 87 are rejected under 35 U.S.C. 103 as being unpatentable over Darsa et al. (US 20120154426 A1, previously cited), hereinafter referenced as Darsa, in view of Conn et al. (US 20190095086 A1), hereinafter referenced as Conn.
Regarding Claim 65, Darsa discloses a control blurring method (Darsa, [0037]: teaches a process <read on control blurring method> for processing pixel data, such as translucent and overlaid windows), comprising: obtaining, in a drawing and rendering phase and from a drawing instruction of a first frame image of a plurality of frame images, a dynamic parameter of a first control in the first frame image [[and a static parameter of the first control]] (Darsa, [0023]: teaches "the UI subsystem 104 can operate to create <read on drawing instruction> a window <read on first control> having associated alpha and occlusion parameters <read on dynamic parameter> that can be used to combine one or more UI elements having transparency properties with a video stream <read on first frame image of frame images> being captured or played"; Note: it should be noted that although not expressly disclosed, it is common in the art to perform rendering techniques, including obtaining intermediate video data, during a draw and render process), wherein the first control includes a control animation with the plurality of frame images (Darsa, [0039]: teaches using dirty rectangle blit operations to "update changed pixels, such as pixels <read on first control> that have changed <read on control animation> from a prior time and/or frame"; Note: it should be noted that the pixels are being interpreted as the pixels of a window), wherein the dynamic parameter of the first control [[and the static parameter of the first control]] are blurring parameters of the first control and included in the drawing instruction of the first frame image (Darsa, [0023]: teaches "the UI subsystem 104 can operate to create a window <read on first control> having associated alpha and occlusion parameters <read on dynamic parameter> that can be used to combine <read on included drawing instruction> one or more UI elements having transparency properties <read on blur processing> with a video stream <read on first 
frame image> being captured or played"), [[the static parameter indicate a blur effect parameter when blurring processing is performed on all frames of the plurality of frame images, and]] the dynamic parameter changes over time and is used when blurring processing is performed on a current frame of the first frame image other than other frames of other frame images of the plurality of frame images (Darsa, [0070]: teaches the current UI element 510 <read on current frame of first frame image> "has a destination alpha value <read on dynamic parameter> of 0.75 for portions that are superimposed over portions of the UI element 506, a destination alpha value of 0.5 for portions that are superimposed only over portions of the overlay window 504, and a destination alpha value of 1.0 for the portions that do not cover any other structures" as shown in FIG. 5E, which updates the current UI element <read on dynamic parameter changing over time> to an opaque UI element <read on blurring processing>; [0070]: further teaches "the video generator can blit the color key when updating only video pixel data <read on other frames of other frame images>"; Note: it should be noted that the updating destination alpha value only contains specific values for a given frame, such as a first frame image, before switching to another value on other frames); [media_image1.png, 178x280, greyscale] [[sending the static parameter of the control to a window manager;]] obtaining, in the drawing and rendering phase, a drawing result of the first frame image (Darsa, [0072]: teaches pixel processing operations that display the resulting color and alpha values <read on obtained drawing result> for an opaque UI element 600); encapsulating, in the drawing and rendering phase, the drawing result of the first frame image and the dynamic parameter, to obtain an encapsulation result (Darsa, [0026]: teaches a compositor 110 copying "portions of the composition buffer into the primary buffer
through a number of dirty blit operations for minimally sized rectangles," where "the UI subsystem 104 can track dirty rectangles by maintaining a list of rectangles <read on encapsulating drawing result> whose content has been modified by one or more applications (e.g., dirty)," where "the dirty rectangle/tiling information <read on obtaining encapsulation result> can be sent to the compositor 110 for use in processing the associated pixel data"), wherein the encapsulation result is stored in a frame buffer (Darsa, [0031]: teaches compositor 110 copying contents <read on encapsulation result> of an associated back buffer directly to the composition buffer <read on storing encapsulation result in frame buffer> to determine current associated pixel values); performing, in a composition phase, composition process to obtain a composition result of the first frame image in accordance with [[the static parameter of the first control from the window manager and]] the encapsulation result obtained from the frame buffer (Darsa, [0052]: teaches the compositor 110 performing compositing operations <read on performing composition process in composition phase> "using the composition buffer <read on encapsulation result obtained from frame buffer>," where "the compositor 110 can operate to update information stored in the composition buffer when the video generator 108 and/or the UI subsystem 104 require an update"; [0056]: teaches "the compositor 110 can blend the UI element 312 and the overlay 307 <read on composition result of first frame image> in overlapping areas so that the UI element 312 shows over or is superimposed with the video pixel data in the resulting display 309"), wherein blurring processing is based on the drawing result of the first frame image and the dynamic parameter that are encapsulated in the encapsulation result [[and the static parameter of the control from the window manager]] (Darsa, [0061]: teaches the compositor 110 writing final color and alpha 
values to the primary buffer 414, where an alpha value can be set between 0 and 1 <read on blurring processing>; [0039]: teaches "the compositor 110 can track UI elements so that they can be blended appropriately over the video <read on first frame image> at the UI update rate with a new video frame or when a UI element is updated (e.g., moved, closed, etc.) <read on dynamic parameter>," where "the compositor 110 can perform a series of dirty rectangle blit operations when managing pixel information <read on encapsulation result> from the composition buffer to a primary buffer"; [0032]: teaches the compositor 110 <read on window manager> identifying an overlay window <read on control> that includes identifying the associated alpha and occlusion properties <read on dynamic parameter>, where "the overlay window can be configured to be co-located with an overlay and used to present UI pixel data having varying amounts of transparency"), the composition result having a blur effect of the first frame image (Darsa, [0063]: teaches "the UI subsystem can set a flag for use in alpha blending operations, e.g., DDABLT_NOBLEND, that enables the loading of values, such as zero, into destination alpha and color components," where "after setting the flag, the resulting color and alpha values for the overlay window 504 <read on composition result> can be set," such as setting the alpha value to 0.5 <read on blur effect of first frame image>); and performing display based on the composition result of the first frame image (Darsa, [0040]: teaches "the compositor 110 can communicate alpha values to the display controller 114 for use when performing alpha blending <read on composition result of first frame image> in the device hardware to produce a final composed view for the display 116 <read on performing display>"). 
However, Darsa does not expressly disclose obtaining, in a drawing and rendering phase and from a drawing instruction of a first frame image of a plurality of frame images, a dynamic parameter of a first control in the first frame image and a static parameter of the first control, wherein the dynamic parameter of the first control and the static parameter of the first control are blurring parameters of the first control and included in the drawing instruction of the first frame image, the static parameter indicate a blur effect parameter when blurring processing is performed on all frames of the plurality of frame images; sending the static parameter of the control to a window manager; and performing, in a composition phase, composition process to obtain a composition result of the first frame image in accordance with the static parameter of the first control from the window manager and the encapsulation result obtained from the frame buffer, wherein blurring processing is based on the drawing result of the first frame image and the dynamic parameter that are encapsulated in the encapsulation result and the static parameter of the control from the window manager. Conn discloses obtaining, in a drawing and rendering phase and from a drawing instruction of a first frame image of a plurality of frame images, a dynamic parameter of a first control in the first frame image and a static parameter of the first control (Conn, [0030]: teaches when a user interface 130 <read on first control> overlays over a background image 120, a user interface adjustment module 110 determines a blur level <read on static parameter> of the background image 120 "based on the background image 120's average color characteristic or any appropriate combination of color parameters, such as brightness, average color, determined number of colors comprising average color, color distribution, and the like" as shown in FIG. 
2; Note: it should be noted that it is being interpreted that the type of UI being overlaid will vary the background blur output), wherein [media_image2.png, 530x412, greyscale] the dynamic parameter of the first control and the static parameter of the first control are blurring parameters of the first control and included in the drawing instruction of the first frame image (Conn, [0030]: teaches the user interface 130 <read on first control> being overlaid over background image 120 <read on first frame image>, where the user interface adjustment module 110 modifies <read on drawing instruction> the blur level <read on static parameter> of background image 120), the static parameter indicate a blur effect parameter when blurring processing is performed on all frames of the plurality of frame images (Conn, [0030]: teaches when the user interface 130 overlays over background image 120, the user interface adjustment module 110 determines a blur level <read on static parameter> of the background image 120 "based on the background image 120's average color characteristic or any appropriate combination of color parameters <read on indicate blur effect parameter>, such as brightness, average color, determined number of colors comprising average color, color distribution, and the like"; [0043]: teaches adjusting color parameters for the background image, where for an example light color theme, the "H value may stay the same <read on all frames>, the S value may increase to 100%, and B may change to 25%" and stays static <read on frame images> until updated later, which affects the background blur value <read on blurring process>); sending the static parameter of the control to a window manager (Conn, FIG.
5 teaches the user interface adjustment module 110 <read on window manager> identifying the color characteristics of the background image, which is then adjusted, which further affects the blur value <read on sending static parameter>, where the selected user interface is overlaid over the blurred background for display); and [media_image3.png, 626x322, greyscale] performing, in a composition phase, composition process to obtain a composition result of the first frame image in accordance with the static parameter of the first control from the window manager and the encapsulation result obtained from the frame buffer (Conn, FIG. 5 teaches the user interface adjustment module 110 <read on window manager> identifying the color characteristics of the background image, which is then adjusted, which further affects the blur value <read on sending static parameter>, where the selected user interface is overlaid over the blurred background for display), wherein blurring processing is based on the drawing result of the first frame image and the dynamic parameter that are encapsulated in the encapsulation result and the static parameter of the control from the window manager (Conn, FIG. 5 teaches the user interface adjustment module 110 <read on window manager> identifying the color characteristics of the background image, which is then adjusted, which further affects the blur value <read on sending static parameter>, where the selected user interface is overlaid over the blurred background for display). Conn is analogous art with respect to Darsa because they are from the same field of endeavor, namely handling overlaid window characteristics, such as UI elements and transparency/blur control.
Before the effective filing date of the claimed invention, it would have been obvious to a person of ordinary skill in the art to implement a user interface adjustment module to add blur to static images, such as backgrounds, whenever dynamic UI elements are overlaid on screen as taught by Conn into the teaching of Darsa. The suggestion for doing so would allow the user to better distinguish foreground elements on the display from the background, thereby improving the overall user experience and usability. Therefore, it would have been obvious to combine Conn with Darsa. Regarding Claim 73, it recites the limitations that are similar in scope to Claim 65, but in a control blurring apparatus. As shown in the rejection, the combination of Darsa and Conn discloses the limitations of Claim 65. Additionally, Darsa discloses a control blurring apparatus (Darsa, [0082]: teaches a computer <read on control blurring apparatus>), comprising: a processor (Darsa, [0082]: teaches the computer including a CPU <read on processor>), and a non-transitory computer-readable storage coupled to the processor and configured to store a computer program (Darsa, [0083]: teaches the computer including a mass storage device and its associated non-volatile computer-readable media <read on non-transitory computer-readable storage> that are connected to the CPU, where the CPU can access application programs <read on computer program>), the computer program comprising computer instructions that, when executed by the processor, cause the control blurring apparatus to perform (Darsa, [0084]: teaches the computer storage media including information, such as computer-readable instructions):… Thus, Claim 73 is met by Darsa according to the mapping presented in the rejection of Claim 65, given the control blurring method corresponds to a control blurring apparatus. 
Regarding Claim 81, it recites the limitations that are similar in scope to Claim 65, but in a non-transitory computer-readable storage medium. As shown in the rejection, the combination of Darsa and Conn discloses the limitations of Claim 65. Additionally, Darsa discloses a non-transitory computer-readable storage medium (Darsa, [0083]: teaches a computer including a mass storage device and its associated non-volatile computer-readable media <read on non-transitory computer-readable storage medium> that are connected to the CPU), wherein the computer-readable storage medium stores a computer program (Darsa, [0083]: teaches the computer including a mass storage device and its associated non-volatile computer-readable media that are connected to the CPU, where the CPU can access application programs <read on computer program>), the computer program comprising computer instructions that, when executed by a processor, cause the control blurring apparatus to perform (Darsa, [0083]: teaches the computer including a mass storage device and its associated non-volatile computer-readable media that are connected to the CPU <read on processor>, where the CPU can access application programs <read on computer program>):… Thus, Claim 81 is met by Darsa according to the mapping presented in the rejection of Claim 65, given the control blurring method corresponds to a non-transitory computer-readable storage medium. Regarding Claims 66 and 74, the combination of Darsa and Conn discloses the control blurring method and the control blurring apparatus of Claims 65 and 73 respectively. 
Additionally, Darsa further discloses wherein control property of the first control comprises a blurring tag indicating to perform blurring processing on the first control (Darsa, [0063]: teaches an application using a flag <read on blurring tag> to identify pixel processing features associated with overlay window 504, where the flag can be set <read on blurring processing> to set RGB and alpha values for said overlay window <read on first control>). Regarding Claims 67 and 75, the combination of Darsa and Conn discloses the control blurring method and the control blurring apparatus of Claims 65 and 73 respectively. Additionally, Darsa further discloses wherein the dynamic parameter comprises one or more of coordinates, a size, a corner radius, or transparency of the first control (Darsa, [0023]: teaches the UI subsystem 104 creating a window <read on first control> that has associated alpha and occlusion parameters that can be used to combine one or more UI elements having transparency properties with a video stream being captured or played). Regarding Claims 69, 77, and 82, the combination of Darsa and Conn discloses the control blurring method, the control blurring apparatus, and the non-transitory computer-readable storage medium of Claims 65, 73, and 81 respectively. 
Additionally, Darsa further discloses wherein the performing composition process comprises: obtaining the drawing result and the dynamic parameter of the first frame image from the encapsulation result stored in the frame buffer (Darsa, [0026]: teaches a compositor 110 copying "portions of the composition buffer into the primary buffer through a number of dirty blit operations for minimally sized rectangles," where "the UI subsystem 104 can track dirty rectangles by maintaining a list of rectangles <read on encapsulating drawing result of first frame image and dynamic parameter> whose content has been modified by one or more applications (e.g., dirty)," where "the dirty rectangle/tiling information can be sent to the compositor 110 for use in processing the associated pixel data"); determining, based on the dynamic parameter, an area corresponding to the first control in the drawing result of the first frame image (Darsa, [0056]: teaches "the compositor 110 can blend the UI element 312 and the overlay 307 in overlapping areas <read on determining area corresponding to first control> so that the UI element 312 shows over or is superimposed with the video pixel data in the resulting display 309"); performing, [[based on the static parameter]], blurring processing on the area corresponding to the first control in the drawing result of the first frame image, to obtain a blurred area image (Darsa, [0060]: teaches the compositor 110 operating "to blit the opaque UI element 406 to the composition buffer 404 which includes an alpha destination value of one (α_dst = 1)," where "since the UI element 410 includes an amount of transparency, the compositor 110 can operate to calculate the final alpha values <read on performing blurring processing> associated with the superimposed UI element 410 <read on obtaining blurred area image> in conjunction with the overlay window 400 <read on first control>" of the video pixel data that is associated with the video stream
<read on drawing result of first frame image>); and performing composition process based on the drawing result of the first frame image and the blurred area image (Darsa, [0052]: teaches the compositor 110 performing compositing operations <read on performing composition process in composition phase> "using the composition buffer," where "the compositor 110 can operate to update information stored in the composition buffer when the video generator 108 and/or the UI subsystem 104 require an update"; [0056]: teaches "the compositor 110 can blend the UI element 312 and the overlay 307 in overlapping areas so that the UI element 312 shows over or is superimposed with the video pixel data <read on drawing result of first frame image> in the resulting display 309"; [0056]: teaches "the compositor 110 can blend the UI element 312 and the overlay 307 in overlapping areas so that the UI element 312 <read on blurred area image> shows over or is superimposed with the video pixel data in the resulting display 309," where "the composition buffer 306 now includes the UI element 312 which has an alpha destination value of 0.5 (α_dst = 0.5) for locations within the overlay window 300 and an alpha destination value of one (α_dst = 1) for locations outside of the overlay window 300"), to obtain the composition result having the blur effect of the first frame image (Darsa, [0056]: teaches "the compositor 110 can blend the UI element 312 and the overlay 307 in overlapping areas so that the UI element 312 shows over or is superimposed with the video pixel data in the resulting display 309," where "the composition buffer 306 now includes the UI element 312 which has an alpha destination value of 0.5 (α_dst = 0.5) for locations within the overlay window 300 and an alpha destination value of one (α_dst = 1) for locations outside of the overlay window 300 <read on obtaining composition result having the blur effect of first frame image>"; [0060]: teaches the compositor 110 calculating "the final alpha values associated with the superimposed UI element 410 in conjunction with the overlay window 400"). However, Darsa does not expressly disclose performing, based on the static parameter, blurring processing on the area corresponding to the first control in the drawing result of the first frame image, to obtain a blurred area image. Conn discloses performing, based on the static parameter, blurring processing on the area corresponding to the first control in the drawing result of the first frame image, to obtain a blurred area image (Conn, [0030]: teaches the user interface 130 being overlaid over background image 120, where the user interface adjustment module 110 modifies the blur level <read on static parameter> of background image 120). Conn is analogous art with respect to Darsa because they are from the same field of endeavor, namely handling overlaid window characteristics, such as UI elements and transparency/blur control.
Before the effective filing date of the claimed invention, it would have been obvious to a person of ordinary skill in the art to implement a user interface adjustment module to add blur to static images, such as backgrounds, whenever dynamic UI elements are overlaid on screen as taught by Conn into the teaching of Darsa. The suggestion for doing so would allow the user to better distinguish foreground elements on the display from the background, thereby improving the overall user experience and usability. Therefore, it would have been obvious to combine Conn with Darsa. Regarding Claims 83, 85, and 87, the combination of Darsa and Conn discloses the control blurring method, the control blurring apparatus, and the non-transitory computer-readable storage medium of Claims 65, 73, and 81 respectively. Additionally, Darsa further discloses wherein the drawing result of the first frame image comprises a plurality of drawing results of a plurality of layers (Darsa, FIG. 4 teaches a plurality of window overlays <read on plurality of layers> being displayed on screen <read on plurality of drawing results>), [media_image4.png, 341x469, greyscale] the first control is in at least one of the plurality of layers (Darsa, [0059]: teaches overlay window 400 <read on first control> including an alpha source value), and the [[static]] parameter comprises a layer sequence for overlaying of drawing results of the first frame image (Darsa, [0045]: teaches the UI subsystem 104 setting a flag <read on parameter> to identify a window as an overlay window, where "the compositor 110 can read the flag and treat the associated window as being opaque for z-order operations <read on layer sequence> and having alpha values equal to zero when performing compositing operations," where the layered windows are displayed on screen <read on drawing results of first frame image>).
However, Darsa does not expressly disclose the static parameter comprises a layer sequence for overlaying of drawing results of the first frame image. Conn discloses the static parameter comprises a layer sequence for overlaying of drawing results of the first frame image (Conn, [0029]: teaches the user interface 130 being overlaid over the background image 120 by the user interface adjustment module 110, where module 110 blurs the background image 120 using a blur value <read on static parameter> based on the identified color characteristics of said background image 120). Conn is analogous art with respect to Darsa because they are from the same field of endeavor, namely handling overlaid window characteristics, such as UI elements and transparency/blur control. Before the effective filing date of the claimed invention, it would have been obvious to a person of ordinary skill in the art to implement a user interface adjustment module to add blur to static images, such as backgrounds, whenever dynamic UI elements are overlaid on screen as taught by Conn into the teaching of Darsa. The suggestion for doing so would allow the user to better distinguish foreground elements on the display from the background, thereby improving the overall user experience and usability. Therefore, it would have been obvious to combine Conn with Darsa.

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Harper et al. (US 20130201196 A1) discloses a reentrant compositing window manager application; Sayre et al. (US 20080297518 A1) discloses determining the poses of objects based on motion blur parameters; and Smith (US 10365758 B1) discloses a device that detects a touch contact point associated with a user interface and using blur radii.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to KARL TRUONG whose telephone number is (703)756-5915.
The examiner can normally be reached 7:30 AM - 5:00 PM. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Kent Chang, can be reached at (571) 272-7667. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/K.D.T./
Examiner, Art Unit 2614

/KENT W CHANG/
Supervisory Patent Examiner, Art Unit 2614
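The destination alpha values the rejection repeatedly cites from Darsa (0.5, 0.75, 1.0) are the inputs to standard source-over (Porter-Duff) compositing, the operation a compositor performs when blending a translucent UI element over video pixels. The sketch below is illustrative only and is not taken from either reference; the function name and sample values are assumptions.

```python
def source_over(src_rgb, src_a, dst_rgb, dst_a):
    """Standard source-over compositing: a translucent UI pixel over video.

    All channels are floats in [0, 1]; colors are straight (non-premultiplied).
    """
    # Composite alpha: source alpha plus the uncovered fraction of the destination.
    out_a = src_a + dst_a * (1.0 - src_a)
    if out_a == 0.0:
        return (0.0, 0.0, 0.0), 0.0
    # Blend each color channel, weighting by the alphas and un-premultiplying.
    out_rgb = tuple(
        (s * src_a + d * dst_a * (1.0 - src_a)) / out_a
        for s, d in zip(src_rgb, dst_rgb)
    )
    return out_rgb, out_a

# A white UI element with alpha 0.5 composited over opaque black video:
rgb, a = source_over((1.0, 1.0, 1.0), 0.5, (0.0, 0.0, 0.0), 1.0)
print(rgb, a)  # an even mix of the two colors, fully opaque
```

An element with destination alpha 1.0, as in Darsa's opaque regions, simply replaces the video pixel; the 0.5 and 0.75 regions produce the partial translucency the Office action maps to "blurring processing."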

Prosecution Timeline

May 11, 2023: Application Filed
Feb 05, 2025: Non-Final Rejection — §103
May 14, 2025: Response Filed
Jun 02, 2025: Final Rejection — §103
Sep 08, 2025: Response after Non-Final Action
Nov 11, 2025: Request for Continued Examination
Nov 14, 2025: Response after Non-Final Action
Jan 26, 2026: Non-Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology:

Patent 12573149: DATA PROCESSING METHOD AND APPARATUS, DEVICE, COMPUTER-READABLE STORAGE MEDIUM, AND COMPUTER PROGRAM PRODUCT (2y 5m to grant; granted Mar 10, 2026)
Patent 12561875: ANIMATION FRAME DISPLAY METHOD AND APPARATUS, DEVICE, AND STORAGE MEDIUM (2y 5m to grant; granted Feb 24, 2026)
Patent 12494013: AUTODECODING LATENT 3D DIFFUSION MODELS (2y 5m to grant; granted Dec 09, 2025)
Patent 12456258: SYSTEMS AND METHODS FOR GENERATING A SHADOW MESH (2y 5m to grant; granted Oct 28, 2025)
Patent 12444020: FLEXIBLE IMAGE ASPECT RATIO USING MACHINE LEARNING (2y 5m to grant; granted Oct 14, 2025)

Study what changed to get past this examiner. Based on the 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 52%
With Interview: 83% (+31.0%)
Median Time to Grant: 2y 7m
PTA Risk: High

Based on 29 resolved cases by this examiner. Grant probability is derived from the career allow rate.
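As a sanity check, the headline projections follow from the examiner's career data quoted above (15 granted of 29 resolved, +31.0-point interview lift). Treating the lift as additive in percentage points is an assumption about how the tool combines the two figures.

```python
granted, resolved = 15, 29      # career data: 15 granted / 29 resolved
interview_lift = 0.31           # assumed additive lift, in percentage points

allow_rate = granted / resolved                       # ~0.517
with_interview = min(allow_rate + interview_lift, 1.0)

print(f"Grant probability: {allow_rate:.0%}")         # Grant probability: 52%
print(f"With interview: {with_interview:.0%}")        # With interview: 83%
```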
