Prosecution Insights
Last updated: April 19, 2026
Application No. 18/832,857

METHOD AND APPARATUS FOR PROCESSING EFFECT IMAGE, ELECTRONIC DEVICE, AND STORAGE MEDIUM

Non-Final OA §103
Filed: Jul 24, 2024
Examiner: NGUYEN, PHU K
Art Unit: 2616
Tech Center: 2600 — Communications
Assignee: Lemon Inc.
OA Round: 1 (Non-Final)

Grant Probability: 86% (Favorable)
Expected OA Rounds: 1-2
Time to Grant: 2y 10m
With Interview: 93%

Examiner Intelligence

Career Allow Rate: 86% (1019 granted / 1184 resolved; +24.1% vs TC avg; above average)
Interview Lift: +7.3% (moderate) for resolved cases with interview
Avg Prosecution: 2y 10m (typical timeline); 40 applications currently pending
Career History: 1224 total applications across all art units

Statute-Specific Performance

§101: 7.1% (-32.9% vs TC avg)
§103: 66.6% (+26.6% vs TC avg)
§102: 3.8% (-36.2% vs TC avg)
§112: 4.6% (-35.4% vs TC avg)
Baseline: Tech Center average estimate. Based on career data from 1184 resolved cases.

Office Action

§103
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.

Claims 1-3, 8-9, 19-22, 25, 27 and 29 are rejected under 35 U.S.C. 
103 as being unpatentable over FU et al (High Relief from Brush Painting) in view of Abraham et al (Fluid brush) and CHU et al (Detail-Preserving Paint Modeling for 3D Brushes). As per claim 1, Fu teaches the claimed “method for processing an effect image,” comprising: “obtaining an image to be processed” (Fu, Fig. 1 - Overview of the relief generation pipeline (e.g., input image)); “determining a brush model to be rendered that corresponds to the image to be processed” (Fu, 3 OVERVIEW OF PROPOSED RESEARCH - As shown in Fig. 1, a given painting is first decomposed into a set of layers in terms of several specified palette color values (see the decomposed layers in Fig. 1). Second, a new palette color value is determined based on the unclassified regions. Third, the layers are recomputed accordingly in an iterative way (see the refined layers in Fig. 1). To make the regions represent separate brushstrokes, they can be further merged or split. The key point is to extract the overlapped brushstrokes. Overlapped strokes make colors blend. To tackle it, layer decomposition is employed here, which decomposes the painting into a set of translucent layers. In brush paintings, mostly a brushstroke only utilizes a single palette color. Layer decomposition helps classify brushstrokes separately into different layers based on the palette color, so that every layer contains the strokes which are well separated); and “rendering, according to a drawing parameter, a line to be rendered in the brush model to be rendered” (Fu, page 3, 2nd column - This method combines user indications and shape inflation to model smooth bas-relief shapes from line drawings. However, our research aims at paintings with brushstrokes, which are different from 2D line drawings. 
Line drawing based approaches are limited to using information contained in a line drawing, while a brushstroke does not only contain contour lines but also delineates a region, which contains information such as color, texture, opacity. It is crucial to identify and extract brushstrokes from a painting; 3 Overview of the proposed Research - We introduce the edge tangent flow (ETF) field and the coherent line [13] to enhance such features in paintings, which are in favor of preserving the completeness of the strokes in every layer and effectively correct the errors due to wrong layer decomposition... To tackle this challenge, an inpainting technique is employed here. The coherent line is further involved in the MSERs algorithm [4] again for extracting strokes, which both preserves stroke continuity and removes spurious edges within one layer). It is noted that Fu does not explicitly teach "obtaining the effect image corresponding to the image to be processed"; however, from Fu's discussion and comparison to the prior art, Fu suggests the step of "obtaining the effect image" as claimed (Fu, 7 RESULTS AND ANALYSIS, figures 16 (a) and (c) - our approach can better preserve the thin strokes and the fine details of brushstrokes; figures 17(a) and (c) - the overlapped brushstrokes with different colors can be extracted by layer decomposition, in which the information of each brushstroke is retained; figures 18(a) and (c) - Instead of using the raw image as texture, the extracted brushstrokes with opacity values are mapped to the relief surface in our method, which can better show the features and inter-relations of brushstrokes on high relief) (see also Abraham, 4.1 Strokes and Touches - Since a stroke S is composed of a sequence of touches generated as point samples during stylus movement, an uneven distribution of touches could slow performance or lead to visual artifacts; Figure 8: The range of stylus controls based on pressure, tilt and brush stroke velocity, as well as 
a combination of these stroke controls; Figure 9: Effects in the Fluid Brush medium demonstrate the range of techniques artists discovered and used to create artwork in the Fluid Brush system; Chu, 3.1 Paint Simulation Pipeline - Figure 6: Imprinting a brush footprint along a stroke trajectory; Figure 12: User interface of our paint system). Thus, it would have been obvious, in view of Abraham and Chu, to configure Fu's method as claimed by applying the effect image according to the drawing line of the brush model. The motivation is to simulate the artwork of an artist using brush strokes in a digital environment.

Claim 2 adds into claim 1 "wherein the determining a brush model to be rendered that corresponds to the image to be processed comprises: determining, according to the image to be processed and a target contrast view, the brush model to be rendered" (Fu, 4.4 Brushstroke Completion - As shown in Fig. 4, some brushstrokes may have other opaque brushstrokes overlapped above, which bring about the gaps to break the brushstrokes within a layer. Obviously, to make brushstrokes complete and smooth, these gaps need to be filled. It is natural to involve user interventions, such as manual masks or sketches, which specify the gaps to be filled. Once the overlapped regions are determined, we employ the patch-based inpainting techniques [43] here, that is, within one layer, the gap is specified by a mask, while the patches are extracted from the outside of the gap in all layers and are utilized to create the patch dictionary). It is noted that in Applicant's Disclosure, the claimed "target contrast view" is defined in paragraph [0039] – "The target contrast view may be a view having a preset gray value. 
Optionally, the preset gray value may be 0, that is, the target contrast view is a pure black view or a pure color view with a gray value close to 0."

Claim 3 adds into claim 1 "wherein the determining a brush model to be rendered that corresponds to the image to be processed comprises: taking a drawn trajectory on a display interface as the image to be processed" (Abraham, 4 Fluid Brush System, Figure 3 - The stroke S consists of a cubically interpolated sequence of touches t and segments s. Particles generated in s are constrained within a perpendicular distance of the stroke width (r), which is linearly interpolated between segments, to create smooth variation in stroke width); and "obtaining two adjacent pause points in the drawn trajectory, and determining, according to attributes of the pause point and the image to be processed, the brush model to be rendered; wherein the attributes of the pause point comprise pause duration and a pause instant at the pause point" (Abraham, Figure 8 - Stroke density and form can also be controlled by stylus velocity. To create the stroke example "stroke velocity stylus control" in Figure 8, the artist used a slow brush speed for the wider part of the stroke before accelerating into a final, fast motion to draw the stroke's tail. This results in the start of the stroke maintaining a wispier appearance with slower motion that thins into a denser line with a faster rate of flow; 3 Design Study: Traditional Brush Controls - Additional effects include the pause in brush movement, which creates the distinctive end caps of a stroke, or using the follow through for a strong line that trails off based on pressure. 
Applying more pressure can create a blossom of ink, which can fill in a region with a solid ink presence, and the water saturation of the brush leads to gradient effects within a stroke… Some artists drew on calligraphic techniques to emulate the pause and follow through effects using similar brush controls to generate Fluid Brush's bleeding and calligraphic effects. In the "User Reaction" section, we show the visual appearance of these techniques and how artists incorporated them into art made within the Fluid Brush system). It is noted that Abraham's trajectory of the stroke S consists of a sequence of touches t and segments s, in which two consecutive touches ti and t(i+1) can be pause points associated with times ti and t(i+1) (e.g., Abraham, 6.1 User-generated Results - Bamboo relies almost entirely on bleeding to create the bold, segregated lines of the bamboo stalk. This emulates the pause used in ink painting to create the hard, stylized edges of this traditional painting subject). Thus, it would have been obvious, in view of Abraham and Chu, to configure Fu's method as claimed by applying the "pause" effect on the drawing line of the brush model. The motivation is to simulate the artwork of an artist using brush strokes in a digital environment. 
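The pause-point construct the rejection maps onto Abraham's touch sequence can be made concrete with a short sketch. This is an editorial illustration, not code from the application or the cited references; the touch format, function name, and the 0.25 s threshold are all assumptions.

```python
def find_pause_points(touches, min_pause_s=0.25):
    """touches: list of (x, y, t) samples from a stylus trajectory.

    Returns (pairs, durations): each pair holds two adjacent touches
    whose time gap exceeds min_pause_s (candidate pause points); the
    matching duration is that gap (the "pause duration" attribute).
    """
    pairs, durations = [], []
    for a, b in zip(touches, touches[1:]):
        dt = b[2] - a[2]  # time between adjacent touches
        if dt > min_pause_s:
            pairs.append((a, b))
            durations.append(dt)
    return pairs, durations
```

On a trajectory sampled every 50 ms, a 0.55 s gap between two adjacent touches would surface as one pause-point pair, with the gap itself as the pause duration and the touch timestamps as the pause instants.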
Claim 8 adds into claim 1, "wherein the drawing parameter comprises a drawing paint and a paint parameter, the drawing paint is a liquid paint, the liquid paint comprises ink or syrup" (Abraham, 4.1 Strokes and Touches - Since a stroke S is composed of a sequence of touches generated as point samples during stylus movement, an uneven distribution of touches could slow performance or lead to visual artifacts; Figure 9: Effects in the Fluid Brush medium demonstrate the range of techniques artists discovered and used to create artwork in the Fluid Brush system), and "the paint parameter comprises a paint depth of the liquid paint and a reflected light quantity of the paint consistent with the reflected light quantity information" (Abraham, 6.1 User-generated Results - Ghost takes a painterly scene, and uses soft strokes for the twinkling stars, while relying on more typical laminar strokes to generate swirling fog (which shows the "paint depth" (e.g., hidden behind the lady) of the liquid paint)). Thus, it would have been obvious, in view of Abraham and Chu, to configure Fu's method as claimed by applying the "liquid" paint effect on the drawing line of the brush model. The motivation is to simulate the artwork of an artist using brush strokes in a digital environment. 
Claim 9 adds into claim 8 "wherein the rendering, according to a drawing parameter, a line to be rendered in the brush model to be rendered, and obtaining the effect image corresponding to the image to be processed" (Abraham, 4.1 Strokes and Touches - Since a stroke S is composed of a sequence of touches generated as point samples during stylus movement, an uneven distribution of touches could slow performance or lead to visual artifacts) comprise: "determining a gray value of a pixel in the line to be rendered and an illumination parameter corresponding to the image to be processed" (Abraham, Figure 3 - Particles generated in s are constrained within a perpendicular distance of the stroke width (r), which is linearly interpolated between segments, to create smooth variation in stroke width); "retrieving a corresponding drawing paint and a corresponding paint parameter according to the gray value and the illumination parameter; and rendering a corresponding pixel based on the drawing paint and the paint parameter, and obtaining the effect image" (Abraham, Figure 8: The range of stylus controls based on pressure, tilt and brush stroke velocity, as well as a combination of these stroke controls). Thus, it would have been obvious, in view of Abraham and Chu, to configure Fu's method as claimed by applying the "liquid" paint effect on the drawing line of the brush model. The motivation is to simulate the artwork of an artist using brush strokes in a digital environment.

Claims 19-22, 27 and 29 claim an electronic device based on the method of claims 1-3, 8-9; therefore, they are rejected under a similar rationale. 
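Claim 9's retrieval of drawing parameters from a gray value, together with the claim 25 style interpolation between a dark-area and a light-area preset, can be sketched in a few lines. This is a hypothetical illustration, not the claimed implementation; the parameter names ("depth", "gloss") and the linear blend are assumptions.

```python
def blend_drawing_params(gray, dark_params, light_params):
    """Interpolate per-key between a dark-area and a light-area preset.

    gray: pixel gray value in [0, 255]; 0 selects the dark preset,
    255 the light preset; values in between blend linearly.
    """
    t = gray / 255.0
    return {k: (1 - t) * dark_params[k] + t * light_params[k]
            for k in dark_params}
```

For example, a mid-gray pixel would receive the average of the two presets' paint depth and gloss, so every gray value maps to its own drawing parameter set.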
Claim 25 adds into claim 22 "determine a dark color area and a light color area according to a gray value, and obtain a first predetermined drawing parameter to be fused that corresponds to the dark color area and a second predetermined drawing parameter to be fused that corresponds to the light color area" (Abraham, Figure 8 – the overlapped dark and light color areas; 6 User Reaction - (d) The blending effect is achieved by overlaying strokes from multiple brush types to create a unique set of textures and flow patterns. This is similar to dry brush in watercolor for creating variation in stroke texture); and "process the first drawing parameter to be fused and the second drawing parameter to be fused based on an interpolation operation, and determine the drawing parameter corresponding to a respective gray value" (Abraham, 4.1 Strokes and Touches - This space also allows us to smooth S's line quality by further subdividing the stroke using cubic interpolation. We determine subdivision placement by ensuring each subdivision segment is about r distance from its neighboring touches. For example, in Figure 3, if t0, t2 and t5 were the touches generated from the stylus, t1, t3 and t4 would be created during the subdivision process at regular intervals. Just as the positions of additional touches are based on cubic interpolation, so are their segment parameters (shown in Table 1) to maintain a smooth, visually appealing line quality). Thus, it would have been obvious, in view of Abraham and Chu, to configure Fu's method as claimed by applying the overlapped paint effect on the drawing line of the brush model. The motivation is to simulate the artwork of an artist using brush strokes in a digital environment.

Claims 24, 26 and 28 are rejected under 35 U.S.C. 
103 as being unpatentable over FU et al (High Relief from Brush Painting) in view of Abraham et al (Fluid brush) and CHU et al (Detail-Preserving Paint Modeling for 3D Brushes), and further in view of FAUL et al (Perceived roughness of glossy objects: The influence of Fresnel effects and correlated image statistics).

Claim 24 adds into claim 22 "determine the gray information of a corresponding pixel through a scalar multiplication operation on the width change rate" (Abraham, Figure 3 - Particles generated in s are constrained within a perpendicular distance of the stroke width (r), which is linearly interpolated between segments, to create smooth variation in stroke width). It is noted that Abraham does not teach "a corresponding Fresnel value"; however, Faul teaches that the Fresnel value is a well-known characteristic of the perceived surface of a painted object (Faul, page 2, Image generation with glossy surfaces - Besides the direction of the specularly reflected light, a second important aspect is the relative fraction of the incident light energy that is reflected in a certain direction. This fraction is described in optics by Fresnel's equations, which exist in two versions, one for metals and the other for nonconductors (dielectrics). Typical dielectrics are partially transparent materials such as glass, liquids, and plastics. In both classes of materials, Fresnel's equations depend on the refractive index and the angle of incidence of the light, but in very different ways). Thus, it would have been obvious, in view of Faul, Chu, and Abraham, to configure Fu's method as claimed by assigning a Fresnel value to the liquid paint's property in the appearance of the drawing line of the brush model. The motivation is to simulate the artwork of an artist using brush strokes in a digital environment. 
Claim 26 adds into claim 24 "determine reflected light quantity information corresponding to gray information under different illumination parameters" (Abraham, Figure 3 - Particles generated in s are constrained within a perpendicular distance of the stroke width (r), which is linearly interpolated between segments, to create smooth variation in stroke width); and "update, based on the reflected light quantity information, the drawing parameter corresponding to respective gray information" (Faul, page 2, Image generation with glossy surfaces - Besides the direction of the specularly reflected light, a second important aspect is the relative fraction of the incident light energy that is reflected in a certain direction. This fraction is described in optics by Fresnel's equations, which exist in two versions, one for metals and the other for nonconductors (dielectrics). Typical dielectrics are partially transparent materials such as glass, liquids, and plastics. In both classes of materials, Fresnel's equations depend on the refractive index and the angle of incidence of the light, but in very different ways). Thus, it would have been obvious, in view of Faul, Chu, and Abraham, to configure Fu's method as claimed by assigning a Fresnel value to the liquid paint's property in the appearance of the drawing line under different illumination parameters of the brush model. The motivation is to simulate the artwork of an artist using brush strokes in a digital environment.

Claim 28 adds into claim 26 "determine a gray value of a pixel in the line to be rendered and an illumination parameter corresponding to the image to be processed" (Faul, page 2, Image generation with glossy surfaces - Besides the direction of the specularly reflected light, a second important aspect is the relative fraction of the incident light energy that is reflected in a certain direction. 
This fraction is described in optics by Fresnel's equations, which exist in two versions, one for metals and the other for nonconductors (dielectrics). Typical dielectrics are partially transparent materials such as glass, liquids, and plastics. In both classes of materials, Fresnel's equations depend on the refractive index and the angle of incidence of the light, but in very different ways); "retrieve a corresponding drawing paint and a corresponding paint parameter according to the gray value and the illumination parameter" (Abraham, Figure 3 - Particles generated in s are constrained within a perpendicular distance of the stroke width (r), which is linearly interpolated between segments, to create smooth variation in stroke width); and "render a corresponding pixel based on the drawing paint and the paint parameter, and obtain the effect image" (Faul, Figure 2. Example stimuli. Three shapes "blob", "blob2", and "sphere" were used, as well as an "indoor" and an "outdoor" illumination. All stimuli shown are rendered with Fresnel-BRDF and lowest roughness). Thus, it would have been obvious, in view of Faul, Chu, and Abraham, to configure Fu's method as claimed by assigning a Fresnel value to the liquid paint's property in the appearance of the drawing line under different illumination parameters of the brush model. The motivation is to simulate the artwork of an artist using brush strokes in a digital environment.

Claims 4-7, 23 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims. 
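The dependence of Fresnel reflectance on refractive index and incidence angle that Faul describes is commonly approximated in computer graphics with Schlick's formula. A minimal sketch, offered as general background rather than code from the application or the cited references; the air-to-glass defaults for n1 and n2 are assumptions.

```python
def schlick_fresnel(cos_theta, n1=1.0, n2=1.5):
    """Schlick's approximation to the Fresnel reflectance of a dielectric.

    cos_theta: cosine of the angle of incidence (1.0 = head-on).
    n1, n2: refractive indices of the two media (defaults: air to glass).
    """
    f0 = ((n1 - n2) / (n1 + n2)) ** 2  # reflectance at normal incidence
    return f0 + (1.0 - f0) * (1.0 - cos_theta) ** 5
```

At head-on incidence the formula reduces to the normal-incidence reflectance f0, and it rises toward 1.0 at grazing angles, which is the angle dependence the Faul passage points to.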
The following is a statement of reasons for the indication of allowable subject matter: The allowable feature in claim 4 and its dependent claims 5-7 (or similarly, claim 23) is "determining a line width change rate according to width information of the line in the brush model to be processed; determining a Fresnel value of a pixel according to a normal vector of the pixel in the brush model to be processed and a camera normal vector corresponding to the brush model to be processed; determining, according to the Fresnel value of the pixel and a corresponding width change rate, gray information of the pixel in the brush model to be processed; and determining the drawing parameter according to the gray information."

Any inquiry concerning this communication or earlier communications from the examiner should be directed to PHU K NGUYEN whose telephone number is (571) 272-7645. The examiner can normally be reached M-F, 8am-5pm. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Daniel F. Hajnik, can be reached at (571) 272-7642. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. 
For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000. /PHU K NGUYEN/Primary Examiner, Art Unit 2616

Prosecution Timeline

Jul 24, 2024
Application Filed
Jan 10, 2026
Non-Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602147
ZOOM ACTION BASED IMAGE PRESENTATION
Granted Apr 14, 2026 (2y 5m to grant)
Patent 12602874
FRAGMENTATION MODEL GENERATION METHOD AND APPARATUS, AND DEVICE AND STORAGE MEDIUM
Granted Apr 14, 2026 (2y 5m to grant)
Patent 12602836
METHOD TO GENERATE DISPLACEMENT FOR SYMMETRY MESH
Granted Apr 14, 2026 (2y 5m to grant)
Patent 12599485
SYSTEMS AND METHODS FOR ORTHOPEDIC IMPLANTS
Granted Apr 14, 2026 (2y 5m to grant)
Patent 12597206
MECHANICAL WEIGHT INDEX MAPS FOR MESH RIGGING
Granted Apr 07, 2026 (2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 86% (93% with interview, +7.3% lift)
Median Time to Grant: 2y 10m
PTA Risk: Low
Based on 1184 resolved cases by this examiner. Grant probability derived from career allow rate.
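The headline projections reduce to simple arithmetic on the examiner's career figures shown on this page; a quick sketch for verification:

```python
granted, resolved = 1019, 1184      # career figures from this page
allow_rate = granted / resolved     # career allow rate, about 0.86
interview_lift = 0.073              # +7.3% lift with interview

grant_probability = round(allow_rate * 100)                    # shown as 86%
with_interview = round((allow_rate + interview_lift) * 100)    # shown as 93%
```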
