Prosecution Insights
Last updated: April 19, 2026
Application No. 18/753,347

Image Processing Method, Electronic Device, and Non-Transitory Readable Storage Medium

Non-Final Office Action (§102, §103)
Filed: Jun 25, 2024
Examiner: GE, JIN
Art Unit: 2619
Tech Center: 2600 — Communications
Assignee: Vivo Mobile Communication Co., Ltd.
OA Round: 1 (Non-Final)
Grant Probability: 80% (Favorable)
Expected OA Rounds: 1-2
Time to Grant: 2y 9m
Grant Probability with Interview: 98%

Examiner Intelligence

Career Allow Rate: 80% (416 granted / 520 resolved), +18.0% above the Tech Center average
Interview Lift: +18.0% among resolved cases with an interview
Typical Timeline: 2y 9m average prosecution; 38 applications currently pending
Career History: 558 total applications across all art units

Statute-Specific Performance

§101: 9.0% (-31.0% vs TC avg)
§103: 60.2% (+20.2% vs TC avg)
§102: 12.0% (-28.0% vs TC avg)
§112: 11.0% (-29.0% vs TC avg)
Black line = Tech Center average estimate • Based on career data from 520 resolved cases

Office Action

DETAILED ACTION

Claims 1-20 are pending in the present application.

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Priority

Acknowledgment is made of applicant's claim for foreign priority under 35 U.S.C. 119(a)-(d). The certified copy of China patent application number CN2021116398682 filed on 12/29/2021 has been received and made of record.

Information Disclosure Statement

The information disclosure statements (IDS) submitted on 06/25/2024 and 02/17/2025 are in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statements are being considered by the examiner.

Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

Claims 1-6, 9-14, and 17-20 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by U.S. PGPub 2013/0235069 to Ubillos et al.

Regarding claim 1, Ubillos et al. teach an image processing method (par 0004), wherein the method comprises: displaying a first image and a target control (Fig 3, par 0093-0095, “FIG. 3 illustrates an example GUI 300 of an image editing application of some embodiments that provides a set of UI controls for adjusting color values of only a portion of an image that is associated with a type of content at five different stages 305, 310, 315, 320, and 325”); receiving a first input to the target control by a user (Fig 3, par 0101, “The fourth stage 320 illustrates the GUI 300 after the user has begun adjusting the saturation of the image 355 by moving the knob of the UI control 352 to the right, as indicated by the arrow 362”, Fig 16, par 0187, “the process receives (at 1615) a user input on a particular UI control. As mentioned above, some of the UI controls are for adjusting color values of different portions of the image. In some embodiments, the application includes a set of pre-defined ranges of color values for the UI controls”); and in response to the first input, adjusting the first image to a target effect corresponding to the first input, wherein the target control is used for adjusting color or light shadow (Fig 3, par 0104, “The image editing application of some embodiments also provides a color adjustment UI control that adjusts the saturation of only a portion of an image”, Fig 16, par 0188-0189, “The process 1600 shows that in some embodiments, the process first identifies the set of pixels in the image that falls within the range of color values, and then performs the adjustment to the set of identified set of pixels”, par 0269, “The UI control 3130 (knob 3130) is for adjusting the shadows of the image, the UI controls 3135 and 3140 (knobs 3135 and 3140) are for adjusting the contrast of the image, the UI control 3145 (knob 3145) is for adjusting the brightness of the image, and the UI control 3150 (knob 3150) is for adjusting the highlights of the image”).

Regarding claim 2, Ubillos et al.
teach all the limitations of claim 1, and further teach wherein the target control is used for adjusting color, and before the adjusting the first image to a target effect corresponding to the first input, the method further comprises: obtaining a type of the first image; and the adjusting the first image to a target effect corresponding to the first input comprises: in a case that the first image is of a first type, adjusting, in a first manner, the first image to a first effect corresponding to the first input; and in a case that the first image is of a second type, adjusting, in a second manner, the first image to a second effect corresponding to the first input (par 0071-0073, par 0077, “the image editing application provides a set of UI controls for adjusting color values of only a portion of an image that is related to a type of content (e.g., sky, foliage, etc.) that is associated with a color range. When an input is received through a UI control for adjusting color values of a type of content on an image, the application automatically identifies a set of pixels in the image that are associated with the type of content. … the set of UI controls includes a UI control for adjusting color values of only the sky colors in an image. In these embodiments, the application first identifies a set of pixels in the image with color values that fall within a pre-defined range of sky color values. The application then applies a color adjustment to only the identified set of pixels. In addition to adjusting color values of sky colors, the set of UI controls in some embodiments also include a UI control for adjusting color values of only the foliage colors in the image”, par 0091, “provides a set of UI controls for adjusting color values of only a portion of an image that is associated with a type of content. In some embodiments, each UI control is for adjusting color values related to a different type of content in the image. When an input is received through a particular UI control, the application automatically identifies a set of pixels in the image that are related to the type of content that is controlled by the particular UI control. The application then adjusts only the color values of the identified set of pixels based on the user input.”, par 0185-0189, “The process begins by performing (at 1605) a content analysis on the image … The process then adjusts (at 1630) the color values of the set of pixels that have been identified at operation 1625. The process 1600 shows that in some embodiments, the process first identifies the set of pixels in the image that falls within the range of color values, and then performs the adjustment to the set of identified set of pixels”; that is, identifying different contents in the image and providing different color adjustments for these contents).

Regarding claim 3, Ubillos et al. teach all the limitations of claim 2, and further teach wherein the adjusting, in a first manner, the first image to a first effect corresponding to the first input comprises: determining a preset channel of the first image; adjusting a first color of the preset channel to a first target color value corresponding to the first input; and adjusting a second color of the first image to a second target color value corresponding to the first input; wherein the first effect is indicated by the first target color value and the second target color value (par 0008, “in response to the user input, the application of some embodiments may perform a saturation adjustment, a contrast adjustment, and a brightness adjustment to the color values of the image”, par 0119, “FIG. 4 illustrates an example operation of adjusting the saturation of an image. Instead of adjusting the saturation of an image, some embodiments provide a skin-tone UI control that allows the user to adjust the color temperature of an image.
A color temperature is a characteristic of visible light that reflects off of the objects in the image. A warmer light that is hitting the objects in the image creates a warmer color tone (i.e., more red and yellow) to the colors of the objects in the image while a cooler light that is hitting the objects in the image creates a cooler color tone (i.e., more blue and cyan) to the colors of the objects in the image. Thus, adjusting the color temperature of an image means adding more red/yellow or adding more cyan/blue to the image”, par 0185-0190, “ the process retrieves the range of color values associated with the particular UI control from the media storage and identifies the pixels with color values that fall within the range of color values. In other embodiments, the ranges of color values are defined within the executable codes for performing the color adjustments. In these other embodiments, the identification operation is executed at the same time as the process performs the color adjustments to the image ….the process then adjusts (at 1630) the color values of the set of pixels that have been identified at operation 1625. The process 1600 shows that in some embodiments, the process first identifies the set of pixels in the image that falls within the range of color values, and then performs the adjustment to the set of identified set of pixels”, par 0193, “the application of some other embodiments performs more than one type of adjustment to the color values of the image in response to a single user input on a particular UI control. For example, when a user provides an input to the sky UI control 354, the application of some embodiments performs a saturation adjustment, a contrast adjustment, and a brightness adjustment to the color values of a portion of the image. 
In these embodiments, the application uses the single user input to determine an adjustment value for adjusting contrast, an adjustment value for adjusting the saturation, and an adjustment value for adjusting brightness for the image, and applies these separate adjustment values to the color values of the image”).

Regarding claim 4, Ubillos et al. teach all the limitations of claim 3, and further teach wherein the preset channel comprises a first channel, a second channel, and a third channel; and the adjusting a first color of the preset channel to a first target color value corresponding to the first input comprises at least one of the following: adjusting a first color of the first channel to a first sub-color value corresponding to the first channel in the first target color value; adjusting a first color of the second channel to a second sub-color value corresponding to the second channel in the first target color value; or adjusting a first color of the third channel to a third sub-color value corresponding to the third channel in the first target color value (par 0008, “in response to the user input, the application of some embodiments may perform a saturation adjustment, a contrast adjustment, and a brightness adjustment to the color values of the image”, par 0119, “FIG. 4 illustrates an example operation of adjusting the saturation of an image. Instead of adjusting the saturation of an image, some embodiments provide a skin-tone UI control that allows the user to adjust the color temperature of an image. A color temperature is a characteristic of visible light that reflects off of the objects in the image. A warmer light that is hitting the objects in the image creates a warmer color tone (i.e., more red and yellow) to the colors of the objects in the image while a cooler light that is hitting the objects in the image creates a cooler color tone (i.e., more blue and cyan) to the colors of the objects in the image. Thus, adjusting the color temperature of an image means adding more red/yellow or adding more cyan/blue to the image”, par 0185-0190, “the process retrieves the range of color values associated with the particular UI control from the media storage and identifies the pixels with color values that fall within the range of color values. In other embodiments, the ranges of color values are defined within the executable codes for performing the color adjustments. In these other embodiments, the identification operation is executed at the same time as the process performs the color adjustments to the image … the process then adjusts (at 1630) the color values of the set of pixels that have been identified at operation 1625. The process 1600 shows that in some embodiments, the process first identifies the set of pixels in the image that falls within the range of color values, and then performs the adjustment to the set of identified set of pixels”, par 0193, “the application of some other embodiments performs more than one type of adjustment to the color values of the image in response to a single user input on a particular UI control. For example, when a user provides an input to the sky UI control 354, the application of some embodiments performs a saturation adjustment, a contrast adjustment, and a brightness adjustment to the color values of a portion of the image. In these embodiments, the application uses the single user input to determine an adjustment value for adjusting contrast, an adjustment value for adjusting the saturation, and an adjustment value for adjusting brightness for the image, and applies these separate adjustment values to the color values of the image”; that is, adjusting color temperature, saturation, brightness, contrast, and so on).

Regarding claim 5, Ubillos et al.
teach all the limitations of claim 2, and further teach wherein the adjusting, in a second manner, the first image to a second effect corresponding to the first input comprises: determining a main color channel, an auxiliary color channel, and a variegated color channel of the first image; adjusting a third color of a target channel to a third target color value corresponding to the first input, wherein the target channel is the main color channel or the variegated color channel; adjusting a third color of the auxiliary color channel to a fourth target color value corresponding to the first input; and adjusting a fourth color of the first image to a fifth target color value corresponding to the first input; wherein the second effect is indicated by the third target color value, the fourth target color value, and the fifth target color value (par 0008, “in response to the user input, the application of some embodiments may perform a saturation adjustment, a contrast adjustment, and a brightness adjustment to the color values of the image”, par 0119, “FIG. 4 illustrates an example operation of adjusting the saturation of an image. Instead of adjusting the saturation of an image, some embodiments provide a skin-tone UI control that allows the user to adjust the color temperature of an image. A color temperature is a characteristic of visible light that reflects off of the objects in the image. A warmer light that is hitting the objects in the image creates a warmer color tone (i.e., more red and yellow) to the colors of the objects in the image while a cooler light that is hitting the objects in the image creates a cooler color tone (i.e., more blue and cyan) to the colors of the objects in the image. Thus, adjusting the color temperature of an image means adding more red/yellow or adding more cyan/blue to the image”, par 0123, par 0127, “the image editing application of some embodiments also provides a UI control for adjusting only a portion of the image (e.g., the sky colors or the foliage colors of the image)”, par 0185-0190, “the process retrieves the range of color values associated with the particular UI control from the media storage and identifies the pixels with color values that fall within the range of color values. In other embodiments, the ranges of color values are defined within the executable codes for performing the color adjustments. In these other embodiments, the identification operation is executed at the same time as the process performs the color adjustments to the image … the process then adjusts (at 1630) the color values of the set of pixels that have been identified at operation 1625. The process 1600 shows that in some embodiments, the process first identifies the set of pixels in the image that falls within the range of color values, and then performs the adjustment to the set of identified set of pixels”, par 0193, “the application of some other embodiments performs more than one type of adjustment to the color values of the image in response to a single user input on a particular UI control. For example, when a user provides an input to the sky UI control 354, the application of some embodiments performs a saturation adjustment, a contrast adjustment, and a brightness adjustment to the color values of a portion of the image. In these embodiments, the application uses the single user input to determine an adjustment value for adjusting contrast, an adjustment value for adjusting the saturation, and an adjustment value for adjusting brightness for the image, and applies these separate adjustment values to the color values of the image”; that is, adjusting sky color, foliage color, and skin tone).

Regarding claim 6, Ubillos et al.
teach all the limitations of claim 2, and further teach wherein the adjusting, in a second manner, the first image to a second effect corresponding to the first input comprises: determining a target color temperature adjustment amount corresponding to a target hue range in two color temperature adjustment amounts corresponding to the first input, wherein the target hue range is a hue range in which a hue of a main color of the first image is located; and adjusting a color temperature of the first image based on the target color temperature adjustment amount; wherein the second effect is indicated by an after-adjustment color temperature of the first image (par 0004, “the image editing application provides a set of UI controls for adjusting color values of only a portion of an image that is related to a type of content (e.g., sky, foliage, etc.) that is associated with a color range”, par 0077, “The application of some embodiments first defines different ranges of color values to be associated with different types of contents. The application then determines whether the color values of a set of pixels that corresponds to the selected location in the image fall within a range of color values associated with a particular type of content, and displays a set of on-image UI controls that is associated with the particular type of content”, par 0118-0120, “the application is able to identify colors that fall within the defined skin-tone color area 560 (e.g., color 555) and colors that do not fall within the defined skin-tone color region 560 (e.g., color 545) … Instead of adjusting the saturation of an image, some embodiments provide a skin-tone UI control that allows the user to adjust the color temperature of an image. A color temperature is a characteristic of visible light that reflects off of the objects in the image”, par 0123, “the application adjust the color temperature of the image in the YIQ color space by adjusting only the values along the red/green color component and the yellow/blue color component without changing the values along the white/black component. In some embodiments, the application adjusts the color temperature of the image in the YIQ color space because applying adjustments in the YIQ color space instead of the original color space of the color values of the image (e.g., the RGB color space) creates a more visibly pleasing results”, par 0126, “the application uses the same technique as described above to identify pixels in the image with color values that fall within a pre-defined range of color values that is associated with skin-tone colors. The application then only adjusts the color temperature of the identified pixels within the image”).

Regarding claim 9, Ubillos et al. teach an electronic device, comprising a processor, a memory, and a program or instructions stored in the memory and executable on the processor, wherein the program or the instructions, when executed by the processor, cause the electronic device to perform (par 0089, par 0392-0393). The remaining limitations of the claim are similar in scope to claim 1 and are rejected under the same rationale.

Regarding claims 10-14, Ubillos et al. teach all the limitations of claim 9; claims 10-14 are similar in scope to claims 2-6 and are rejected under the same rationale.

Regarding claim 17, Ubillos et al. teach an electronic device, comprising a processor, a memory, and a program or instructions stored in the memory and executable on the processor, wherein the program or the instructions, when executed by the processor, cause the electronic device to perform (par 0392-0393). The remaining limitations of the claim are similar in scope to claim 1 and are rejected under the same rationale.
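The selective adjustment Ubillos is cited for throughout these rejections (identify the set of pixels whose color values fall within a pre-defined range, such as sky colors, then adjust only that set) can be sketched roughly as follows. This is an illustrative sketch only, not code from the application or the reference; the HSV array layout, hue bounds, and function name are assumptions for illustration.

```python
import numpy as np

def adjust_color_range(image_hsv, hue_lo, hue_hi, saturation_gain):
    """Adjust saturation only for pixels whose hue lies in [hue_lo, hue_hi].

    image_hsv: float array of shape (H, W, 3) holding hue in degrees
    [0, 360), saturation in [0, 1], and value in [0, 1].
    """
    out = np.asarray(image_hsv, dtype=np.float32).copy()
    hue = out[..., 0]
    # Step 1: identify the set of pixels that falls within the color range.
    mask = (hue >= hue_lo) & (hue <= hue_hi)
    # Step 2: perform the adjustment only on the identified set of pixels.
    out[..., 1][mask] = np.clip(out[..., 1][mask] * saturation_gain, 0.0, 1.0)
    return out
```

A "sky" control of the kind described could map its knob position to saturation_gain and use a pre-defined blue hue range (e.g., 180 to 260 degrees), leaving all other pixels untouched.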
Regarding claims 18-20, Ubillos et al. teach all the limitations of claim 17; claims 18-20 are similar in scope to claims 2-4 and are rejected under the same rationale.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 7 and 15 are rejected under 35 U.S.C. 103 as being unpatentable over U.S. PGPub 2013/0235069 to Ubillos et al. in view of U.S. PGPub 2006/0028483 to Kondo et al.

Regarding claim 7, Ubillos et al. teach all the limitations of claim 5, and further teach wherein the determining a main color channel, an auxiliary color channel, and a variegated color channel of the first image comprises: determining a main color and an auxiliary color of the first image; determining all unit color channels corresponding to a first preset range as the main color channel; determining all unit color channels corresponding to a second preset range as the auxiliary color channel; and determining an unit color channel other than the main color channel and the auxiliary color channel in the color wheel as the variegated color channel (par 0008, “in response to the user input, the application of some embodiments may perform a saturation adjustment, a contrast adjustment, and a brightness adjustment to the color values of the image”, par 0119, “FIG. 4 illustrates an example operation of adjusting the saturation of an image.
Instead of adjusting the saturation of an image, some embodiments provide a skin-tone UI control that allows the user to adjust the color temperature of an image. A color temperature is a characteristic of visible light that reflects off of the objects in the image. A warmer light that is hitting the objects in the image creates a warmer color tone (i.e., more red and yellow) to the colors of the objects in the image while a cooler light that is hitting the objects in the image creates a cooler color tone (i.e., more blue and cyan) to the colors of the objects in the image. Thus, adjusting the color temperature of an image means adding more red/yellow or adding more cyan/blue to the image”, par 0123, par 0127, “the image editing application of some embodiments also provides a UI control for adjusting only a portion of the image (e.g., the sky colors or the foliage colors of the image)”, par 0185-0190, “ the process retrieves the range of color values associated with the particular UI control from the media storage and identifies the pixels with color values that fall within the range of color values. In other embodiments, the ranges of color values are defined within the executable codes for performing the color adjustments. In these other embodiments, the identification operation is executed at the same time as the process performs the color adjustments to the image ….the process then adjusts (at 1630) the color values of the set of pixels that have been identified at operation 1625. The process 1600 shows that in some embodiments, the process first identifies the set of pixels in the image that falls within the range of color values, and then performs the adjustment to the set of identified set of pixels”, par 0193, “the application of some other embodiments performs more than one type of adjustment to the color values of the image in response to a single user input on a particular UI control. 
For example, when a user provides an input to the sky UI control 354, the application of some embodiments performs a saturation adjustment, a contrast adjustment, and a brightness adjustment to the color values of a portion of the image. In these embodiments, the application uses the single user input to determine an adjustment value for adjusting contrast, an adjustment value for adjusting the saturation, and an adjustment value for adjusting brightness for the image, and applies these separate adjustment values to the color values of the image”; that is, adjusting sky color, foliage color, and skin tone), but are silent as to determining all unit color channels corresponding to a first preset angle range in a color wheel as the main color channel, wherein a center of the first preset angle range coincides with a first position of the main color in the color wheel; determining all unit color channels corresponding to a second preset angle range in the color wheel as the auxiliary color channel, wherein a center of the second preset angle range coincides with a second position of the auxiliary color in the color wheel; and determining an unit color channel other than the main color channel and the auxiliary color channel in the color wheel as the variegated color channel.

In a related endeavor, Kondo et al.
teach determining all unit color channels corresponding to a preset angle range in the color wheel as a color channel, wherein a center of the second preset angle range coincides with a second position of the color in the color wheel (par 0011-0012, “a user interface unit including a color designation part enabling selective designation of any color from among a plurality of colors, and an adjustment degree designation part able to instruct the degree of adjustment of at least attributes of the color designated by the color designation part, and a control unit for instructing the image generation unit so as to generate an image able to display at least a color designated by the color designation part of the user interface”, Figs 5-6, par 0064-0070, “the palette 1716 is displayed divided into six color systems, specifically, as shown FIG. 6, a red color system (R), magenta color system (M), yellow color system (Y), green color system (G), blue color system (B), and cobalt color system (C). Basically, the palette 1716 displays the six color systems divided into ranges of 60 degrees each, but this can be freely changed. Further, when the position serving as the axis of the color to be displayed is further designated from the color system range by the color position instruction part 1712, the palette 1716 moves (rotates) about the color axis CAX in accordance with a designation operation as instructed by a broken line in for example FIG. 6. The information concerning the color system, axis of the color, range, saturation, and hue designated by the user interface unit 17 having such the GUI function is supplied to the CPU 16. The CPU 16 can display the color designated by the color designation part 1711 and instructs the adjustment unit 13 including the output image generation unit 123 to add the RCP display screen 171 as the graphical instruction region so as to generate the image. 
When receiving the instructions of the color position instruction part 1712, the range designation part 1713, the saturation designation part 1714, and/or hue designation part 1715 of the RCP display screen 171, the CPU 16 instructs the adjustment unit 13 including the output image generation unit 133 to display the color instructed by the color designation part 1711 at the position and/or within the range of the designated color and with the designated saturation and/or hue. The CPU 16 instructs the adjustment unit 13 including the output image generation unit 133 to display in color an image showing only the color system designated by the color designation part 1711 and the palette 1716 when the color designation part 1711, the color position instruction part 1712, and the range designation part 1713 are selected. The CPU 16 generates the image at the output image generation unit 133 under the control of the CPU 16 to display an image showing the color systems other than the color system designated by the color designation part 1711 and the palette 1716 (all color systems displayed) when the saturation designation part 1714 and the hue designation part 1715 are selected”).

It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to modify Ubillos et al. to include determining all unit color channels corresponding to a preset angle range in the color wheel as a color channel, wherein a center of the preset angle range coincides with a position of the color in the color wheel, as taught by Kondo et al. Applying the color wheel of Kondo et al. to determine the color channels of a main color, an auxiliary color, and a variegated color of the first image separately, as taught by Ubillos et al., teaches wherein the determining a main color channel, an auxiliary color channel, and a variegated color channel of the first image comprises: determining a main color and an auxiliary color of the first image; determining all unit color channels corresponding to a first preset angle range in a color wheel as the main color channel, wherein a center of the first preset angle range coincides with a first position of the main color in the color wheel; determining all unit color channels corresponding to a second preset angle range in the color wheel as the auxiliary color channel, wherein a center of the second preset angle range coincides with a second position of the auxiliary color in the color wheel; and determining an unit color channel other than the main color channel and the auxiliary color channel in the color wheel as the variegated color channel. The motivation would have been to visually determine which colors of the image need adjustment in order to reproduce the preferred colors and improve image quality during editing.

Regarding claim 15, Ubillos et al. teach all the limitations of claim 13; claim 15 is similar in scope to claim 7 and is rejected under the same rationale.

Claims 8 and 16 are rejected under 35 U.S.C. 103 as being unpatentable over U.S. PGPub 2013/0235069 to Ubillos et al. in view of U.S. PGPub 2006/0028483 to Kondo et al., further in view of China PGPub CN109427036 to Zhang et al.

Regarding claim 8, Ubillos et al. as modified by Kondo et al.
teach all the limitations of claim 7, but are silent as to teaching wherein the determining a main color and an auxiliary color of the first image comprises: determining a color with a largest sum of saturation and brightness in colors of the first image as the main color; determining at least one color region in the first image based on the main color and the color wheel, wherein each color region corresponds to one color, and an included angle between a position of a color corresponding to each color region in the color wheel and the first position is greater than or equal to a preset angle; and determining a color corresponding to a color region with a largest area in the at least one color region as the auxiliary color.

In a related endeavor, Zhang et al. teach wherein the determining a main color and an auxiliary color of the first image comprises: determining a color with a largest sum of saturation and brightness in colors of the first image as the main color; determining at least one color region in the first image based on the main color and the color wheel, wherein each color region corresponds to one color, and an included angle between a position of a color corresponding to each color region in the color wheel and the first position is greater than or equal to a preset angle; and determining a color corresponding to a color region with a largest area in the at least one color region as the auxiliary color (par 0065, “the parameter value of the auxiliary color is determined according to the hue parameter value of the primary color, and the skin color is determined and output according to the parameter value of the primary color and the parameter value of the auxiliary color; because the auxiliary color matched with the main color can be obtained according to the color matching principle, for example, the matching between the main color and the auxiliary color may include: the auxiliary color is similar to the main color, or the auxiliary color is
complementary to the main color, and the like, so that the matching between the auxiliary color and the main color can form a skin color with more harmonious color matching and/or a skin color with more impact color matching, and the color quality of the skin can be improved”, par 0089-0091, “determining a parameter value of a dominant color; the parameter values for the dominant color may include hue parameter values. Optionally, the parameter values for characterizing the dominant color may comprise a set of parameter values, for example, a hue parameter value indicating a corresponding spectral color on the color circle, a saturation parameter value indicating a degree of proximity to the spectral color, and a brightness parameter value indicating a degree of brightness of the color may be comprised in the set of parameter values. The skin of the application may typically include a plurality of colors, which may include a primary color that is the primary color of the skin and at least one secondary color that is a color used to co-ordinate with the primary color. The parameter values of the primary color correspond to the primary color and the parameter values of the secondary color correspond to the secondary color“, Fig 3, par 0095-0098, “after determining the hue parameter value of the primary color, the hue parameter value of the secondary color can be determined according to the hue parameter value of the primary color by combining the color circle. That is, a certain angle and/or angle range corresponding to the auxiliary color on the color wheel is determined, so that the range of the auxiliary color is consistent with the similar color and/or the complementary color of the main color, and the certain angle corresponding to the color wheel of the auxiliary color is the hue parameter value in the parameter values of the auxiliary color. 
In addition, the parameter values of the auxiliary color may also include a saturation parameter value and a brightness parameter value, and thus, the preset saturation parameter value and brightness parameter value of the auxiliary color may be combined to obtain the parameter values of the auxiliary color” — i.e., determining the main color and auxiliary color based on a saturation parameter value and a brightness parameter value using the color wheel).

It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify Ubillos et al., as modified by Kondo et al., to include wherein the determining a main color and an auxiliary color of the first image comprises: determining a color with a largest sum of saturation and brightness in colors of the first image as the main color; determining at least one color region in the first image based on the main color and the color wheel, wherein each color region corresponds to one color, and an included angle between a position of a color corresponding to each color region in the color wheel and the first position is greater than or equal to a preset angle; and determining a color corresponding to a color region with a largest area in the at least one color region as the auxiliary color, as taught by Zhang et al., to determine the parameter value of the auxiliary color according to the hue parameter value of the main color, thereby improving the color-matching quality of skin color and the efficiency of skin processing.

Regarding claim 16, Ubillos et al. as modified by Kondo et al. teach all the limitations of claim 15. Claim 16 is similar in scope to claim 8 and is rejected under the same rationale.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to Jin Ge, whose telephone number is (571) 272-5556. The examiner can normally be reached 8:00 to 5:00.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Jason Chan, can be reached at (571) 272-3022. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/JIN GE/ Primary Examiner, Art Unit 2619
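The channel-partition logic recited in claim 7 and the main/auxiliary-color selection recited in claim 8 can be sketched in code. This is a minimal illustration only, assuming hues are expressed as degrees on a 360° color wheel and that per-color statistics (hue position, saturation, brightness, region area) have already been computed; the function names, the default angle values, and the data layout are assumptions for illustration, not the applicant's actual implementation.

```python
def angular_distance(a, b):
    """Smallest angle between two hue positions on a 360-degree color wheel."""
    d = abs(a - b) % 360.0
    return min(d, 360.0 - d)

def partition_channels(main_hue, aux_hue, unit_channels, preset_range=60.0):
    """Claim 7 sketch: unit color channels whose wheel position lies within
    preset_range (centered on the main or auxiliary color's position) form
    the main or auxiliary channel; all remaining unit channels form the
    variegated channel. preset_range=60 is an assumed value."""
    main, aux, varieg = [], [], []
    for hue in unit_channels:
        if angular_distance(hue, main_hue) <= preset_range / 2:
            main.append(hue)
        elif angular_distance(hue, aux_hue) <= preset_range / 2:
            aux.append(hue)
        else:
            varieg.append(hue)
    return main, aux, varieg

def pick_main_and_auxiliary(color_stats, preset_angle=90.0):
    """Claim 8 sketch: the main color maximizes the sum of saturation and
    brightness; the auxiliary color is the largest-area color region whose
    wheel position is at least preset_angle away from the main color."""
    main = max(color_stats, key=lambda c: c["saturation"] + c["brightness"])
    candidates = [c for c in color_stats
                  if angular_distance(c["hue"], main["hue"]) >= preset_angle]
    aux = max(candidates, key=lambda c: c["area"]) if candidates else None
    return main, aux
```

For example, with twelve 30°-wide unit channels, a main color at 0° and an auxiliary color at 180°, the channels near 0° and 180° are assigned to the main and auxiliary channels and the rest become the variegated channel; the claim language leaves the channel granularity and angle widths to preset values, which is why they appear here as parameters.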

Prosecution Timeline

Jun 25, 2024: Application Filed
Mar 08, 2026: Non-Final Rejection under §102 and §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12592024: QUANTIFICATION OF SENSOR COVERAGE USING SYNTHETIC MODELING AND USES OF THE QUANTIFICATION (granted Mar 31, 2026; 2y 5m to grant)
Patent 12586296: METHODS AND PROCESSORS FOR RENDERING A 3D OBJECT USING MULTI-CAMERA IMAGE INPUTS (granted Mar 24, 2026; 2y 5m to grant)
Patent 12579704: VIDEO GENERATION METHOD AND APPARATUS, DEVICE, AND STORAGE MEDIUM (granted Mar 17, 2026; 2y 5m to grant)
Patent 12573164: DESIGN DEVICE, PRODUCTION METHOD, AND STORAGE MEDIUM STORING DESIGN PROGRAM (granted Mar 10, 2026; 2y 5m to grant)
Patent 12573151: PERSONALIZED DEFORMABLE MESH BY FINETUNING ON PERSONALIZED TEXTURE (granted Mar 10, 2026; 2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 80%
With Interview: 98% (+18.0%)
Median Time to Grant: 2y 9m
PTA Risk: Low
Based on 520 resolved cases by this examiner. Grant probability derived from career allow rate.
