Prosecution Insights
Last updated: April 19, 2026
Application No. 18/696,662

SYSTEMS AND METHOD FOR SKIN COLOR DETERMINATION

Status: Non-Final OA (§102, §103)
Filed: Mar 28, 2024
Examiner: WOLFSON, ETHAN NOAH
Art Unit: 2673
Tech Center: 2600 (Communications)
Assignee: Fitskin Inc.
OA Round: 1 (Non-Final)
Grant Probability: Favorable
Estimated OA Rounds: 1-2
Estimated Time to Grant: 2y 9m

Examiner Intelligence

Career Allow Rate: 0% (0 granted / 0 resolved; -62.0% vs TC average)
Interview Lift: +0.0% (minimal; measured across resolved cases with interview)
Avg Prosecution: 2y 9m (typical timeline)
Career History: 15 total applications across all art units, all currently pending

Statute-Specific Performance

§101: 14.3% (-25.7% vs TC avg)
§103: 51.4% (+11.4% vs TC avg)
§102: 20.0% (-20.0% vs TC avg)
§112: 8.6% (-31.4% vs TC avg)

Tech Center average is an estimate. Based on career data from 0 resolved cases.
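As a consistency check, the four "vs TC avg" deltas above all point at a single implied Tech Center baseline. A minimal sketch, assuming each delta is a plain difference (statute rate minus TC average):

```python
# Sketch: recover the implied Tech Center baseline from the
# statute-specific rates and their "vs TC avg" deltas above.
# Assumes each delta is a simple difference: rate - tc_avg.
rates = {"101": 14.3, "103": 51.4, "102": 20.0, "112": 8.6}
deltas = {"101": -25.7, "103": 11.4, "102": -20.0, "112": -31.4}

implied_tc_avg = {s: round(rates[s] - deltas[s], 1) for s in rates}
print(implied_tc_avg)  # every statute implies the same 40.0% baseline
```

Under that assumption the report's numbers are internally consistent: all four statutes imply a 40.0% Tech Center average.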

Office Action

§102, §103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Specification Objections

The specification is objected to because of the following informality: in paragraph [0027], line 1, “in system 102…” should read “in system …” in order to avoid a typographical issue. Appropriate correction is required.

Claim Objections

Claims 2-12, 14-20, 25-27, and 29-31 are objected to because of the following informalities:

- In each of claims 2-12, 14-20, 26-27, and 29-31, line 1, “the system of claim [N] wherein” should read “the system of claim [N], wherein”; a comma is missing after the claim reference.
- In claim 7, line 2, “a desired speed and a desired” should read “a desired speed, and a desired”.
- In claim 11, line 3, “the user skin color and the transposed” should read “the user skin color, and the transposed”.
- In claims 5 (line 4), 17 (line 4), 25 (line 9), and 29 (line 3), “below a image” should read “below an image”.
- In claims 4 (line 4), 5 (line 4), 16 (line 4), 17 (line 4), 25 (line 9), and 29 (line 2), “if the average pixel” should read “when the average pixel” so that the subject matter is positively recited, since applicant’s disclosure (specification dated 03/28/2024, e.g., paragraph [0032]) states an alternative possibility of the steps not being performed. See MPEP 2111.04.

Appropriate correction is required.

Claim Rejections - 35 USC § 102

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of the appropriate paragraphs of 35 U.S.C.
102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless –

(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.

Claims 1, 4-5, 13, 16-17, 25, and 28-29 are rejected under 35 U.S.C. 102(a)(1)/(a)(2) as being anticipated by GUPTA et al. (US 20190180083 A1), hereinafter referenced as GUPTA.

Regarding claim 1, GUPTA explicitly teaches a system for user skin color determination of a user (Fig. 3, #314 called a color determination module. Paragraph [0049]), the system comprising: a skin analysis assembly (Fig. 3, #314 called a color determination module. Paragraph [0049]-GUPTA discloses the color determination module 314 represents functionality to identify skin colors from a detected face.), configured to: obtain a user skin image of the user (Fig. 3. Paragraph [0040]-GUPTA discloses the object recognition module 116 and the skin selection module 118 in this example are incorporated as part of a system to generate data effective to select a portion of a digital image 302 that corresponds to exposed skin of a person depicted in the image.), comprising one or more user skin image pixels (Fig. 3. Paragraph [0049]-GUPTA discloses the color determination module 314 represents functionality to identify skin colors from a detected face. In particular, the detected face data 320 indicates pixels that correspond to the skin of detected faces.); perform color processing on the user skin image to arrive at a user skin color (Fig. 3. Paragraph [0049]-GUPTA discloses the color determination module 314 represents functionality to identify skin colors from a detected face. In particular, the detected face data 320 indicates pixels that correspond to the skin of detected faces and the color determination module 314 determines colors of these pixels.); and output a result of the color processing (Fig. 3. Paragraph [0049]-GUPTA discloses the color determination module 314 memorializes these determinations by producing the face color data 324, which indicates the skin colors identified by the color determination module 314. The face color data 324 may indicate the identified colors in a variety of ways without departing from the spirit or scope of the described techniques. By way of example, the color determination module 314 may generate the face color data 324 as a list of colors identified in the depicted face skin (e.g., a list of different RGB values), as a range of colors identified (e.g., a range of RGB values), and so forth. The skin colors indicated by the face color data 324 are then used to identify exposed skin in the digital image.), the result comprising the user skin color (Fig. 3. Paragraph [0049]-GUPTA discloses the color determination module 314 memorializes these determinations by producing the face color data 324, which indicates the skin colors identified by the color determination module 314.).

Regarding claim 4, GUPTA explicitly teaches the system of claim 1, and further explicitly teaches wherein performing color processing further comprises (Fig. 3. Paragraph [0049]-GUPTA discloses the color determination module 314 represents functionality to identify skin colors from a detected face. In particular, the detected face data 320 indicates pixels that correspond to the skin of detected faces and the color determination module 314 determines colors of these pixels.): determining an average pixel condition color score for the user skin image pixels in the user skin image (Fig. 1.
Paragraph [0054]-GUPTA discloses the skin selection module 118 computes contours of the portions of the digital image 302 that are indicated by the initially generated skin selection data. The skin selection module 118 may compute an average intensity of these contours.); and if the average pixel condition color score is below an image condition color threshold then (Figs. 1 and 3. Paragraph [0055]-GUPTA discloses the skin selection module 118 identifies portions of the grayscale image that have an average intensity less than an intensity threshold): calculating a per-pixel condition color threshold (Figs. 1 and 3. Paragraph [0053]-GUPTA discloses the skin selection module 118 may apply a threshold to a mask indicative of the skin selection, such as the grayscale mask (having values from 0-255) discussed just above. The threshold may correspond simply to a particular value (e.g., 192), such that the skin selection module 118 compares values indicated by the grayscale image to the threshold. If the comparison indicates that a pixel has a grayscale image value less than this threshold, then the skin selection module 118 identifies the pixel as not being human skin.); generating a condition color mask based on the per-pixel condition color threshold (Figs. 1 and 6. Paragraph [0053]-GUPTA discloses the skin selection module 118 may apply a threshold to a mask indicative of the skin selection, such as the grayscale mask (having values from 0-255) discussed just above.); and removing pixels, from the user skin image pixels, that exceed the per-pixel condition color threshold (Fig. 6. Paragraph [0081]-GUPTA discloses the grayscale mask is modified based on the detection to remove the false positives and false negatives. By way of example, the skin selection module 118 detects false positives and false negatives based on the average intensities computed at block 610. 
In one or more implementations, false positives are pixels that initially are identified as corresponding to skin but that do not actually correspond to skin whereas false negatives are pixels that initially are not identified as corresponding to skin but that actually do correspond to skin. The skin selection module 118 modifies the grayscale mask values of detected false positives to the value (e.g., for black) indicating the respective pixel does not match the determined skin colors.); and obtaining a user skin color from the user skin image pixels (Fig. 3. Paragraph [0049]-GUPTA discloses the color determination module 314 represents functionality to identify skin colors from a detected face. In particular, the detected face data 320 indicates pixels that correspond to the skin of detected faces and the color determination module 314 determines colors of these pixels.).

Regarding claim 5, GUPTA explicitly teaches the system of claim 1, and further explicitly teaches wherein performing color processing further comprises (Fig. 3. Paragraph [0049]-GUPTA discloses the color determination module 314 represents functionality to identify skin colors from a detected face. In particular, the detected face data 320 indicates pixels that correspond to the skin of detected faces and the color determination module 314 determines colors of these pixels.): determining an average pixel redness score for the user skin image pixels in the user skin image (Fig. 1. Paragraph [0054]-GUPTA discloses the skin selection module 118 computes contours of the portions of the digital image 302 that are indicated by the initially generated skin selection data. The skin selection module 118 may compute an average intensity of these contours. Therefore, it would have been obvious to one of ordinary skill in the art at the time the invention was made to have specified the system to compute an average intensity of the redness of the contours, since GUPTA discloses computing the intensity for a grayscale image. This would have been done in order to have a system that specifies the redness of a user’s skin.); and if the average pixel redness score is below a image redness threshold then (Figs. 1 and 3. Paragraph [0055]-GUPTA discloses the skin selection module 118 identifies portions of the grayscale image that have an average intensity less than an intensity threshold. Therefore, it would have been obvious to one of ordinary skill in the art at the time the invention was made to have specified the system to compute an average intensity of the redness of the image, since GUPTA discloses computing the intensity for a grayscale image. This would have been done in order to have a system that specifies the redness of a user’s skin.): calculating a per-pixel redness threshold (Figs. 1 and 3. Paragraph [0053]-GUPTA discloses the skin selection module 118 may apply a threshold to a mask indicative of the skin selection, such as the grayscale mask (having values from 0-255) discussed just above. The threshold may correspond simply to a particular value (e.g., 192), such that the skin selection module 118 compares values indicated by the grayscale image to the threshold. If the comparison indicates that a pixel has a grayscale image value less than this threshold, then the skin selection module 118 identifies the pixel as not being human skin. Therefore, it would have been obvious to one of ordinary skill in the art at the time the invention was made to have specified the system to compute the redness threshold, since GUPTA discloses computing the intensity for a grayscale image. This would have been done in order to have a system that specifies the redness of a user’s skin.); generating a redness mask based on the per-pixel redness threshold (Figs. 1 and 6. Paragraph [0053]-GUPTA discloses the skin selection module 118 may apply a threshold to a mask indicative of the skin selection, such as the grayscale mask (having values from 0-255) discussed just above. Therefore, it would have been obvious to one of ordinary skill in the art at the time the invention was made to have specified the system to generate a redness mask based on the redness of the image, since GUPTA discloses a grayscale mask and threshold. This would have been done in order to have a system that specifies the redness of a user’s skin.); and removing pixels, from the user skin image pixels, that exceed the per-pixel redness threshold (Fig. 6. Paragraph [0081]-GUPTA discloses the grayscale mask is modified based on the detection to remove the false positives and false negatives. By way of example, the skin selection module 118 detects false positives and false negatives based on the average intensities computed at block 610. In one or more implementations, false positives are pixels that initially are identified as corresponding to skin but that do not actually correspond to skin whereas false negatives are pixels that initially are not identified as corresponding to skin but that actually do correspond to skin. The skin selection module 118 modifies the grayscale mask values of detected false positives to the value (e.g., for black) indicating the respective pixel does not match the determined skin colors. Therefore, it would have been obvious to one of ordinary skill in the art at the time the invention was made to have specified the system to remove pixels that exceed the redness threshold, since GUPTA discloses a grayscale threshold. This would have been done in order to have a system that specifies the redness of a user’s skin.); and obtaining a user skin color from the user skin image pixels (Fig. 3. Paragraph [0049]-GUPTA discloses the color determination module 314 represents functionality to identify skin colors from a detected face. In particular, the detected face data 320 indicates pixels that correspond to the skin of detected faces and the color determination module 314 determines colors of these pixels.).

Regarding claim 13, GUPTA explicitly teaches a method for user skin color determination of a user (Fig. 3, #314 called a color determination module. Paragraph [0049]), the method comprising: obtaining, by a skin analysis assembly (Fig. 1, #114 called image processing system. Paragraph [0031]), a user skin image of the user (Fig. 3. Paragraph [0049]-GUPTA discloses the color determination module 314 represents functionality to identify skin colors from a detected face. In particular, the detected face data 320 indicates pixels that correspond to the skin of detected faces.), comprising one or more user skin image pixels (Fig. 3. Paragraph [0049]-GUPTA discloses the color determination module 314 represents functionality to identify skin colors from a detected face. In particular, the detected face data 320 indicates pixels that correspond to the skin of detected faces.); performing color processing on the user skin image to arrive at a user skin color (Fig. 3. Paragraph [0049]-GUPTA discloses the color determination module 314 represents functionality to identify skin colors from a detected face. In particular, the detected face data 320 indicates pixels that correspond to the skin of detected faces and the color determination module 314 determines colors of these pixels); and outputting a result of the color processing (Fig. 3. Paragraph [0049]-GUPTA discloses the color determination module 314 memorializes these determinations by producing the face color data 324, which indicates the skin colors identified by the color determination module 314. The face color data 324 may indicate the identified colors in a variety of ways without departing from the spirit or scope of the described techniques.
By way of example, the color determination module 314 may generate the face color data 324 as a list of colors identified in the depicted face skin (e.g., a list of different RGB values), as a range of colors identified (e.g., a range of RGB values), and so forth. The skin colors indicated by the face color data 324 are then used to identify exposed skin in the digital image.), the result comprising the user skin color (Fig. 3. Paragraph [0049]-GUPTA discloses the color determination module 314 memorializes these determinations by producing the face color data 324, which indicates the skin colors identified by the color determination module 314).

Regarding claim 16, GUPTA explicitly teaches the method of claim 13, and further explicitly teaches wherein performing color processing further comprises (Fig. 3. Paragraph [0049]-GUPTA discloses the color determination module 314 represents functionality to identify skin colors from a detected face. In particular, the detected face data 320 indicates pixels that correspond to the skin of detected faces and the color determination module 314 determines colors of these pixels.): determining an average pixel condition color score for the user skin image pixels in the user skin image (Fig. 1. Paragraph [0054]-GUPTA discloses the skin selection module 118 computes contours of the portions of the digital image 302 that are indicated by the initially generated skin selection data. The skin selection module 118 may compute an average intensity of these contours.); and if the average pixel condition color score is below an image condition color threshold then (Figs. 1 and 3. Paragraph [0055]-GUPTA discloses the skin selection module 118 identifies portions of the grayscale image that have an average intensity less than an intensity threshold): calculating a per-pixel condition color threshold (Figs. 1 and 3.
Paragraph [0053]-GUPTA discloses the skin selection module 118 may apply a threshold to a mask indicative of the skin selection, such as the grayscale mask (having values from 0-255) discussed just above. The threshold may correspond simply to a particular value (e.g., 192), such that the skin selection module 118 compares values indicated by the grayscale image to the threshold. If the comparison indicates that a pixel has a grayscale image value less than this threshold, then the skin selection module 118 identifies the pixel as not being human skin.); generating a condition color mask based on the per-pixel condition color threshold (Figs. 1 and 6. Paragraph [0053]-GUPTA discloses the skin selection module 118 may apply a threshold to a mask indicative of the skin selection, such as the grayscale mask (having values from 0-255) discussed just above.); and removing pixels, from the user skin image pixels, that exceed the per-pixel condition color threshold (Fig. 6. Paragraph [0081]-GUPTA discloses the grayscale mask is modified based on the detection to remove the false positives and false negatives. By way of example, the skin selection module 118 detects false positives and false negatives based on the average intensities computed at block 610. In one or more implementations, false positives are pixels that initially are identified as corresponding to skin but that do not actually correspond to skin whereas false negatives are pixels that initially are not identified as corresponding to skin but that actually do correspond to skin. The skin selection module 118 modifies the grayscale mask values of detected false positives to the value (e.g., for black) indicating the respective pixel does not match the determined skin colors.); and obtaining a user skin color from the user skin image pixels (Fig. 3. Paragraph [0049]-GUPTA discloses the color determination module 314 represents functionality to identify skin colors from a detected face. 
In particular, the detected face data 320 indicates pixels that correspond to the skin of detected faces and the color determination module 314 determines colors of these pixels.).

Regarding claim 17, GUPTA explicitly teaches the method of claim 13, and further explicitly teaches wherein the color processing further comprises (Fig. 3. Paragraph [0049]-GUPTA discloses the color determination module 314 represents functionality to identify skin colors from a detected face. In particular, the detected face data 320 indicates pixels that correspond to the skin of detected faces and the color determination module 314 determines colors of these pixels.): determining an average pixel redness score for the user skin image pixels in the user skin image (Fig. 1. Paragraph [0054]-GUPTA discloses the skin selection module 118 computes contours of the portions of the digital image 302 that are indicated by the initially generated skin selection data. The skin selection module 118 may compute an average intensity of these contours. Therefore, it would have been obvious to one of ordinary skill in the art at the time the invention was made to have specified the system to compute an average intensity of the redness of the contours, since GUPTA discloses computing the intensity for a grayscale image. This would have been done in order to have a system that specifies the redness of a user’s skin.); and if the average pixel redness score is below a image redness threshold then (Figs. 1 and 3. Paragraph [0055]-GUPTA discloses the skin selection module 118 identifies portions of the grayscale image that have an average intensity less than an intensity threshold. Therefore, it would have been obvious to one of ordinary skill in the art at the time the invention was made to have specified the system to compute an average intensity of the redness of the image, since GUPTA discloses computing the intensity for a grayscale image. This would have been done in order to have a system that specifies the redness of a user’s skin.): calculating a per-pixel redness threshold (Figs. 1 and 3. Paragraph [0053]-GUPTA discloses the skin selection module 118 may apply a threshold to a mask indicative of the skin selection, such as the grayscale mask (having values from 0-255) discussed just above. The threshold may correspond simply to a particular value (e.g., 192), such that the skin selection module 118 compares values indicated by the grayscale image to the threshold. If the comparison indicates that a pixel has a grayscale image value less than this threshold, then the skin selection module 118 identifies the pixel as not being human skin. Therefore, it would have been obvious to one of ordinary skill in the art at the time the invention was made to have specified the system to compute the redness threshold, since GUPTA discloses computing the intensity for a grayscale image. This would have been done in order to have a system that specifies the redness of a user’s skin.); generating a redness mask based on the per-pixel redness threshold (Figs. 1 and 6. Paragraph [0053]-GUPTA discloses the skin selection module 118 may apply a threshold to a mask indicative of the skin selection, such as the grayscale mask (having values from 0-255) discussed just above. Therefore, it would have been obvious to one of ordinary skill in the art at the time the invention was made to have specified the system to generate a redness mask based on the redness of the image, since GUPTA discloses a grayscale mask and threshold. This would have been done in order to have a system that specifies the redness of a user’s skin.); and removing pixels, from the user skin image pixels, that exceed the per-pixel redness threshold (Fig. 6. Paragraph [0081]-GUPTA discloses the grayscale mask is modified based on the detection to remove the false positives and false negatives. By way of example, the skin selection module 118 detects false positives and false negatives based on the average intensities computed at block 610. In one or more implementations, false positives are pixels that initially are identified as corresponding to skin but that do not actually correspond to skin whereas false negatives are pixels that initially are not identified as corresponding to skin but that actually do correspond to skin. The skin selection module 118 modifies the grayscale mask values of detected false positives to the value (e.g., for black) indicating the respective pixel does not match the determined skin colors. Therefore, it would have been obvious to one of ordinary skill in the art at the time the invention was made to have specified the system to remove pixels that exceed the redness threshold, since GUPTA discloses a grayscale threshold. This would have been done in order to have a system that specifies the redness of a user’s skin.); and obtaining a user skin color from the user skin image pixels (Fig. 3. Paragraph [0049]-GUPTA discloses the color determination module 314 represents functionality to identify skin colors from a detected face. In particular, the detected face data 320 indicates pixels that correspond to the skin of detected faces and the color determination module 314 determines colors of these pixels.).

Regarding claim 25, GUPTA explicitly teaches a system for user skin color determination of a user (Fig. 3, #314 called a color determination module. Paragraph [0049]), the system comprising: a skin analysis assembly (Fig. 3, #314 called a color determination module. Paragraph [0049]-GUPTA discloses the color determination module 314 represents functionality to identify skin colors from a detected face.), configured to: obtain a user skin image of the user (Fig. 3.
Paragraph [0040]-GUPTA discloses the object recognition module 116 and the skin selection module 118 in this example are incorporated as part of a system to generate data effective to select a portion of a digital image 302 that corresponds to exposed skin of a person depicted in the image.), comprising one or more user skin image pixels (Fig. 3. Paragraph [0049]-GUPTA discloses the color determination module 314 represents functionality to identify skin colors from a detected face. In particular, the detected face data 320 indicates pixels that correspond to the skin of detected faces.); perform color processing on the user skin image to arrive at a user skin color, wherein the color processing further comprises (Fig. 3. Paragraph [0049]-GUPTA discloses the color determination module 314 represents functionality to identify skin colors from a detected face. In particular, the detected face data 320 indicates pixels that correspond to the skin of detected faces and the color determination module 314 determines colors of these pixels.): determining an average pixel redness score for the user skin image pixels in the user skin image (Fig. 1. Paragraph [0054]-GUPTA discloses the skin selection module 118 computes contours of the portions of the digital image 302 that are indicated by the initially generated skin selection data. The skin selection module 118 may compute an average intensity of these contours. Therefore, it would have been obvious to one of ordinary skill of the art at the time the invention was made to have specified the system to compute average intensity of the redness of the contours, since GUPTA discloses computing the intensity for a greyscale image. Thus, in order to have a system to specify the redness of a user’s skin.); and if the average pixel redness score is below a image redness threshold then (Figs. 1 and 3. 
Paragraph [0055]-GUPTA discloses the skin selection module 118 identifies portions of the grayscale image that have an average intensity less than an intensity threshold. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have specified the system to compute the average intensity of the redness of the image, since GUPTA discloses computing the intensity for a grayscale image. Thus, the combination provides a system that specifies the redness of a user’s skin.): calculating a per-pixel redness threshold (Figs. 1 and 3. Paragraph [0053]-GUPTA discloses the skin selection module 118 may apply a threshold to a mask indicative of the skin selection, such as the grayscale mask (having values from 0-255) discussed just above. The threshold may correspond simply to a particular value (e.g., 192), such that the skin selection module 118 compares values indicated by the grayscale image to the threshold. If the comparison indicates that a pixel has a grayscale image value less than this threshold, then the skin selection module 118 identifies the pixel as not being human skin. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have specified the system to compute the redness threshold, since GUPTA discloses computing the intensity for a grayscale image. Thus, the combination provides a system that specifies the redness of a user’s skin.); generating a redness mask based on the per-pixel redness threshold (Figs. 1 and 6. Paragraph [0053]-GUPTA discloses the skin selection module 118 may apply a threshold to a mask indicative of the skin selection, such as the grayscale mask (having values from 0-255) discussed just above. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have specified the system to generate a redness mask based on the redness of the image, since GUPTA discloses a grayscale mask and threshold.
Thus, the combination provides a system that specifies the redness of a user’s skin.); and removing pixels, from the user skin image pixels, that exceed the per-pixel redness threshold (Fig. 6. Paragraph [0081]-GUPTA discloses the grayscale mask is modified based on the detection to remove the false positives and false negatives. By way of example, the skin selection module 118 detects false positives and false negatives based on the average intensities computed at block 610. In one or more implementations, false positives are pixels that initially are identified as corresponding to skin but that do not actually correspond to skin whereas false negatives are pixels that initially are not identified as corresponding to skin but that actually do correspond to skin. The skin selection module 118 modifies the grayscale mask values of detected false positives to the value (e.g., for black) indicating the respective pixel does not match the determined skin colors. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have specified the system to remove pixels that exceed the redness threshold, since GUPTA discloses a grayscale threshold. Thus, the combination provides a system that specifies the redness of a user’s skin.); and calculating a user skin color from the user skin image pixels (Fig. 3. Paragraph [0049]-GUPTA discloses the color determination module 314 represents functionality to identify skin colors from a detected face. In particular, the detected face data 320 indicates pixels that correspond to the skin of detected faces and the color determination module 314 determines colors of these pixels. Further in Paragraph [0049]-GUPTA discloses the color determination module 314 memorializes these determinations by producing the face color data 324, which indicates the skin colors identified by the color determination module 314.
The face color data 324 may indicate the identified colors in a variety of ways without departing from the spirit or scope of the described techniques.); and output a result of the color processing (Fig. 3. Paragraph [0049]-GUPTA discloses the color determination module 314 memorializes these determinations by producing the face color data 324, which indicates the skin colors identified by the color determination module 314. The face color data 324 may indicate the identified colors in a variety of ways without departing from the spirit or scope of the described techniques. By way of example, the color determination module 314 may generate the face color data 324 as a list of colors identified in the depicted face skin (e.g., a list of different RGB values), as a range of colors identified (e.g., a range of RGB values), and so forth. The skin colors indicated by the face color data 324 are then used to identify exposed skin in the digital image.), the result comprising the user skin color (Fig. 3. Paragraph [0049]-GUPTA discloses the color determination module 314 memorializes these determinations by producing the face color data 324, which indicates the skin colors identified by the color determination module 314.). Regarding claim 28, GUPTA explicitly teaches a method for user skin color determination of a user (Fig. 3, #314 called a color determination module. Paragraph [0049]), the method comprising: obtaining a user skin image of the user (Fig. 3. Paragraph [0040]-GUPTA discloses the object recognition module 116 and the skin selection module 118 in this example are incorporated as part of a system to generate data effective to select a portion of a digital image 302 that corresponds to exposed skin of a person depicted in the image.), comprising one or more user skin image pixels (Fig. 3. Paragraph [0049]-GUPTA discloses the color determination module 314 represents functionality to identify skin colors from a detected face. 
In particular, the detected face data 320 indicates pixels that correspond to the skin of detected faces.); performing color processing on the user skin image to determine a user skin color (Fig. 3. Paragraph [0049]-GUPTA discloses the color determination module 314 represents functionality to identify skin colors from a detected face. In particular, the detected face data 320 indicates pixels that correspond to the skin of detected faces and the color determination module 314 determines colors of these pixels); and outputting the result of the color processing (Fig. 3. Paragraph [0049]-GUPTA discloses the color determination module 314 memorializes these determinations by producing the face color data 324, which indicates the skin colors identified by the color determination module 314. The face color data 324 may indicate the identified colors in a variety of ways without departing from the spirit or scope of the described techniques. By way of example, the color determination module 314 may generate the face color data 324 as a list of colors identified in the depicted face skin (e.g., a list of different RGB values), as a range of colors identified (e.g., a range of RGB values), and so forth. The skin colors indicated by the face color data 324 are then used to identify exposed skin in the digital image.), the result comprising the user skin color (Fig. 3. Paragraph [0049]-GUPTA discloses the color determination module 314 memorializes these determinations by producing the face color data 324, which indicates the skin colors identified by the color determination module 314). Regarding claim 29, GUPTA explicitly teaches the method of claim 28, and further explicitly teaches further comprising determining an average pixel redness score for user skin image pixels in the user skin image (Fig. 1.
Paragraph [0054]-GUPTA discloses the skin selection module 118 computes contours of the portions of the digital image 302 that are indicated by the initially generated skin selection data. The skin selection module 118 may compute an average intensity of these contours. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have specified the system to compute the average intensity of the redness of the contours, since GUPTA discloses computing the intensity for a grayscale image. Thus, the combination provides a system that specifies the redness of a user’s skin.); and if the average pixel redness score is below an image redness threshold then (Figs. 1 and 3. Paragraph [0055]-GUPTA discloses the skin selection module 118 identifies portions of the grayscale image that have an average intensity less than an intensity threshold. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have specified the system to compute the average intensity of the redness of the image, since GUPTA discloses computing the intensity for a grayscale image. Thus, the combination provides a system that specifies the redness of a user’s skin.): calculating a per-pixel redness threshold (Figs. 1 and 3. Paragraph [0053]-GUPTA discloses the skin selection module 118 may apply a threshold to a mask indicative of the skin selection, such as the grayscale mask (having values from 0-255) discussed just above. The threshold may correspond simply to a particular value (e.g., 192), such that the skin selection module 118 compares values indicated by the grayscale image to the threshold. If the comparison indicates that a pixel has a grayscale image value less than this threshold, then the skin selection module 118 identifies the pixel as not being human skin.
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have specified the system to compute the redness threshold, since GUPTA discloses computing the intensity for a grayscale image. Thus, the combination provides a system that specifies the redness of a user’s skin.); generating a redness mask based on the per-pixel redness threshold (Figs. 1 and 6. Paragraph [0053]-GUPTA discloses the skin selection module 118 may apply a threshold to a mask indicative of the skin selection, such as the grayscale mask (having values from 0-255) discussed just above. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have specified the system to generate a redness mask based on the redness of the image, since GUPTA discloses a grayscale mask and threshold. Thus, the combination provides a system that specifies the redness of a user’s skin.); and removing pixels, from the user skin image pixels, that exceed the per-pixel redness threshold (Fig. 6. Paragraph [0081]-GUPTA discloses the grayscale mask is modified based on the detection to remove the false positives and false negatives. By way of example, the skin selection module 118 detects false positives and false negatives based on the average intensities computed at block 610. In one or more implementations, false positives are pixels that initially are identified as corresponding to skin but that do not actually correspond to skin whereas false negatives are pixels that initially are not identified as corresponding to skin but that actually do correspond to skin. The skin selection module 118 modifies the grayscale mask values of detected false positives to the value (e.g., for black) indicating the respective pixel does not match the determined skin colors.
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have specified the system to remove pixels that exceed the redness threshold, since GUPTA discloses a grayscale threshold. Thus, the combination provides a system that specifies the redness of a user’s skin.); and calculating a user skin color from the user skin image pixels (Fig. 3. Paragraph [0049]-GUPTA discloses the color determination module 314 represents functionality to identify skin colors from a detected face. In particular, the detected face data 320 indicates pixels that correspond to the skin of detected faces and the color determination module 314 determines colors of these pixels.). Claim Rejections - 35 USC § 103 In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status. The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made. Claims 2-3 and 14-15 are rejected under 35 U.S.C. 103 as being unpatentable over GUPTA et al. (US 20190180083 A1), hereinafter referenced as GUPTA, in view of RATTNER et al.
(US 20190125249 A1), hereinafter referenced as RATTNER. Regarding claim 2, GUPTA explicitly teaches the system of claim 1, but fails to explicitly teach wherein the skin analysis assembly comprises a skin analysis device attached to a mobile device. However, RATTNER explicitly teaches wherein the skin analysis assembly comprises a skin analysis device attached to a mobile device (Fig. 1. Paragraph [0218]-RATTNER discloses Electronic device: a device, having a camera, onto which a skin analysis device can be attached, that may preferably be mobile (such as mobile phones and tablets), exemplary electronic devices including smart phones, tablets, digital cameras, personal computers, televisions and the like.). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of GUPTA of a system for user skin color determination of a user, the system comprising: a skin analysis assembly, configured to: obtain a user skin image of the user, comprising one or more user skin image pixels; perform color processing on the user skin image to arrive at a user skin color; and output a result of the color processing, the result comprising the user skin color, with the teachings of RATTNER of wherein the skin analysis assembly comprises a skin analysis device attached to a mobile device. The combination yields GUPTA’s skin analysis system wherein the skin analysis assembly comprises a skin analysis device attached to a mobile device. The motivation behind the modification would have been to obtain a skin analysis and color determination system that enhances the accuracy of detecting skin and determining skin color in an image.
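For reference, the grayscale-mask thresholding GUPTA describes at Paragraphs [0053]-[0055] (a 0-255 mask compared against a threshold such as 192, with sub-threshold pixels marked as non-skin), which the rejections above map onto the claimed redness processing, can be sketched as follows. The redness metric shown is an illustrative assumption standing in for the claimed analog; GUPTA discloses only the grayscale-intensity version.

```python
import numpy as np

def threshold_skin_mask(gray_mask: np.ndarray, threshold: int = 192) -> np.ndarray:
    """GUPTA-style thresholding of a 0-255 grayscale skin mask: pixels
    whose mask value is below the threshold are marked non-skin (0)."""
    out = gray_mask.copy()
    out[gray_mask < threshold] = 0
    return out

def average_redness(image_rgb: np.ndarray, skin_mask: np.ndarray) -> float:
    """Average per-pixel redness over masked skin pixels. The metric
    (R minus the mean of G and B) is an illustrative assumption, not
    something GUPTA discloses."""
    skin = image_rgb[skin_mask > 0].astype(float)
    if skin.size == 0:
        return 0.0
    redness = skin[:, 0] - (skin[:, 1] + skin[:, 2]) / 2.0
    return float(redness.mean())
```

A pixel that survives the mask contributes to the average; comparing that average against an image-level threshold mirrors the claimed "average pixel redness score ... below an image redness threshold" step.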
Both GUPTA and RATTNER are directed to skin analysis systems; GUPTA notes that human skin is an example of digital image content that client device users may desire to select for performing image operations, while RATTNER notes a need in the art for an improved method and system capable of skin analysis using electronic devices such as smartphones. Please see GUPTA et al. (US 20190180083 A1), Paragraph [0014], and RATTNER et al. (US 20190125249 A1), Paragraph [0009]. Regarding claim 3, GUPTA in view of RATTNER explicitly teaches the system of claim 2, but GUPTA fails to explicitly teach wherein the user skin image is at a magnification of not less than 10x. However, RATTNER explicitly teaches wherein the user skin image is at a magnification of not less than 10x (Fig. 1. Paragraph [0270]-RATTNER discloses lens 34 may be a magnification lens that has a magnification factor as appropriate for the skin surface being imaged (for example a 30× lens 34 for skin analysis and a different magnification for hair analysis).). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of GUPTA in view of RATTNER of a system for user skin color determination of a user, the system comprising: a skin analysis assembly, configured to: obtain a user skin image of the user, comprising one or more user skin image pixels; perform color processing on the user skin image to arrive at a user skin color; and output a result of the color processing, the result comprising the user skin color, with the teachings of RATTNER of wherein the user skin image is at a magnification of not less than 10x. The combination yields GUPTA’s skin analysis system wherein the user skin image is at a magnification of not less than 10x. The motivation behind the modification would have been to obtain a skin analysis and color determination system that enhances the accuracy of detecting skin and determining skin color in an image.
Both GUPTA and RATTNER are directed to skin analysis systems; GUPTA notes that human skin is an example of digital image content that client device users may desire to select for performing image operations, while RATTNER notes a need in the art for an improved method and system capable of skin analysis using electronic devices such as smartphones. Please see GUPTA et al. (US 20190180083 A1), Paragraph [0014], and RATTNER et al. (US 20190125249 A1), Paragraph [0009]. Regarding claim 14, GUPTA explicitly teaches the method of claim 13, but fails to explicitly teach wherein the skin analysis assembly comprises a skin analysis device attached to a mobile device. However, RATTNER explicitly teaches wherein the skin analysis assembly comprises a skin analysis device attached to a mobile device (Fig. 1. Paragraph [0218]-RATTNER discloses Electronic device: a device, having a camera, onto which a skin analysis device can be attached, that may preferably be mobile (such as mobile phones and tablets), exemplary electronic devices including smart phones, tablets, digital cameras, personal computers, televisions and the like.). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of GUPTA of a method for user skin color determination of a user, the method comprising: obtaining, by a skin analysis assembly, a user skin image of the user, comprising one or more user skin image pixels, with the teachings of RATTNER of wherein the skin analysis assembly comprises a skin analysis device attached to a mobile device. The combination yields GUPTA’s skin analysis method wherein the skin analysis assembly comprises a skin analysis device attached to a mobile device. The motivation behind the modification would have been to obtain a skin analysis and color determination system that enhances the accuracy of detecting skin and determining skin color in an image.
Both GUPTA and RATTNER are directed to skin analysis systems; GUPTA notes that human skin is an example of digital image content that client device users may desire to select for performing image operations, while RATTNER notes a need in the art for an improved method and system capable of skin analysis using electronic devices such as smartphones. Please see GUPTA et al. (US 20190180083 A1), Paragraph [0014], and RATTNER et al. (US 20190125249 A1), Paragraph [0009]. Regarding claim 15, GUPTA in view of RATTNER explicitly teaches the method of claim 14, but GUPTA fails to explicitly teach wherein the user skin image is at a magnification of not less than 10x. However, RATTNER explicitly teaches wherein the user skin image is at a magnification of not less than 10x (Fig. 1. Paragraph [0270]-RATTNER discloses lens 34 may be a magnification lens that has a magnification factor as appropriate for the skin surface being imaged (for example a 30× lens 34 for skin analysis and a different magnification for hair analysis).). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of GUPTA of a method for user skin color determination of a user, the method comprising: obtaining, by a skin analysis assembly, a user skin image of the user, comprising one or more user skin image pixels, with the teachings of RATTNER of wherein the user skin image is at a magnification of not less than 10x. The combination yields GUPTA’s skin analysis method wherein the user skin image is at a magnification of not less than 10x. The motivation behind the modification would have been to obtain a skin analysis and color determination system that enhances the accuracy of detecting skin and determining skin color in an image.
Both GUPTA and RATTNER are directed to skin analysis systems; GUPTA notes that human skin is an example of digital image content that client device users may desire to select for performing image operations, while RATTNER notes a need in the art for an improved method and system capable of skin analysis using electronic devices such as smartphones. Please see GUPTA et al. (US 20190180083 A1), Paragraph [0014], and RATTNER et al. (US 20190125249 A1), Paragraph [0009]. Claims 9, 12, 21, 24, 26, 30 are rejected under 35 U.S.C. 103 as being unpatentable over GUPTA et al. (US 20190180083 A1), hereinafter referenced as GUPTA, in view of BHATTI et al. (US 20070071314 A1), hereinafter referenced as BHATTI. Regarding claim 9, GUPTA explicitly teaches the system of claim 1, and further explicitly teaches wherein the system is further configured to (Fig. 3, #314 called a color determination module. Paragraph [0049]): GUPTA fails to explicitly teach apply a color transposition to the user skin color, based on a transposition between the system and an alternative skin color system, and wherein the result further comprises a transposed user skin color. However, BHATTI explicitly teaches apply a color transposition to the user skin color (Fig. 2. Paragraph [0035]-BHATTI discloses image analysis system 205 is for generating a skin color estimate 413 of subject 203 based upon an analysis of image 202. Further in Paragraph [0036]-BHATTI discloses image analysis system 205 can determine a transformation, or "color correction function," which accounts for the discrepancy between the characteristics of imaged reference color set 204 and control reference color set 208.), based on a transposition between the system and an alternative skin color system (Fig. 2.
Paragraph [0036]-BHATTI discloses image analysis system 205 can determine a transformation, or "color correction function," which accounts for the discrepancy between the characteristics of imaged reference color set 204 and control reference color set 208.), and wherein the result further comprises a transposed user skin color (Fig. 2. Paragraph [0042]-BHATTI discloses the determined color correction function is applied to the color description of one or more of the selected skin pixels located in the image.). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of GUPTA of a system for user skin color determination of a user, the system comprising: a skin analysis assembly, configured to: obtain a user skin image of the user, comprising one or more user skin image pixels; perform color processing on the user skin image to arrive at a user skin color; and output a result of the color processing, the result comprising the user skin color, with the teachings of BHATTI of applying a color transposition to the user skin color, based on a transposition between the system and an alternative skin color system, and wherein the result further comprises a transposed user skin color. The combination yields GUPTA’s skin analysis system applying a color transposition to the user skin color, based on a transposition between the system and an alternative skin color system, wherein the result further comprises a transposed user skin color. The motivation behind the modification would have been to obtain a skin analysis and color determination system that enhances the accuracy of detecting skin and determining skin color in an image.
Both GUPTA and BHATTI are directed to skin analysis systems; GUPTA notes that human skin is an example of digital image content that client device users may desire to select for performing image operations, while BHATTI notes that the customer is presented with a smaller range of products from which to choose, but which are more suited for that customer based upon her needs. Please see GUPTA et al. (US 20190180083 A1), Paragraph [0014], and BHATTI et al. (US 20070071314 A1), Paragraph [0007]. Regarding claim 12, GUPTA explicitly teaches the system of claim 1, and further explicitly teaches wherein the system is further configured to (Fig. 3, #314 called a color determination module. Paragraph [0049]): GUPTA fails to explicitly teach calibrate the skin analysis assembly, by taking one or more images of a calibration card, to obtain a calibration transposition that is applied to the user skin image. However, BHATTI explicitly teaches calibrate the skin analysis assembly (Fig. 2. Paragraph [0036]-BHATTI discloses by comparing the characteristics of control reference color set 208 with the characteristics of the reference color set 204 captured in the image, image analysis system 205 can determine a transformation, or "color correction function," which accounts for the discrepancy between the characteristics of imaged reference color set 204 and control reference color set 208.), by taking one or more images of a calibration card (Fig. 2. Paragraph [0034]-BHATTI discloses system 200 comprises an image capture device 201 for capturing an image 202 comprising a subject (e.g., 203) and an imaged reference color set 204 (wherein the imaged reference color set is a calibration card).), to obtain a calibration transposition that is applied to the user skin image (Fig. 2.
Paragraph [0036]-BHATTI discloses image analysis system 205 can determine a transformation, or "color correction function," which accounts for the discrepancy between the characteristics of imaged reference color set 204 and control reference color set 208.). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of GUPTA of a system for user skin color determination of a user, the system comprising: a skin analysis assembly, configured to: obtain a user skin image of the user, comprising one or more user skin image pixels; perform color processing on the user skin image to arrive at a user skin color; and output a result of the color processing, the result comprising the user skin color, with the teachings of BHATTI of calibrating the skin analysis assembly, by taking one or more images of a calibration card, to obtain a calibration transposition that is applied to the user skin image. The combination yields GUPTA’s skin analysis system calibrating the skin analysis assembly, by taking one or more images of a calibration card, to obtain a calibration transposition that is applied to the user skin image. The motivation behind the modification would have been to obtain a skin analysis and color determination system that enhances the accuracy of detecting skin and determining skin color in an image. Both GUPTA and BHATTI are directed to skin analysis systems; GUPTA notes that human skin is an example of digital image content that client device users may desire to select for performing image operations, while BHATTI notes that the customer is presented with a smaller range of products from which to choose, but which are more suited for that customer based upon her needs. Please see GUPTA et al. (US 20190180083 A1), Paragraph [0014], and BHATTI et al. (US 20070071314 A1), Paragraph [0007].
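The BHATTI calibration flow relied on above (Paragraphs [0034]-[0042]: image a reference color set alongside the subject, derive a "color correction function" from the discrepancy against the control reference set, then apply it to selected skin pixels) can be sketched as follows. The linear 3x3 model fitted by least squares is an illustrative assumption; BHATTI does not limit the correction function to any particular form.

```python
import numpy as np

def fit_color_correction(imaged_ref: np.ndarray, control_ref: np.ndarray) -> np.ndarray:
    """Fit a 3x3 matrix M such that imaged_ref @ M approximates
    control_ref (both N x 3 RGB), accounting for the discrepancy
    between the imaged and control reference color sets."""
    M, *_ = np.linalg.lstsq(imaged_ref.astype(float),
                            control_ref.astype(float), rcond=None)
    return M

def apply_color_correction(skin_pixels: np.ndarray, M: np.ndarray) -> np.ndarray:
    """Apply the fitted correction to selected skin pixels (N x 3)."""
    return skin_pixels.astype(float) @ M
```

Fitting against a known calibration card and then applying the same transform to the user skin image is what the rejection reads onto the claimed "calibration transposition."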
Regarding claim 21, GUPTA explicitly teaches the method of claim 13, and further explicitly teaches the method further comprising (Fig. 3, #314 called a color determination module. Paragraph [0049]): GUPTA fails to explicitly teach applying a color transposition to the user skin color, based on a transposition between the system and an alternative skin color system, and wherein the result further comprises a transposed user skin color. However, BHATTI explicitly teaches applying a color transposition to the user skin color (Fig. 2. Paragraph [0035]-BHATTI discloses image analysis system 205 is for generating a skin color estimate 413 of subject 203 based upon an analysis of image 202. Further in Paragraph [0036]-BHATTI discloses image analysis system 205 can determine a transformation, or "color correction function," which accounts for the discrepancy between the characteristics of imaged reference color set 204 and control reference color set 208.), based on a transposition between the system and an alternative skin color system (Fig. 2. Paragraph [0036]-BHATTI discloses image analysis system 205 can determine a transformation, or "color correction function," which accounts for the discrepancy between the characteristics of imaged reference color set 204 and control reference color set 208.), and wherein the result further comprises a transposed user skin color (Paragraph [0042]-BHATTI discloses the determined color correction function is applied to the color description of one or more of the selected skin pixels located in the image.).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of GUPTA of a method for user skin color determination of a user, the method comprising: obtaining, by a skin analysis assembly, a user skin image of the user, comprising one or more user skin image pixels; performing color processing on the user skin image to arrive at a user skin color; and outputting a result of the color processing, the result comprising the user skin color, with the teachings of BHATTI of applying a color transposition to the user skin color, based on a transposition between the system and an alternative skin color system, and wherein the result further comprises a transposed user skin color. The combination yields GUPTA’s skin analysis method applying a color transposition to the user skin color, based on a transposition between the system and an alternative skin color system, wherein the result further comprises a transposed user skin color. The motivation behind the modification would have been to obtain a skin analysis and color determination system that enhances the accuracy of detecting skin and determining skin color in an image. Both GUPTA and BHATTI are directed to skin analysis systems; GUPTA notes that human skin is an example of digital image content that client device users may desire to select for performing image operations, while BHATTI notes that the customer is presented with a smaller range of products from which to choose, but which are more suited for that customer based upon her needs. Please see GUPTA et al. (US 20190180083 A1), Paragraph [0014], and BHATTI et al. (US 20070071314 A1), Paragraph [0007]. Regarding claim 24, GUPTA explicitly teaches the method of claim 13, and further explicitly teaches the method further comprising (Fig. 3, #314 called a color determination module.
Paragraph [0049]): GUPTA fails to explicitly teach calibrating the skin analysis assembly, by taking one or more images of a calibration card, to obtain a calibration transposition that is applied to the user skin image. However, BHATTI explicitly teaches calibrating the skin analysis assembly (Fig. 2. Paragraph [0036]-BHATTI discloses by comparing the characteristics of control reference color set 208 with the characteristics of the reference color set 204 captured in the image, image analysis system 205 can determine a transformation, or "color correction function," which accounts for the discrepancy between the characteristics of imaged reference color set 204 and control reference color set 208.), by taking one or more images of a calibration card (Fig. 2. Paragraph [0034]-BHATTI discloses system 200 comprises an image capture device 201 for capturing an image 202 comprising a subject (e.g., 203) and an imaged reference color set 204 (wherein the imaged reference color set is a calibration card).), to obtain a calibration transposition that is applied to the user skin image (Paragraph [0036]-BHATTI discloses image analysis system 205 can determine a transformation, or "color correction function," which accounts for the discrepancy between the characteristics of imaged reference color set 204 and control reference color set 208.).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of GUPTA of a method for user skin color determination of a user, the method comprising: obtaining, by a skin analysis assembly, a user skin image of the user, comprising one or more user skin image pixels; performing color processing on the user skin image to arrive at a user skin color; and outputting a result of the color processing, the result comprising the user skin color, with the teachings of BHATTI of calibrating the skin analysis assembly, by taking one or more images of a calibration card, to obtain a calibration transposition that is applied to the user skin image. The combination would have GUPTA’s skin analysis system calibrate the skin analysis assembly, by taking one or more images of a calibration card, to obtain a calibration transposition that is applied to the user skin image. The motivation behind the modification would have been to obtain a skin analysis and color determination system that enhances the accuracy of detecting skin and determining skin color in an image. Both GUPTA and BHATTI are skin analysis systems: in GUPTA, human skin is an example of digital image content that client device users may desire to select for performing image operations, while in BHATTI, the customer is presented with a smaller range of products from which to choose, but which are more suited to that customer based upon her needs. Please see GUPTA et al. (US 20190180083 A1), Paragraph [0014], and BHATTI et al. (US 20070071314 A1), Paragraph [0007]. Regarding claim 26, GUPTA explicitly teaches the system of claim 25; GUPTA further explicitly teaches wherein the system is further configured to (Fig. 3, #314 called a color determination module. 
Paragraph [0049]): GUPTA fails to explicitly teach perform a color transposition to the user skin color, based on a transposition between the system and an alternative skin color system, and wherein the result further comprises a transposed user skin color. However, BHATTI explicitly teaches perform a color transposition to the user skin color (Fig. 2. Paragraph [0035]-BHATTI discloses image analysis system 205 is for generating a skin color estimate 413 of subject 203 based upon an analysis of image 202. Further in Paragraph [0036]-BHATTI discloses image analysis system 205 can determine a transformation, or "color correction function," which accounts for the discrepancy between the characteristics of imaged reference color set 204 and control reference color set 208.), based on a transposition between the system and an alternative skin color system (Paragraph [0036]-BHATTI discloses image analysis system 205 can determine a transformation, or "color correction function," which accounts for the discrepancy between the characteristics of imaged reference color set 204 and control reference color set 208.), and wherein the result further comprises a transposed user skin color (Paragraph [0042]-BHATTI discloses the determined color correction function is applied to the color description of one or more of the selected skin pixels located in the image.). 
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of GUPTA of a system for user skin color determination of a user, the system comprising: a skin analysis assembly, configured to: obtain a user skin image of the user, comprising one or more user skin image pixels; perform color processing on the user skin image to arrive at a user skin color, wherein the color processing further comprises: with the teachings of BHATTI of perform a color transposition to the user skin color, based on a transposition between the system and an alternative skin color system, and wherein the result further comprises a transposed user skin color. The combination would have GUPTA’s skin analysis system perform a color transposition to the user skin color, based on a transposition between the system and an alternative skin color system, such that the result further comprises a transposed user skin color. The motivation behind the modification would have been to obtain a skin analysis and color determination system that enhances the accuracy of detecting skin and determining skin color in an image. Both GUPTA and BHATTI are skin analysis systems: in GUPTA, human skin is an example of digital image content that client device users may desire to select for performing image operations, while in BHATTI, the customer is presented with a smaller range of products from which to choose, but which are more suited to that customer based upon her needs. Please see GUPTA et al. (US 20190180083 A1), Paragraph [0014], and BHATTI et al. (US 20070071314 A1), Paragraph [0007]. 
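As an illustration of the claimed "color transposition ... to an alternative skin color system" (again, not from the record), a determined user skin color can be mapped to the nearest entry of a second, discrete shade system. The palette names and RGB values below are entirely hypothetical:

```python
# Hypothetical alternative skin color system: named shades with RGB anchors.
ALT_SYSTEM = {
    "shade-1": (246, 228, 210),
    "shade-3": (210, 170, 140),
    "shade-5": (160, 115, 85),
    "shade-7": (95, 60, 40),
}


def transpose(user_color, palette):
    """Return the palette label whose anchor color is nearest (squared RGB distance)."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(palette, key=lambda name: dist2(user_color, palette[name]))


label = transpose((158, 118, 88), ALT_SYSTEM)
```

Nearest-neighbor matching in RGB is only one possible transposition; a perceptual color space (e.g., CIELAB) would usually be preferred, but the structure of the mapping is the same.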
Regarding claim 30, GUPTA explicitly teaches the method of claim 29; GUPTA fails to explicitly teach further comprising: performing a color transposition to the user skin color, based on a transposition between the system and an alternative skin color system, and wherein the result further comprises a transposed user skin color. However, BHATTI explicitly teaches performing a color transposition to the user skin color (Paragraph [0035]-BHATTI discloses image analysis system 205 is for generating a skin color estimate 413 of subject 203 based upon an analysis of image 202. Further in Paragraph [0036]-BHATTI discloses image analysis system 205 can determine a transformation, or "color correction function," which accounts for the discrepancy between the characteristics of imaged reference color set 204 and control reference color set 208.), based on a transposition between the system and an alternative skin color system (Paragraph [0036]-BHATTI discloses image analysis system 205 can determine a transformation, or "color correction function," which accounts for the discrepancy between the characteristics of imaged reference color set 204 and control reference color set 208.), and wherein the result further comprises a transposed user skin color (Paragraph [0042]-BHATTI discloses the determined color correction function is applied to the color description of one or more of the selected skin pixels located in the image.). 
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of GUPTA of a method for user skin color determination of a user, the method comprising: obtaining a user skin image of the user, comprising one or more user skin image pixels; performing color processing on the user skin image to determine a user skin color; and outputting the result of the color processing, the result comprising the user skin color, with the teachings of BHATTI of performing a color transposition to the user skin color, based on a transposition between the system and an alternative skin color system, and wherein the result further comprises a transposed user skin color. The combination would have GUPTA’s skin analysis method perform a color transposition to the user skin color, based on a transposition between the system and an alternative skin color system, such that the result further comprises a transposed user skin color. The motivation behind the modification would have been to obtain a skin analysis and color determination system that enhances the accuracy of detecting skin and determining skin color in an image. Both GUPTA and BHATTI are skin analysis systems: in GUPTA, human skin is an example of digital image content that client device users may desire to select for performing image operations, while in BHATTI, the customer is presented with a smaller range of products from which to choose, but which are more suited to that customer based upon her needs. Please see GUPTA et al. (US 20190180083 A1), Paragraph [0014], and BHATTI et al. (US 20070071314 A1), Paragraph [0007]. Claims 10-11, 22-23, 27, and 31 are rejected under 35 U.S.C. 103 as being unpatentable over GUPTA et al. (US 20190180083 A1), hereinafter referenced as GUPTA, in view of BHATTI et al. (US 20070071314 A1), hereinafter referenced as BHATTI, and further in view of HARVILLE et al. 
(US 20070058858 A1), hereinafter referenced as HARVILLE. Regarding claim 10, GUPTA in view of BHATTI explicitly teach the system of claim 9; GUPTA further explicitly teaches wherein the system is further configured to (Fig. 3, #314 called a color determination module. Paragraph [0049]): GUPTA fails to explicitly teach based on one or more of the user skin color and the transposed user skin color. However, BHATTI explicitly teaches based on one or more of the user skin color and the transposed user skin color (Fig. 2. Paragraph [0036]-BHATTI discloses image analysis system 205 can determine a transformation, or "color correction function," which accounts for the discrepancy between the characteristics of imaged reference color set 204 and control reference color set 208.). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of GUPTA in view of BHATTI of a system for user skin color determination of a user, the system comprising: a skin analysis assembly, configured to: obtain a user skin image of the user, comprising one or more user skin image pixels; perform color processing on the user skin image to arrive at a user skin color; and output a result of the color processing, the result comprising the user skin color, with the teachings of BHATTI of based on one or more of the user skin color and the transposed user skin color. The combination would have GUPTA’s skin analysis system operate based on one or more of the user skin color and the transposed user skin color. The motivation behind the modification would have been to obtain a skin analysis and color determination system that enhances the accuracy of detecting skin and determining skin color in an image. 
Both GUPTA and BHATTI are skin analysis systems: in GUPTA, human skin is an example of digital image content that client device users may desire to select for performing image operations, while in BHATTI, the customer is presented with a smaller range of products from which to choose, but which are more suited to that customer based upon her needs. Please see GUPTA et al. (US 20190180083 A1), Paragraph [0014], and BHATTI et al. (US 20070071314 A1), Paragraph [0007]. GUPTA in view of BHATTI fail to explicitly teach provide a product recommendation, and wherein the result further comprises a product recommendation. However, HARVILLE explicitly teaches provide a product recommendation (Fig. 1. Paragraph [0024]-HARVILLE discloses in step 140 of FIG. 1, at least one product which corresponds with the classification color is recommended. In embodiments of the present invention, a result is generated in which at least one product is recommended to a user.), and wherein the result further comprises a product recommendation (Fig. 1. Paragraph [0024]-HARVILLE discloses in step 140 of FIG. 1, at least one product which corresponds with the classification color is recommended. In embodiments of the present invention, a result is generated in which at least one product is recommended to a user.). 
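To illustrate the HARVILLE-style step of recommending "at least one product which corresponds with the classification color" (a hypothetical sketch, not drawn from HARVILLE's actual implementation), products can simply be ranked by color distance from the determined user skin color. Product names and shade values below are invented:

```python
# Hypothetical product catalog: (name, shade RGB) pairs.
PRODUCTS = [
    ("foundation-A", (246, 228, 210)),
    ("foundation-B", (205, 165, 135)),
    ("foundation-C", (150, 105, 75)),
]


def recommend(user_color, products, top_n=2):
    """Return the top_n product names whose shades best match user_color."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    ranked = sorted(products, key=lambda item: dist2(user_color, item[1]))
    return [name for name, _ in ranked[:top_n]]


picks = recommend((200, 160, 130), PRODUCTS)
```

The ranking could equally be driven by the transposed user skin color (the alternative-system label) rather than raw RGB; the claim language permits either, since it recites "one or more of the user skin color and the transposed user skin color."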
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of GUPTA in view of BHATTI of a system for user skin color determination of a user, the system comprising: a skin analysis assembly, configured to: obtain a user skin image of the user, comprising one or more user skin image pixels; perform color processing on the user skin image to arrive at a user skin color; and output a result of the color processing, the result comprising the user skin color, with the teachings of HARVILLE of provide a product recommendation, and wherein the result further comprises a product recommendation. The combination would have GUPTA’s skin analysis system provide a product recommendation, such that the result further comprises a product recommendation. The motivation behind the modification would have been to obtain a skin analysis and color determination system that enhances the accuracy of detecting skin and determining skin color in an image. Both GUPTA and HARVILLE relate to skin analysis systems: in GUPTA, human skin is an example of digital image content that client device users may desire to select for performing image operations, while in HARVILLE, the customer is presented with a smaller range of products from which to choose, but which are more suited to that customer based upon her needs. Please see GUPTA et al. (US 20190180083 A1), Paragraph [0014], and HARVILLE et al. (US 20070058858 A1), Paragraph [0005]. Regarding claim 11, GUPTA in view of BHATTI and further in view of HARVILLE explicitly teach the system of claim 10; GUPTA further explicitly teaches wherein the system is further configured to (Fig. 3, #314 called a color determination module. Paragraph [0049]): GUPTA fails to explicitly teach the user skin color and the transposed user skin color. However, BHATTI explicitly teaches the user skin color and the transposed user skin color (Fig. 2. 
Paragraph [0036]-BHATTI discloses image analysis system 205 can determine a transformation, or "color correction function," which accounts for the discrepancy between the characteristics of imaged reference color set 204 and control reference color set 208.). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of GUPTA in view of BHATTI and further in view of HARVILLE of a system for user skin color determination of a user, the system comprising: a skin analysis assembly, configured to: obtain a user skin image of the user, comprising one or more user skin image pixels; perform color processing on the user skin image to arrive at a user skin color; and output a result of the color processing, the result comprising the user skin color, with the teachings of BHATTI of the user skin color and the transposed user skin color. The combination would have GUPTA’s skin analysis system use the user skin color and the transposed user skin color. The motivation behind the modification would have been to obtain a skin analysis and color determination system that enhances the accuracy of detecting skin and determining skin color in an image. Both GUPTA and BHATTI are skin analysis systems: in GUPTA, human skin is an example of digital image content that client device users may desire to select for performing image operations, while in BHATTI, the customer is presented with a smaller range of products from which to choose, but which are more suited to that customer based upon her needs. Please see GUPTA et al. (US 20190180083 A1), Paragraph [0014], and BHATTI et al. (US 20070071314 A1), Paragraph [0007]. GUPTA in view of BHATTI fail to explicitly teach receive an empirical feedback, based on one or more of the product recommendation. However, HARVILLE explicitly teaches receive an empirical feedback (Fig. 2. 
Paragraph [0045]-HARVILLE discloses subject 203 can identify additional parameters using, for example, a web interface. These parameters can be used by product recommendation system 220 to further identify the product(s) in which the user is interested. For example, users can indicate that they are interested in clothing, hair coloring, makeup, etc. The users may further indicate specific product groups in which they are interested such as eye makeup, foundation, lipstick, etc. Demographic information may also be collected to further refine a product consultation (wherein the additional parameters are feedback).), based on one or more of the product recommendation (Fig. 1. Paragraph [0024]-HARVILLE discloses in step 140 of FIG. 1, at least one product which corresponds with the classification color is recommended. In embodiments of the present invention, a result is generated in which at least one product is recommended to a user.). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of GUPTA in view of BHATTI of a system for user skin color determination of a user, the system comprising: a skin analysis assembly, configured to: obtain a user skin image of the user, comprising one or more user skin image pixels; perform color processing on the user skin image to arrive at a user skin color; and output a result of the color processing, the result comprising the user skin color, with the teachings of HARVILLE of receive an empirical feedback, based on one or more of the product recommendation. The combination would have GUPTA’s skin analysis system receive an empirical feedback based on one or more of the product recommendation. The motivation behind the modification would have been to obtain a skin analysis and color determination system that enhances the accuracy of detecting skin and determining skin color in an image. 
Both GUPTA and HARVILLE relate to skin analysis systems: in GUPTA, human skin is an example of digital image content that client device users may desire to select for performing image operations, while in HARVILLE, the customer is presented with a smaller range of products from which to choose, but which are more suited to that customer based upon her needs. Please see GUPTA et al. (US 20190180083 A1), Paragraph [0014], and HARVILLE et al. (US 20070058858 A1), Paragraph [0005]. Regarding claim 22, GUPTA in view of BHATTI explicitly teach the method of claim 21; GUPTA further explicitly teaches the method further comprising (Fig. 3, #314 called a color determination module. Paragraph [0049]): GUPTA fails to explicitly teach based on one or more of the user skin color and the transposed user skin color. However, BHATTI explicitly teaches based on one or more of the user skin color and the transposed user skin color (Fig. 2. Paragraph [0036]-BHATTI discloses image analysis system 205 can determine a transformation, or "color correction function," which accounts for the discrepancy between the characteristics of imaged reference color set 204 and control reference color set 208.). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of GUPTA in view of BHATTI of a method for user skin color determination of a user, the method comprising: obtaining, by a skin analysis assembly, a user skin image of the user, comprising one or more user skin image pixels; performing color processing on the user skin image to arrive at a user skin color; and outputting a result of the color processing, the result comprising the user skin color, with the teachings of BHATTI of based on one or more of the user skin color and the transposed user skin color. 
The combination would have GUPTA’s skin analysis method operate based on one or more of the user skin color and the transposed user skin color. The motivation behind the modification would have been to obtain a skin analysis and color determination system that enhances the accuracy of detecting skin and determining skin color in an image. Both GUPTA and BHATTI are skin analysis systems: in GUPTA, human skin is an example of digital image content that client device users may desire to select for performing image operations, while in BHATTI, the customer is presented with a smaller range of products from which to choose, but which are more suited to that customer based upon her needs. Please see GUPTA et al. (US 20190180083 A1), Paragraph [0014], and BHATTI et al. (US 20070071314 A1), Paragraph [0007]. GUPTA in view of BHATTI fail to explicitly teach providing a product recommendation, and wherein the result further comprises a product recommendation. However, HARVILLE explicitly teaches providing a product recommendation (Fig. 1. Paragraph [0024]-HARVILLE discloses in step 140 of FIG. 1, at least one product which corresponds with the classification color is recommended. In embodiments of the present invention, a result is generated in which at least one product is recommended to a user.), and wherein the result further comprises a product recommendation (Fig. 1. Paragraph [0024]-HARVILLE discloses in step 140 of FIG. 1, at least one product which corresponds with the classification color is recommended. In embodiments of the present invention, a result is generated in which at least one product is recommended to a user.). 
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of GUPTA in view of BHATTI of a method for user skin color determination of a user, the method comprising: obtaining, by a skin analysis assembly, a user skin image of the user, comprising one or more user skin image pixels; performing color processing on the user skin image to arrive at a user skin color; and outputting a result of the color processing, the result comprising the user skin color, with the teachings of HARVILLE of providing a product recommendation, and wherein the result further comprises a product recommendation. The combination would have GUPTA’s skin analysis method provide a product recommendation, such that the result further comprises a product recommendation. The motivation behind the modification would have been to obtain a skin analysis and color determination system that enhances the accuracy of detecting skin and determining skin color in an image. Both GUPTA and HARVILLE relate to skin analysis systems: in GUPTA, human skin is an example of digital image content that client device users may desire to select for performing image operations, while in HARVILLE, the customer is presented with a smaller range of products from which to choose, but which are more suited to that customer based upon her needs. Please see GUPTA et al. (US 20190180083 A1), Paragraph [0014], and HARVILLE et al. (US 20070058858 A1), Paragraph [0005]. Regarding claim 23, GUPTA in view of BHATTI and further in view of HARVILLE explicitly teach the method of claim 22; GUPTA further explicitly teaches the method further comprising (Fig. 3, #314 called a color determination module. Paragraph [0049]): GUPTA fails to explicitly teach the user skin color and the transposed user skin color. 
However, BHATTI explicitly teaches the user skin color and the transposed user skin color (Paragraph [0036]-BHATTI discloses image analysis system 205 can determine a transformation, or "color correction function," which accounts for the discrepancy between the characteristics of imaged reference color set 204 and control reference color set 208.). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of GUPTA in view of BHATTI and further in view of HARVILLE of a method for user skin color determination of a user, the method comprising: obtaining, by a skin analysis assembly, a user skin image of the user, comprising one or more user skin image pixels; performing color processing on the user skin image to arrive at a user skin color; and outputting a result of the color processing, the result comprising the user skin color, with the teachings of BHATTI of the user skin color and the transposed user skin color. The combination would have GUPTA’s skin analysis method use the user skin color and the transposed user skin color. The motivation behind the modification would have been to obtain a skin analysis and color determination system that enhances the accuracy of detecting skin and determining skin color in an image. Both GUPTA and BHATTI are skin analysis systems: in GUPTA, human skin is an example of digital image content that client device users may desire to select for performing image operations, while in BHATTI, the customer is presented with a smaller range of products from which to choose, but which are more suited to that customer based upon her needs. Please see GUPTA et al. (US 20190180083 A1), Paragraph [0014], and BHATTI et al. (US 20070071314 A1), Paragraph [0007]. GUPTA in view of BHATTI fail to explicitly teach receiving an empirical feedback, based on one or more of the product recommendation. 
However, HARVILLE explicitly teaches receiving an empirical feedback (Paragraph [0045]-HARVILLE discloses subject 203 can identify additional parameters using, for example, a web interface. These parameters can be used by product recommendation system 220 to further identify the product(s) in which the user is interested. For example, users can indicate that they are interested in clothing, hair coloring, makeup, etc. The users may further indicate specific product groups in which they are interested such as eye makeup, foundation, lipstick, etc. Demographic information may also be collected to further refine a product consultation (wherein the additional parameters are feedback).), based on one or more of the product recommendation (Paragraph [0024]-HARVILLE discloses in step 140 of FIG. 1, at least one product which corresponds with the classification color is recommended. In embodiments of the present invention, a result is generated in which at least one product is recommended to a user.). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of GUPTA in view of BHATTI of a method for user skin color determination of a user, the method comprising: obtaining, by a skin analysis assembly, a user skin image of the user, comprising one or more user skin image pixels; performing color processing on the user skin image to arrive at a user skin color; and outputting a result of the color processing, the result comprising the user skin color, with the teachings of HARVILLE of receiving an empirical feedback, based on one or more of the product recommendation. The combination would have GUPTA’s skin analysis method receive an empirical feedback based on one or more of the product recommendation. 
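HARVILLE's Paragraph [0045] feedback step, in which user-identified parameters (interest categories such as foundation or lipstick) refine the recommendation, can be sketched as a simple filter-then-rank step. This is a hypothetical illustration only; the catalog, field names, and values are invented:

```python
# Hypothetical catalog with product categories, mirroring HARVILLE's
# user-indicated product groups (foundation, lipstick, etc.).
CATALOG = [
    {"name": "lipstick-1", "category": "lipstick", "shade": (180, 60, 70)},
    {"name": "foundation-1", "category": "foundation", "shade": (205, 165, 135)},
    {"name": "foundation-2", "category": "foundation", "shade": (150, 105, 75)},
]


def refine(user_color, catalog, interests):
    """Keep only products in the user's stated interest categories,
    then rank the survivors by color distance from user_color."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    pool = [p for p in catalog if p["category"] in interests]
    return sorted(pool, key=lambda p: dist2(user_color, p["shade"]))


refined = refine((200, 160, 130), CATALOG, {"foundation"})
```

The empirical feedback (interest categories here; demographics or satisfaction signals in a fuller system) narrows the candidate pool before the color match is applied, which is the "smaller range of products ... more suited to that customer" effect the references describe.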
The motivation behind the modification would have been to obtain a skin analysis and color determination system that enhances the accuracy of detecting skin and determining skin color in an image. Both GUPTA and HARVILLE relate to skin analysis systems: in GUPTA, human skin is an example of digital image content that client device users may desire to select for performing image operations, while in HARVILLE, the customer is presented with a smaller range of products from which to choose, but which are more suited to that customer based upon her needs. Please see GUPTA et al. (US 20190180083 A1), Paragraph [0014], and HARVILLE et al. (US 20070058858 A1), Paragraph [0005]. Regarding claim 27, GUPTA in view of BHATTI explicitly teach the system of claim 26; GUPTA further explicitly teaches wherein the system is further configured to (Fig. 3, #314 called a color determination module. Paragraph [0049]): GUPTA fails to explicitly teach based on one or more of the user skin color and the transposed user skin color. However, BHATTI explicitly teaches based on one or more of the user skin color and the transposed user skin color (Fig. 2. Paragraph [0036]-BHATTI discloses image analysis system 205 can determine a transformation, or "color correction function," which accounts for the discrepancy between the characteristics of imaged reference color set 204 and control reference color set 208.). 
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of GUPTA in view of BHATTI of a system for user skin color determination of a user, the system comprising: a skin analysis assembly, configured to: obtain a user skin image of the user, comprising one or more user skin image pixels; perform color processing on the user skin image to arrive at a user skin color, wherein the color processing further comprises: with the teachings of BHATTI of perform a color transposition to the user skin color, based on a transposition between the system and an alternative skin color system, and wherein the result further comprises a transposed user skin color. The combination would have GUPTA’s skin analysis system perform a color transposition to the user skin color, based on a transposition between the system and an alternative skin color system, such that the result further comprises a transposed user skin color. The motivation behind the modification would have been to obtain a skin analysis and color determination system that enhances the accuracy of detecting skin and determining skin color in an image. Both GUPTA and BHATTI are skin analysis systems: in GUPTA, human skin is an example of digital image content that client device users may desire to select for performing image operations, while in BHATTI, the customer is presented with a smaller range of products from which to choose, but which are more suited to that customer based upon her needs. Please see GUPTA et al. (US 20190180083 A1), Paragraph [0014], and BHATTI et al. (US 20070071314 A1), Paragraph [0007]. GUPTA in view of BHATTI fail to explicitly teach perform a product recommendation, and wherein the result further comprises a product recommendation. However, HARVILLE explicitly teaches perform a product recommendation (Fig. 1. Paragraph [0024]-HARVILLE discloses in step 140 of FIG. 
1, at least one product which corresponds with the classification color is recommended. In embodiments of the present invention, a result is generated in which at least one product is recommended to a user.), and wherein the result further comprises a product recommendation (Fig. 1. Paragraph [0024]-HARVILLE discloses in step 140 of FIG. 1, at least one product which corresponds with the classification color is recommended. In embodiments of the present invention, a result is generated in which at least one product is recommended to a user.). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of GUPTA in view of BHATTI of a system for user skin color determination of a user, the system comprising: a skin analysis assembly, configured to: obtain a user skin image of the user, comprising one or more user skin image pixels; perform color processing on the user skin image to arrive at a user skin color, wherein the color processing further comprises: with the teachings of HARVILLE of perform a product recommendation, and wherein the result further comprises a product recommendation. The combination would have GUPTA’s skin analysis system perform a product recommendation, such that the result further comprises a product recommendation. The motivation behind the modification would have been to obtain a skin analysis and color determination system that enhances the accuracy of detecting skin and determining skin color in an image. 
Both GUPTA and HARVILLE relate to skin analysis systems: in GUPTA, human skin is an example of digital image content that client device users may desire to select for performing image operations, while in HARVILLE, the customer is presented with a smaller range of products from which to choose, but which are more suited to that customer based upon her needs. Please see GUPTA et al. (US 20190180083 A1), Paragraph [0014], and HARVILLE et al. (US 20070058858 A1), Paragraph [0005]. Regarding claim 31, GUPTA in view of BHATTI explicitly teach the method of claim 30; GUPTA fails to explicitly teach further comprising: based on one or more of the user skin color and the transposed user skin color. However, BHATTI explicitly teaches based on one or more of the user skin color and the transposed user skin color (Fig. 2. Paragraph [0036]-BHATTI discloses image analysis system 205 can determine a transformation, or "color correction function," which accounts for the discrepancy between the characteristics of imaged reference color set 204 and control reference color set 208.). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of GUPTA in view of BHATTI of a method for user skin color determination of a user, the method comprising: obtaining a user skin image of the user, comprising one or more user skin image pixels; performing color processing on the user skin image to determine a user skin color; and outputting the result of the color processing, the result comprising the user skin color, with the teachings of BHATTI of based on one or more of the user skin color and the transposed user skin color. The combination would have GUPTA’s skin analysis method operate based on one or more of the user skin color and the transposed user skin color. 
The motivation for the modification would have been to obtain a skin analysis and color determination system that enhances the accuracy of detecting skin and determining skin color in an image. Both GUPTA and BHATTI are skin analysis systems: in GUPTA, human skin is an example of digital image content that client device users may desire to select for performing image operations, while in BHATTI, the customer is presented with a smaller range of products from which to choose, but ones more suited to that customer based upon her needs. See GUPTA et al. (US 20190180083 A1), Paragraph [0014], and BHATTI et al. (US 20070071314 A1), Paragraph [0007]. GUPTA in view of BHATTI fails to explicitly teach handling a product recommendation, wherein the result further comprises a product recommendation. However, HARVILLE explicitly teaches handling a product recommendation, wherein the result further comprises a product recommendation (Fig. 1, Paragraph [0024]: HARVILLE discloses that in step 140 of FIG. 1, at least one product which corresponds with the classification color is recommended, and a result is generated in which at least one product is recommended to a user.).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of GUPTA in view of BHATTI of a method for user skin color determination of a user, the method comprising: obtaining a user skin image of the user, comprising one or more user skin image pixels; performing color processing on the user skin image to determine a user skin color; and outputting the result of the color processing, the result comprising the user skin color, with the teachings of HARVILLE of handling a product recommendation, wherein the result further comprises a product recommendation. That is, GUPTA's skin analysis method would handle a product recommendation, with the result further comprising the product recommendation. The motivation for the modification would have been to obtain a skin analysis and color determination system that enhances the accuracy of detecting skin and determining skin color in an image. Both GUPTA and HARVILLE relate to skin analysis systems: in GUPTA, human skin is an example of digital image content that client device users may desire to select for performing image operations, while in HARVILLE, the customer is presented with a smaller range of products from which to choose, but ones more suited to that customer based upon her needs. See GUPTA et al. (US 20190180083 A1), Paragraph [0014], and HARVILLE et al. (US 20070058858 A1), Paragraph [0005].

Claims 8 and 20 are rejected under 35 U.S.C. 103 as being unpatentable over GUPTA et al. (US 20190180083 A1), hereinafter referenced as GUPTA, in view of STEINBERG (US 6873743 B2), hereinafter referenced as STEINBERG.
Regarding claim 8, GUPTA teaches the system of claim 5, but fails to explicitly teach wherein the calculating of the per-pixel redness threshold is by testing various per-pixel redness threshold values and choosing the per-pixel redness threshold value that minimizes the delta E from skin images that already had LAB values. However, STEINBERG explicitly teaches this limitation (Fig. 2(a), Col. 8, Lines 29-33: STEINBERG discloses that this is easily achieved by setting a threshold on the "a" and "L" components of a pixel; thus, if the conditions "a" > "a.sub.RED" and "L" > "L.sub.RED" are met, then a pixel is sufficiently red to be a potential member of a potential red-eye segment. Further, in Col. 9, Lines 46-53, STEINBERG discloses that at this point we have a set of potential red-eye segments for the image; however, as will be apparent to those skilled in the art, most of the segments identified in this initial stage of the detection process will not be due to red-eye defects, and the next stage of the detection process is to eliminate those segments that do not satisfy a range of other characteristics and criteria normally associated with a segment that is a valid red-eye defect.).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of GUPTA of a system for user skin color determination of a user, the system comprising: a skin analysis assembly configured to: obtain a user skin image of the user, comprising one or more user skin image pixels; perform color processing on the user skin image to arrive at a user skin color; and output a result of the color processing, the result comprising the user skin color, with the teachings of STEINBERG of calculating the per-pixel redness threshold by testing various per-pixel redness threshold values and choosing the value that minimizes the delta E from skin images that already had LAB values. That is, GUPTA's skin analysis system would calculate the per-pixel redness threshold in this manner. The motivation for the modification would have been to obtain a skin analysis and color determination system that enhances the accuracy of detecting skin and determining skin color in an image. Both GUPTA and STEINBERG relate to skin detection systems: in GUPTA, human skin is an example of digital image content that client device users may desire to select for performing image operations, while in STEINBERG, color artifacts due to the misclassification of the membership of pixels in a defect are avoided. See GUPTA et al. (US 20190180083 A1), Paragraph [0014], and STEINBERG (US 6873743 B2), Col. 5.
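The threshold-selection scheme recited in claim 8 (sweep candidate per-pixel redness thresholds and keep the one whose resulting skin color is closest, in delta E terms, to skin images with known L*a*b* values) can be sketched as follows. This is an illustrative sketch only, not code from any cited reference; the CIE76 delta E (Euclidean distance in L*a*b*) and the use of the a* channel as the redness measure are assumptions.

```python
import numpy as np

def delta_e(lab1, lab2):
    """CIE76 color difference: Euclidean distance in L*a*b* space."""
    return float(np.linalg.norm(np.asarray(lab1, dtype=float) - np.asarray(lab2, dtype=float)))

def pick_redness_threshold(images_lab, ground_truth_lab, candidates):
    """Sweep candidate per-pixel a* (redness) thresholds; keep the one whose
    resulting mean skin color has the lowest mean delta E against the known
    L*a*b* values of the reference skin images."""
    best_t, best_err = None, float("inf")
    for t in candidates:
        errs = []
        for img, truth in zip(images_lab, ground_truth_lab):
            mask = img[..., 1] > t          # keep pixels redder than threshold
            if not mask.any():
                continue                    # threshold too strict for this image
            mean_lab = img[mask].mean(axis=0)
            errs.append(delta_e(mean_lab, truth))
        if errs and np.mean(errs) < best_err:
            best_t, best_err = t, float(np.mean(errs))
    return best_t, best_err
```

A stricter threshold rejects more non-skin pixels but risks discarding genuine skin; the sweep lets the reference data decide the trade-off.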
Regarding claim 20, GUPTA teaches the method of claim 17, but fails to explicitly teach wherein the calculating of the per-pixel redness threshold is by testing various per-pixel redness threshold values and choosing the per-pixel redness threshold value that minimizes the delta E from skin images that already had LAB values. However, STEINBERG explicitly teaches this limitation (Fig. 2(a), Col. 8, Lines 29-33: STEINBERG discloses that this is easily achieved by setting a threshold on the "a" and "L" components of a pixel; thus, if the conditions "a" > "a.sub.RED" and "L" > "L.sub.RED" are met, then a pixel is sufficiently red to be a potential member of a potential red-eye segment. Further, in Col. 9, Lines 46-53, STEINBERG discloses that at this point we have a set of potential red-eye segments for the image; however, most of the segments identified in this initial stage of the detection process will not be due to red-eye defects, and the next stage of the detection process is to eliminate those segments that do not satisfy a range of other characteristics and criteria normally associated with a valid red-eye defect.).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of GUPTA of a method for user skin color determination of a user, the method comprising: obtaining, by a skin analysis assembly, a user skin image of the user, comprising one or more user skin image pixels, with the teachings of STEINBERG of calculating the per-pixel redness threshold by testing various per-pixel redness threshold values and choosing the value that minimizes the delta E from skin images that already had LAB values. That is, GUPTA's skin analysis method would calculate the per-pixel redness threshold in this manner. The motivation for the modification would have been to obtain a skin analysis and color determination system that enhances the accuracy of detecting skin and determining skin color in an image. Both GUPTA and STEINBERG relate to skin detection systems: in GUPTA, human skin is an example of digital image content that client device users may desire to select for performing image operations, while in STEINBERG, color artifacts due to the misclassification of the membership of pixels in a defect are avoided. See GUPTA et al. (US 20190180083 A1), Paragraph [0014], and STEINBERG (US 6873743 B2), Col. 5.

Claims 6 and 18 are rejected under 35 U.S.C. 103 as being unpatentable over GUPTA et al. (US 20190180083 A1), hereinafter referenced as GUPTA, in view of LEE et al. (US 20210279547 A1), hereinafter referenced as LEE.

Regarding claim 6, GUPTA teaches the system of claim 5, but fails to explicitly teach wherein the determining further comprises using a human perception mimicking algorithm developed using empirical A/B testing results.
However, LEE explicitly teaches wherein the determining further comprises using a human perception mimicking algorithm developed using empirical A/B testing results (Fig. 2, Paragraph [0133]: LEE discloses that this is the first attempt to formally test the possibility that computational models mimicking the way the brain solves general problems can lead to practical solutions to key challenges in machine learning.). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of GUPTA of a system for user skin color determination of a user, the system comprising: a skin analysis assembly configured to: obtain a user skin image of the user, comprising one or more user skin image pixels; perform color processing on the user skin image to arrive at a user skin color; and output a result of the color processing, the result comprising the user skin color, with the teachings of LEE of determining using a human perception mimicking algorithm developed using empirical A/B testing results. That is, GUPTA's skin analysis system would perform the determining using such an algorithm. The motivation for the modification would have been to obtain a skin analysis and color determination system that enhances the accuracy of detecting skin and determining skin color in an image. Both GUPTA and LEE relate to computer modules receiving human input: in GUPTA, human skin is an example of digital image content that client device users may desire to select for performing image operations, while in LEE, humans' reinforcement learning enables minimal-supervision learning. See GUPTA et al. (US 20190180083 A1), Paragraph [0014], and LEE et al. (US 20210279547 A1), Paragraph [0098].
Regarding claim 18, GUPTA teaches the method of claim 17, but fails to explicitly teach wherein the determining further comprises using a human perception mimicking algorithm developed using empirical A/B testing results. However, LEE explicitly teaches this limitation (Fig. 2, Paragraph [0133]: LEE discloses that this is the first attempt to formally test the possibility that computational models mimicking the way the brain solves general problems can lead to practical solutions to key challenges in machine learning.). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of GUPTA of a method for user skin color determination of a user, the method comprising: obtaining, by a skin analysis assembly, a user skin image of the user, comprising one or more user skin image pixels, with the teachings of LEE of determining using a human perception mimicking algorithm developed using empirical A/B testing results. That is, GUPTA's skin analysis method would perform the determining using such an algorithm. The motivation for the modification would have been to obtain a skin analysis and color determination system that enhances the accuracy of detecting skin and determining skin color in an image. Both GUPTA and LEE relate to computer modules receiving human input: in GUPTA, human skin is an example of digital image content that client device users may desire to select for performing image operations, while in LEE, humans' reinforcement learning enables minimal-supervision learning. See GUPTA et al. (US 20190180083 A1), Paragraph [0014], and LEE et al. (US 20210279547 A1), Paragraph [0098].
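To make concrete what "developed using empirical A/B testing results" could mean in practice, one hedged sketch: among several candidate scoring functions, keep the one that agrees most often with recorded human A/B choices. This is an illustration of the general technique only; the function names and trial format are hypothetical, not drawn from GUPTA or LEE.

```python
def fit_perceptual_choice(candidates, ab_trials):
    """Pick the candidate scoring function that most often agrees with
    recorded human A/B preferences.

    candidates: dict name -> score function (higher score = 'better match')
    ab_trials:  list of (option_a, option_b, human_pick), human_pick in {'a', 'b'}
    """
    def agreement(fn):
        hits = 0
        for a, b, pick in ab_trials:
            model_pick = "a" if fn(a) >= fn(b) else "b"
            hits += (model_pick == pick)
        return hits / len(ab_trials)
    # the winner is the candidate whose choices best mimic human perception
    return max(candidates, key=lambda name: agreement(candidates[name]))
```

The A/B trials act as the empirical ground truth: the selected function is the one whose pairwise preferences best mimic the human panel's.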
Claims 7 and 19 are rejected under 35 U.S.C. 103 as being unpatentable over GUPTA et al. (US 20190180083 A1), hereinafter referenced as GUPTA, in view of CHEN et al. (US 20050169520 A1), hereinafter referenced as CHEN.

Regarding claim 7, GUPTA teaches the system of claim 5, but fails to explicitly teach wherein the image redness threshold is based on one or more of a processing power of the system, a desired speed, and a desired accuracy of the calculated user skin color. However, CHEN explicitly teaches this limitation (Fig. 1, Paragraph [0071]: CHEN discloses that at least one characteristic value of each pixel in each of the first number of candidates for red eye region is evaluated; if the evaluated characteristic value does not meet a standard set for a red eye pixel, the evaluated pixel is removed from the relevant candidate for red eye region. Further, in Paragraph [0042], CHEN discloses that according to the method, apparatus, and storage medium for detecting a red eye of the present invention, red eyes are detected based on candidates for red eye regions from which skin color pixels have been removed; thus, the speed and accuracy of detecting red eyes are greatly increased.).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of GUPTA of a system for user skin color determination of a user, the system comprising: a skin analysis assembly configured to: obtain a user skin image of the user, comprising one or more user skin image pixels; perform color processing on the user skin image to arrive at a user skin color; and output a result of the color processing, the result comprising the user skin color, with the teachings of CHEN of basing the image redness threshold on one or more of a processing power of the system, a desired speed, and a desired accuracy of the calculated user skin color. That is, GUPTA's skin analysis system would base the image redness threshold on one or more of those factors. The motivation for the modification would have been to obtain a skin analysis and color determination system that enhances the accuracy of detecting skin and determining skin color in an image. Both GUPTA and CHEN involve detecting aspects of the human body: in GUPTA, human skin is an example of digital image content that client device users may desire to select for performing image operations, while in CHEN, the speed and accuracy of detecting red eyes are greatly increased. See GUPTA et al. (US 20190180083 A1), Paragraph [0014], and CHEN et al. (US 20050169520 A1), Paragraph [0098].
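Claim 7's limitation (an image redness threshold driven by processing power, desired speed, or desired accuracy) could be realized, for example, as a small profile table selected at runtime. The profiles, field names, and numeric values below are purely illustrative assumptions, not taken from GUPTA or CHEN.

```python
from dataclasses import dataclass

@dataclass
class RednessProfile:
    threshold: float   # image redness threshold (hypothetical a* units)
    downsample: int    # process every Nth pixel, trading accuracy for speed

# Hypothetical speed/accuracy profiles
PROFILES = {
    "fast":     RednessProfile(threshold=18.0, downsample=4),
    "balanced": RednessProfile(threshold=14.0, downsample=2),
    "accurate": RednessProfile(threshold=10.0, downsample=1),
}

def choose_profile(cpu_score, prefer_accuracy):
    """Select a redness-threshold profile from the device's processing power
    (cpu_score in [0, 1]) and the caller's speed/accuracy preference."""
    if prefer_accuracy and cpu_score >= 0.5:
        return PROFILES["accurate"]
    return PROFILES["balanced"] if cpu_score >= 0.5 else PROFILES["fast"]
```

Keeping the threshold in a profile table makes the speed/accuracy trade-off an explicit configuration decision rather than a hard-coded constant.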
Regarding claim 19, GUPTA teaches the method of claim 17, but fails to explicitly teach wherein the image redness threshold is based on one or more of a processing power of the system, a desired speed, and a desired accuracy of the calculated user skin color. However, CHEN explicitly teaches this limitation (Fig. 1, Paragraph [0071]: CHEN discloses that at least one characteristic value of each pixel in each of the first number of candidates for red eye region is evaluated; if the evaluated characteristic value does not meet a standard set for a red eye pixel, the evaluated pixel is removed from the relevant candidate for red eye region. Further, in Paragraph [0042], CHEN discloses that red eyes are detected based on candidates for red eye regions from which skin color pixels have been removed; thus, the speed and accuracy of detecting red eyes are greatly increased.). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of GUPTA of a method for user skin color determination of a user, the method comprising: obtaining, by a skin analysis assembly, a user skin image of the user, comprising one or more user skin image pixels, with the teachings of CHEN of basing the image redness threshold on one or more of a processing power of the system, a desired speed, and a desired accuracy of the calculated user skin color. That is, GUPTA's skin analysis method would base the image redness threshold on one or more of those factors.
The motivation for the modification would have been to obtain a skin analysis and color determination system that enhances the accuracy of detecting skin and determining skin color in an image. Both GUPTA and CHEN involve detecting aspects of the human body: in GUPTA, human skin is an example of digital image content that client device users may desire to select for performing image operations, while in CHEN, the speed and accuracy of detecting red eyes are greatly increased. See GUPTA et al. (US 20190180083 A1), Paragraph [0014], and CHEN et al. (US 20050169520 A1), Paragraph [0098].

Conclusion

The following prior art, made of record and not relied upon, is considered pertinent to applicant's disclosure.

Chhibber et al. (US 20070064979 A1) - Systems and methods are provided for automatic identification of a person based on an analysis of the person's skin. In one embodiment, a method for automatically identifying a person comprises acquiring white-light and UV images of a portion of the person's skin, generating a skin mask from the white-light image, and comparing the skin mask with a pre-stored skin mask of the person. If a substantial match is not found between the two skin masks, the person is not identified, and an error message such as "wrong person" or "person unknown" is returned. Otherwise, the method proceeds to obtain results associated with certain skin conditions using at least the UV image. The results are compared with pre-stored results to determine if the person is the right person or the wrong person.

PAI et al. (US 20180276732 A1) - A skin product fitting method and an electronic apparatus therefor are provided.
The method includes the following steps: capturing a skin image of a current user by using an image capturing device and analyzing the skin image to determine a skin concern; obtaining a skin type of the current user; and sorting a plurality of skin products stored in a product database at least according to the skin concern and the skin type, so as to provide product information fitting a skin condition of the current user.

SHIFFER et al. (US 20060257132 A1) - Various methods and apparatus for multi-level red eye correction in a digital image are described. In an embodiment, a method receives an input that identifies an actual red eye in a digital image and a red eye candidate score associated with the actual red eye. The red eye candidate score of the actual red eye exceeds a detection threshold value. The method corrects the coloration of the actual red eye with a level of correction based on the red eye candidate score.

QU et al. (US 20100214421 A1) - A process for measuring skin color parameters from a digital photograph using novel color correction algorithms is described. The process includes measuring color values of a digital color photo, correcting the color deviation of each picture to that of a standard color, and converting the corrected RGB values to L*a*b* values, generating an output that is useful for describing changes in the color properties of the photographed skin.

KORICHI et al. (US 20120300050 A1) - A method and apparatus for characterizing the tone of the skin or integuments uses an apparatus capturing a digital image of a skin zone. The image is defined by a multiplicity of pixels (N) and transmitted to a digital image processing device for splitting the digital image into R, G and B colour planes. The apparatus includes means for extracting each of these planes R, G, B; and on each plane, calculation means for logging the grey level value for each of the pixels, i.e.
N values, which are optionally processed to obtain a value characteristic of the grey levels for each plane, corresponding to a value characteristic of the colour, as well as the luminosity value L*. The apparatus also characterizes the tone of the skin or integuments on the basis of the combination of the value characteristic of the colour and of the luminosity value L*.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to ETHAN N WOLFSON, whose telephone number is (571) 272-1898. The examiner can normally be reached Monday - Friday, 8:00 am - 5:00 pm. Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Chineyere Wills-Burns, can be reached at (571) 272-9752. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/ETHAN N WOLFSON/
Examiner, Art Unit 2673

/CHINEYERE WILLS-BURNS/
Supervisory Patent Examiner, Art Unit 2673
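Several of the pertinent references above (QU, KORICHI, and the LAB-based rejections) rely on converting corrected RGB values into CIE L*a*b*. For reference, the standard sRGB-to-L*a*b* conversion (D65 white point) can be sketched as below; the formulas are the published sRGB and CIELAB definitions, not code from any cited reference.

```python
def srgb_to_lab(r, g, b):
    """Convert 8-bit sRGB to CIE L*a*b* (D65 white point), per the standard
    sRGB linearization, sRGB-to-XYZ matrix, and CIELAB formulas."""
    def linearize(c):
        c /= 255.0
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4
    rl, gl, bl = linearize(r), linearize(g), linearize(b)
    # linear RGB -> XYZ (sRGB / D65 matrix)
    x = 0.4124564 * rl + 0.3575761 * gl + 0.1804375 * bl
    y = 0.2126729 * rl + 0.7151522 * gl + 0.0721750 * bl
    z = 0.0193339 * rl + 0.1191920 * gl + 0.9503041 * bl
    # normalize by the D65 reference white, then apply the CIELAB function
    def f(t):
        return t ** (1 / 3) if t > (6 / 29) ** 3 else t / (3 * (6 / 29) ** 2) + 4 / 29
    fx, fy, fz = f(x / 0.95047), f(y / 1.0), f(z / 1.08883)
    return 116 * fy - 16, 500 * (fx - fy), 200 * (fy - fz)
```

Once pixels are in L*a*b*, redness thresholds on a* and color differences (delta E) can be computed directly, which is the common thread across the STEINBERG, QU, and KORICHI disclosures.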

Prosecution Timeline

Mar 28, 2024
Application Filed
Feb 20, 2026
Non-Final Rejection — §102, §103 (current)


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability
Median Time to Grant: 2y 9m
PTA Risk: Low

Based on 0 resolved cases by this examiner. Grant probability derived from career allow rate.
