Prosecution Insights
Last updated: April 19, 2026
Application No. 18/852,246

DYNAMICALLY UPDATED AUTOMATIC MAKEUP APPLICATION

Non-Final OA: §102, §103, §112

Filed: Sep 27, 2024
Examiner: DANG, TRANG THANH
Art Unit: 3656
Tech Center: 3600 (Transportation & Electronic Commerce)
Assignee: Gemma Robotics Inc.
OA Round: 1 (Non-Final)

Grant Probability: 44% (Moderate)
Estimated OA Rounds: 1-2
Estimated Time to Grant: 3y 3m
Grant Probability With Interview: 75%

Examiner Intelligence

Career Allow Rate: 44% (grants 16 of 36 resolved cases; -7.6% vs TC avg)
Interview Lift: +30.7% (strong; resolved cases with interview vs. without)
Avg Prosecution: 3y 3m typical timeline; 24 applications currently pending
Total Applications: 60 career history, across all art units
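The headline figures above are simple arithmetic on the disclosed counts. A minimal sketch recomputing them, assuming the deltas are additive percentage points; the TC-average back-calculation is an inference from the stated -7.6% delta, not a figure given in this report:

```python
# Recompute the examiner metrics from the raw counts this report discloses
# (16 granted / 36 resolved; +30.7% interview lift; -7.6% vs TC avg).

granted, resolved = 16, 36
allow_rate = 100.0 * granted / resolved            # career allow rate
tc_average = allow_rate + 7.6                      # implied by the -7.6% delta
with_interview = allow_rate + 30.7                 # matches the 75% headline

print(f"Career allow rate:  {allow_rate:.1f}%")    # 44.4%, shown as 44%
print(f"Implied TC average: {tc_average:.1f}%")    # ~52%
print(f"With interview:     {with_interview:.1f}%")
```

The rounded 44% and 75% headline numbers fall out of the same 16/36 base rate.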

Statute-Specific Performance

§101: 7.9% (-32.1% vs TC avg)
§102: 21.0% (-19.0% vs TC avg)
§103: 39.8% (-0.2% vs TC avg)
§112: 28.7% (-11.3% vs TC avg)

TC averages are estimates. Based on career data from 36 resolved cases.
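A quick check of the per-statute figures, assuming each "vs TC avg" delta is the examiner's rate minus the Tech Center average: every delta backs out the same ~40% TC baseline, suggesting a single TC-wide reference rate behind all four numbers.

```python
# Per-statute rates and deltas as listed above (examiner rate, delta vs TC avg).
statute_rates = {
    "§101": (7.9, -32.1),
    "§102": (21.0, -19.0),
    "§103": (39.8, -0.2),
    "§112": (28.7, -11.3),
}

for statute, (rate, delta) in statute_rates.items():
    tc_avg = rate - delta  # e.g. §101: 7.9 - (-32.1) = 40.0
    print(f"{statute}: examiner {rate:.1f}% vs implied TC avg {tc_avg:.1f}%")
```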

Office Action (§102, §103, §112)
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Priority

This application is a U.S. national stage application under 35 U.S.C. § 371(b) of International Application No. PCT/IL2023/050334, filed March 30, 2023, which claims priority to U.S. Patent Application No. 63/325,322, filed March 30, 2022.

Information Disclosure Statement

The information disclosure statement (IDS) submitted on 12/20/2024 is in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement is being considered by the examiner.

Specification

Applicant is reminded of the proper content of an abstract of the disclosure. A patent abstract is a concise statement of the technical disclosure of the patent and should include that which is new in the art to which the invention pertains. The abstract should not refer to purported merits or speculative applications of the invention and should not compare the invention with the prior art. If the patent is of a basic nature, the entire technical disclosure may be new in the art, and the abstract should be directed to the entire disclosure. If the patent is in the nature of an improvement in an old apparatus, process, product, or composition, the abstract should include the technical disclosure of the improvement. The abstract should also mention by way of example any preferred modifications or alternatives. Where applicable, the abstract should include the following: (1) if a machine or apparatus, its organization and operation; (2) if an article, its method of making; (3) if a chemical compound, its identity and use; (4) if a mixture, its ingredients; (5) if a process, the steps. Extensive mechanical and design details of an apparatus should not be included in the abstract.
The abstract should be in narrative form and generally limited to a single paragraph within the range of 50 to 150 words in length. See MPEP § 608.01(b) for guidelines for the preparation of patent abstracts. The abstract of the disclosure is objected to because the abstract is not presented on a separate sheet, apart from any other text. A corrected abstract of the disclosure is required and must be presented on a separate sheet, apart from any other text. See MPEP § 608.01(b).

Claim Interpretation

The following is a quotation of 35 U.S.C. 112(f):

(f) Element in Claim for a Combination. – An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.

The following is a quotation of pre-AIA 35 U.S.C. 112, sixth paragraph:

An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.

The claims in this application are given their broadest reasonable interpretation using the plain meaning of the claim language in light of the specification as it would be understood by one of ordinary skill in the art. The broadest reasonable interpretation of a claim element (also commonly referred to as a claim limitation) is limited by the description in the specification when 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is invoked. As explained in MPEP § 2181, subsection I, claim limitations that meet the following three-prong test will be interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C.
112, sixth paragraph:

(A) the claim limitation uses the term “means” or “step” or a term used as a substitute for “means” that is a generic placeholder (also called a nonce term or a non-structural term having no specific structural meaning) for performing the claimed function;
(B) the term “means” or “step” or the generic placeholder is modified by functional language, typically, but not always, linked by the transition word “for” (e.g., “means for”) or another linking word or phrase, such as “configured to” or “so that”; and
(C) the term “means” or “step” or the generic placeholder is not modified by sufficient structure, material, or acts for performing the claimed function.

Use of the word “means” (or “step”) in a claim with functional language creates a rebuttable presumption that the claim limitation is to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites sufficient structure, material, or acts to entirely perform the recited function. Absence of the word “means” (or “step”) in a claim creates a rebuttable presumption that the claim limitation is not to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is not interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites function without reciting sufficient structure, material or acts to entirely perform the recited function. Claim limitations in this application that use the word “means” (or “step”) are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action.
Conversely, claim limitations in this application that do not use the word “means” (or “step”) are not being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action. This application includes one or more claim limitations that do not use the word “means,” but are nonetheless being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, because the claim limitation(s) uses a generic placeholder that is coupled with functional language without reciting sufficient structure to perform the recited function and the generic placeholder is not preceded by a structural modifier. Such claim limitation(s) is/are: “control unit” in claims 40, 41. Because this/these claim limitation(s) is/are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, it/they is/are being interpreted to cover the corresponding structure described in the specification as performing the claimed function, and equivalents thereof. If applicant does not intend to have this/these limitation(s) interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, applicant may: (1) amend the claim limitation(s) to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph (e.g., by reciting sufficient structure to perform the claimed function); or (2) present a sufficient showing that the claim limitation(s) recite(s) sufficient structure to perform the claimed function so as to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph.

Claim Rejections - 35 USC § 112

The following is a quotation of the first paragraph of 35 U.S.C.
112(a):

(a) IN GENERAL.—The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor or joint inventor of carrying out the invention.

The following is a quotation of the first paragraph of pre-AIA 35 U.S.C. 112:

The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor of carrying out his invention.

Claim 43 is rejected under 35 U.S.C. 112(a) or 35 U.S.C. 112 (pre-AIA), first paragraph, as failing to comply with the written description requirement. The claim contains subject matter which was not described in the specification in such a way as to reasonably convey to one skilled in the relevant art that the inventor or a joint inventor, or for applications subject to pre-AIA 35 U.S.C. 112, the inventor(s), at the time the application was filed, had possession of the claimed invention. Claim 43 recites “wherein said machine is configured to automatically attach and detach nozzles from said airbrush”. However, the subject matter recited is not described in the specification in such a way as to reasonably convey possession of the claimed invention. The specification merely provides that “Machine 100 may be configured to automatically attach and detach nozzles from Airbrush 120” (published paragraph [0045]), but does not provide any details regarding this automatic attaching and detaching of nozzles.
In other words, applicant has not shown possession of an invention or a design in which the nozzles are automatically attached to and detached from the airbrush. MPEP 2163.03 states, "The written description requirement is not necessarily met when the claim language appears in ipsis verbis in the specification. 'Even if a claim is supported by the specification, the language of the specification, to the extent possible, must describe the claimed invention so that one skilled in the art can recognize what is claimed. The appearance of mere indistinct words in a specification or a claim, even an original claim, does not necessarily satisfy that requirement.' Enzo Biochem, Inc. v. Gen-Probe, Inc., 323 F.3d 956, 968, 63 USPQ2d 1609, 1616 (Fed. Cir. 2002)." MPEP 2163, subsection II, part 2, states, "Information which is well known in the art need not be described in detail in the specification. See, e.g., Hybritech, Inc. v. Monoclonal Antibodies, Inc., 802 F.2d 1367, 1379-80, 231 USPQ 81, 90 (Fed. Cir. 1986). However, sufficient information must be provided to show that the inventor had possession of the invention as claimed." The design by which the nozzles are automatically attached to and detached from the airbrush must therefore be described in sufficient detail in the specification. The examiner notes that applicant presently has the burden to demonstrate possession of the claimed invention (see MPEP 2163, subsection II, part A: "[t]he burden was then properly shifted to [inventor] to cite to the examiner where adequate written description could be found or to make an amendment to address the deficiency." Id.; see also Stored Value Solutions, Inc. v. Card Activation Techs., 499 Fed. App'x 5, 13-14 (Fed. Cir. 2012)).

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C.
112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claims 28, 32, 33, 34 and 36 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.

Regarding claim 28: 1) it is unclear whether the limitation “the makeup plan” is referring to the limitation “makeup application plan” in claim 27. 2) It is unclear whether the limitation “the instructions of the makeup plan” is referring to the limitation “instructions for an automatic makeup applicator” in claim 27. The examiner assumes “the instructions of the makeup plan” is referring to “instructions for an automatic makeup applicator” for further examination. 3) Claim 28 recites the limitation "the face" in line 3. There is insufficient antecedent basis for this limitation in the claim. The examiner assumes “the face” is referring to “a face” for further examination. 4) It is unclear whether the limitation “said updating the makeup plan” is referring to the limitation “updating the makeup application plan” in claim 27. The examiner assumes “said updating the makeup plan” is referring to the limitation “updating the makeup application plan” for further examination. Therefore, this renders the claim indefinite. Appropriate correction is required.

Regarding claim 32: 1) the claim recites the limitation “the application distance at the first target area”.
There is insufficient antecedent basis for this limitation in the claim. 2) The claim recites the limitation “the application distance at the second target area”. There is insufficient antecedent basis for this limitation in the claim. Therefore, this renders the claim indefinite. Appropriate correction is required. The examiner assumes “wherein an application distance at the first target area is different than an application distance at the second target area” for further examination.

Regarding claim 33: 1) it is unclear whether the limitation “a pressure” is referring to “application pressure” in claim 31. The examiner assumes “a pressure” is referring to “application pressure” in claim 31 for further examination. 2) It is unclear whether the limitation “automatic makeup applicator” is referring to “the automatic makeup applicator” in claim 31. There is insufficient antecedent basis for this limitation in the claim. The examiner assumes “automatic makeup applicator” is referring to “the automatic makeup applicator” in claim 31 for further examination. 3) The examiner assumes the limitation “wherein the pressure to be used by the automatic makeup applicator at the first target area is different than the pressure at the second target area” is referring to “wherein a pressure to be used by the automatic makeup applicator at the first target area is different than a pressure at the second target area” for further examination. There is insufficient antecedent basis for this limitation in the claim. Therefore, this renders the claim indefinite. Appropriate correction is required.

Regarding claim 34, the term “sufficient” is a relative term which renders the claim indefinite. The term “sufficient” is not defined by the claim, the specification does not provide a standard for ascertaining the requisite degree, and one of ordinary skill in the art would not be reasonably apprised of the scope of the invention.
It is unclear what the metes and bounds of the claimed “sufficient time” are, and further how the claimed “sufficient time” is applied to avoid injury of the subject. Therefore, this renders the claim indefinite. Appropriate correction is required.

Regarding claim 36: 1) it is unclear what the limitation “the process of applying the makeup application plan” is referring to. There is insufficient antecedent basis for this limitation in the claim. 2) The limitation “a simulated outcome” should read “the simulated outcome”. There is insufficient antecedent basis for this limitation in the claim. Therefore, this renders the claim indefinite. Appropriate correction is required.

Claim Rejections - 35 USC § 102

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status. The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless –

(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.

Claims 27, 29, 30, 38, 40, 42, and 46 are rejected under 35 U.S.C. 102(a)(2) as being anticipated by Staton et al.
(US 20190053608 A1, hereinafter “Staton”).

Regarding claim 27, Staton discloses a method (Staton, see at least Figs. 6, 7) comprising: obtaining a makeup application plan (Staton, at least Fig. 6, step S140, creating custom session file 1300), the makeup application plan comprises instructions for an automatic makeup applicator (Staton, at least Fig. 7, par. [0036], “And using the custom session file 1300 and the then-current facial position information 1050, the processor 120 causes the carriage actuators 160 to selectively move the carriage in a defined manner”), the instructions, when implemented by the automatic makeup applicator, are configured to apply makeup materials on a subject in order to achieve a desired look for the subject (Staton, see at least Fig. 6, step S240, Fig. 7, steps S141-S144, par. [0036], “The personal preferences 1030 may include, for example, desired makeup style information, desired color palette information, available makeup, et cetera”; par. [0037, 0039], “… create the custom session file 1300 based on data from the facial-structure information 1010, the facial-tone information 1020, the user preference file 1030, and the baseline makeup application information 1220. And using the custom session file 1300 and the then-current facial position information 1050, the processor 120 causes the carriage actuators 160 to selectively move the carriage in a defined manner and causes the valves 175, 195 to selectively allow the makeup 20 to flow through the supply passage 174 and the output nozzle 172 to apply the makeup 20 to the face 10”); implementing by the automatic makeup applicator a portion of the makeup application plan (Staton, see at least Figs. 6, 8, par.
[0038-0039], steps S150 to S180 are repeated until the custom session file 1300 is fully executed to achieve the specific layout of the makeup mapped onto the face); obtaining sensor readings from a sensor during said implementing, wherein the sensor is configured to monitor movements of the subject (Staton, see at least Figs. 3, 6, step S150, par. [0029, 0036, 0038], scanning device 130 is configured to repeatedly obtain facial-structure information 1010 (which may include, for example, information regarding face shape and contour of the eyes, nose, cheeks, jaw, chin, and skin) and facial-position information 1050 (information regarding where the face 10 is located, preferably in real time to track movements of the person)); in response to identifying, based on the sensor readings, a movement of the subject, updating the makeup application plan, whereby obtaining a dynamically updated makeup application plan (Staton, see at least Fig. 6, steps S150-S170, par. [0036], “And using the custom session file 1300 and the then-current facial position information 1050, the processor 120 causes the carriage actuators 160 to selectively move the carriage in a defined manner and causes the valves 175, 195 to selectively allow the makeup 20 to flow through the supply passage 174 and the output nozzle 172 to apply the makeup 20 to the face 10”; par. [0038], “After step S140, the processor 120 at step S150 obtains updated facial-position information 1050 via the scanner 130 and stores the updated facial-position information 1050 in the computer memory 110.
And at steps S160 and S170, the processor 120 causes the carriage actuators 160 to move the carriage 150 in accordance with the custom session file 1300 and the facial-position information 1050, and causes the valves 175, 195 to selectively allow the makeup 20 to flow through the supply passage 174 and the output nozzle 172 to apply the makeup 20 to the face 10”); and implementing the dynamically updated makeup application plan or portion thereof in order to achieve the desired look for the subject while taking into account the movement of the subject and an implementation of the portion of the makeup application plan (Staton, see at least Fig. 6, steps S150-S170, par. [0038], “After step S140, the processor 120 at step S150 obtains updated facial-position information 1050 via the scanner 130 and stores the updated facial-position information 1050 in the computer memory 110. And at steps S160 and S170, the processor 120 causes the carriage actuators 160 to move the carriage 150 in accordance with the custom session file 1300 and the facial-position information 1050, and causes the valves 175, 195 to selectively allow the makeup 20 to flow through the supply passage 174 and the output nozzle 172 to apply the makeup 20 to the face 10. After steps S160 and S170, the processor 120 determines at S180 if the custom session file 1300 has been fully executed for the particular makeup 20. If not, the process returns to step S150”).

Regarding claim 29, Staton teaches all the limitations of claim 27. Staton further discloses wherein the instructions comprise movement instructions that yield a 3D trajectory to be followed by the automatic makeup applicator in order to achieve the desired look (Staton, see at least Fig. 2, par. [0032-0033], the automatic makeup applicator is able to move in x, y, and z directions and comprises a rotating member 167, which may rotate in multiple dimensions; Figs. 7, 8, par.
[0029], instruction file for controlling the carriage actuators 160 and the valves 175, 195 to achieve the specific layout of the makeup 20 mapped onto the face 10 based on three-dimensional information of the face 10. Examiner note: 3D trajectory is defined as trajectory in 3D space to achieve the chosen makeup look as described in par. [0047] of the instant specification); and wherein the instructions comprise material application instructions, each of which indicating an application location, a material to be applied and application properties to be implemented by the automatic makeup applicator (Staton, see at least Figs. 6, 7, par. [0036], “The cosmetic database 1200 includes makeup composition information 1210 (e.g., makeup identification information, makeup color information, et cetera) and baseline makeup application information 1220 (e.g., baseline routines for applying makeup to different face shapes; to different contours of the eyes, nose, cheeks, jaw, chin, and skin; to different skin tones; to obtain different makeup styles; et cetera)”; par. [0039], “At step S142, the processor 120 accesses the facial-structure information 1010 and the baseline makeup application information 1220, and look-up tables or logic causes the processor 120 to determine that the contour (makeup 20) should be applied to certain facial areas based on the oval face shape and the contour of the eyes, nose, cheeks, jaw, chin, and skin … And at step S144, the processor creates an instruction file for controlling the carriage actuators 160 and the valves 175, 195 to achieve the specific layout of the makeup 20 mapped onto the face 10 in step S143”).

Regarding claim 30, Staton teaches all the limitations of claim 27 as discussed above. Staton further discloses wherein said obtaining the makeup application plan comprises: obtaining a three-dimensional (3D) surface of a face of the subject (Staton, see at least Figs. 3, 6-8, par.
[0029], “The scanner 130 uses three-dimensional sensors 132 to capture information about the face 10, including facial-structure information 1010 (which may include, for example, information regarding face shape and contour of the eyes, nose, cheeks, jaw, chin, and skin) and facial-position information 1050 (information regarding where the face 10 is located, preferably in real time)”); obtaining the desired look, wherein the desired look is determined based on a user input indicating a required result of makeup application on the face of the subject (Staton, see at least Figs. 3, 6, par. [0030, 0037], “At step S100 at method S10, the processor 120 obtains user preference information 1030 via the input 140 and stores the user preference information 1030 in the computer memory 110”); and generating the makeup application plan based on the user input and based on the 3D surface of the face (Staton, see at least Figs. 6, 7, step S140, par. [0037], generating custom session file 1300 based on data from the facial-structure information 1010, the facial-tone information 1020, the user preference file 1030, and the baseline makeup application information 1220).

Regarding claim 38, Staton teaches all the limitations of claim 27 as discussed above. Staton further discloses wherein the automatic makeup applicator comprises an airbrush that is movable at 5 degrees of freedom, the airbrush is capable of translation movement in 3 axes and rotational movement in 2 axes (Staton, see at least Figs. 2, 4, par. [0033], “As shown in FIGS.
2 and 4, the carriage actuators 160 may include a drive mechanism 161 and rail 162 system which allows lateral travel, a drive mechanism 163 and rail 164 system which allows transverse (in the depicted orientation, vertical) travel, and a drive mechanism 165 and rail 166 system which allows further transverse (in the depicted orientation, front/back) travel … The carriage actuators 160 may further include a rotating member 167, which may rotate … in multiple dimensions (such as through a ball-and-socket joint)”; par. [0034], “The cosmetic unit 170 is coupled to and movable with the carriage 150, as shown in FIG. 2. As shown in FIG. 5, the cosmetic unit 170 may include a cosmetic airbrush 171”).

Regarding claim 40, Staton discloses a machine (Staton, see at least Fig. 2) comprising: a robotic arm movable in 5 degrees of freedom, the 5 degrees of freedom comprise translation movement in 3 axes and rotational movement in 2 axes (Staton, see at least Figs. 2, 4, par. [0033], “As shown in FIGS. 2 and 4, the carriage actuators 160 may include a drive mechanism 161 and rail 162 system which allows lateral travel, a drive mechanism 163 and rail 164 system which allows transverse (in the depicted orientation, vertical) travel, and a drive mechanism 165 and rail 166 system which allows further transverse (in the depicted orientation, front/back) travel … The carriage actuators 160 may further include a rotating member 167, which may rotate … in multiple dimensions (such as through a ball-and-socket joint) … the carriage actuators 160 may include a robotic arm”); an airbrush mounted on said robotic arm (Staton, see at least Fig. 2, par. [0034], “The cosmetic unit 170 is coupled to and movable with the carriage 150, as shown in FIG. 2. As shown in FIG. 5, the cosmetic unit 170 may include a cosmetic airbrush 171”); a sensor for monitoring movement of a subject (Staton, see at least Fig. 2, par.
[0029], “The scanner 130 uses three-dimensional sensors 132 to capture information about the face 10, including facial-structure information 1010 (which may include, for example, information regarding face shape and contour of the eyes, nose, cheeks, jaw, chin, and skin) and facial-position information 1050 (information regarding where the face 10 is located, preferably in real time)”); and a control unit (Staton, see at least Fig. 3, par. [0028], processor 120) for controlling movement of said robotic arm in accordance with a makeup application plan, said control unit is further configured to control application of said airbrush in accordance with the makeup application plan, wherein said control unit is configured to modify the makeup application plan based on sensor readings from said sensor (Staton, see at least par. [0036], "And still other programming causes the processor 120 to create the custom session file 1300 based on data from the facial-structure information 1010, the facial-tone information 1020, the user preference file 1030, and the baseline makeup application information 1220. And using the custom session file 1300 and the then-current facial position information 1050, the processor 120 causes the carriage actuators 160 to selectively move the carriage in a defined manner and causes the valves 175, 195 to selectively allow the makeup 20 to flow through the supply passage 174 and the output nozzle 172 to apply the makeup 20 to the face 10”).

Regarding claim 42, Staton teaches all the limitations of claim 40. Staton further discloses a material mixer for providing a material to be applied by said airbrush (Staton, see at least Figs. 2, 11, par. [0041], mixing reservoir 281 for providing a material to be applied by said airbrush 270), wherein the makeup application plan defines different materials to be mixed for applying makeup on the subject (Staton, see at least par.
[0043], “First, the makeup applicator 200 may create custom makeup 20 by mixing together different makeup 20 in the mixing reservoir 281”; par. [0044], “At step S241, the processor 220 causes the valves 285a, 285b to pass makeup 20a, 20b from the cosmetic reservoirs 280a, 280b to the mixing reservoir 281 to create makeup 20e, based on the custom session file 2300 created at step S240 and data from the sensors 286a, 286b”).

Regarding claim 46, Staton discloses a computerized apparatus having a processor (Staton, see at least Fig. 3, processor 120), the processor being adapted to perform the steps of: obtaining a makeup application plan (Staton, at least Fig. 6, step S140, creating custom session file 1300), the makeup application plan comprises instructions for an automatic makeup applicator (Staton, at least Fig. 7, par. [0036], “And using the custom session file 1300 and the then-current facial position information 1050, the processor 120 causes the carriage actuators 160 to selectively move the carriage in a defined manner”), the instructions, when implemented by the automatic makeup applicator, are configured to apply makeup materials on a subject in order to achieve a desired look for the subject (Staton, see at least Fig. 6, step S240, Fig. 7, steps S141-S144, par. [0036], “The personal preferences 1030 may include, for example, desired makeup style information, desired color palette information, available makeup, et cetera”; par. [0037, 0039], “… create the custom session file 1300 based on data from the facial-structure information 1010, the facial-tone information 1020, the user preference file 1030, and the baseline makeup application information 1220.
And using the custom session file 1300 and the then-current facial position information 1050, the processor 120 causes the carriage actuators 160 to selectively move the carriage in a defined manner and causes the valves 175, 195 to selectively allow the makeup 20 to flow through the supply passage 174 and the output nozzle 172 to apply the makeup 20 to the face 10”); implementing by the automatic makeup applicator a portion of the makeup application plan (Staton, see at least Figs. 6, 8, par. [0038-0039], steps S150 to S180 are repeated until the custom session file 1300 is fully executed to achieve the specific layout of the makeup mapped onto the face); obtaining sensor readings from a sensor during said implementing, wherein the sensor is configured to monitor movements of the subject (Staton, see at least Figs. 3, 6, step S150, par. [0029, 0036, 0038], scanning device 130 is configured to repeatedly obtain facial-structure information 1010 (which may include, for example, information regarding face shape and contour of the eyes, nose, cheeks, jaw, chin, and skin) and facial-position information 1050 (information regarding where the face 10 is located, preferably in real time to track movements of the person)); in response to identifying, based on the sensor readings, a movement of the subject, updating the makeup application plan, whereby obtaining a dynamically updated makeup application plan (Staton, see at least Fig. 6, steps S150-S170, par. [0036], “And using the custom session file 1300 and the then-current facial position information 1050, the processor 120 causes the carriage actuators 160 to selectively move the carriage in a defined manner and causes the valves 175, 195 to selectively allow the makeup 20 to flow through the supply passage 174 and the output nozzle 172 to apply the makeup 20 to the face 10”; par.
[0038], “After step S140, the processor 120 at step S150 obtains updated facial-position information 1050 via the scanner 130 and stores the updated facial-position information 1050 in the computer memory 110. And at steps S160 and S170, the processor 120 causes the carriage actuators 160 to move the carriage 150 in accordance with the custom session file 1300 and the facial-position information 1050, and causes the valves 175, 195 to selectively allow the makeup 20 to flow through the supply passage 174 and the output nozzle 172 to apply the makeup 20 to the face 10”); and implementing the dynamically updated makeup application plan or portion thereof in order to achieve the desired look for the subject while taking into account the movement of the subject and an implementation of the portion of the makeup application plan (Staton, see at least Fig. 6, step S150-S170, par. [0038], “After step S140, the processor 120 at step S150 obtains updated facial-position information 1050 via the scanner 130 and stores the updated facial-position information 1050 in the computer memory 110. And at steps S160 and S170, the processor 120 causes the carriage actuators 160 to move the carriage 150 in accordance with the custom session file 1300 and the facial-position information 1050, and causes the valves 175, 195 to selectively allow the makeup 20 to flow through the supply passage 174 and the output nozzle 172 to apply the makeup 20 to the face 10. After steps S160 and S170, the processor 120 determines at S180 if the custom session file 1300 has been fully executed for the particular makeup 20. If not, the process returns to step S150”). Claim Rejections - 35 USC § 103 The following is a quotation of 35 U.S.C. 
103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made. Claims 28 and 31-33 are rejected under 35 U.S.C. 103 as being unpatentable over Staton et al. (US 20190053608 A1, hereinafter “Staton”) as applied to claim 27 above, and further in view of Whitelaw (US 20200285835 A1). Regarding claim 28, Staton teaches all the limitations of claim 27 as discussed above. Staton fails to explicitly teach wherein the instructions of the makeup plan comprise an instruction to apply the makeup materials from a predefined location in space that is distant from a surface of the face by a defined distance; and wherein said updating the makeup plan comprises modifying the instruction based on the movement of the subject so as to maintain the defined distance. Whitelaw teaches a method for automated makeup application allowing a user to select and apply desired makeup styles to the user's face, wherein the instructions of the makeup plan comprise an instruction to apply the makeup materials from a predefined location in space that is distant from a surface of the face by a defined distance (Whitelaw, see at least Figs. 7A-B, par.
[0030], “… In more detail, microcontroller 34 signals camera 56 to begin live tracking of the user's face and signals the robotic arm 40 to move to its initial position in front of the user's face …”); and wherein said updating the makeup plan comprises modifying the instruction based on the movement of the subject so as to maintain the defined distance (Whitelaw, see at least Figs. 7A-B, par. [0030], “In step 736, the camera 56 begins live tracking the user's face … The system continues to track the location of robotic arm 40 in relation to the user's face, keeping the correct distance away from the face as robotic arm 40 moves …”). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Staton to incorporate the teachings of Whitelaw and provide a method for automated makeup application allowing a user to select and apply desired makeup styles to the user's face, wherein the instructions of the makeup plan comprise an instruction to apply the makeup materials from a predefined location in space that is distant from a surface of the face by a defined distance and wherein said updating the makeup plan comprises modifying the instruction based on the movement of the subject so as to maintain the defined distance. This modification would keep an appropriate distance from the user's face as the automatic makeup applicator moves, mimicking a pre-programmed human hand fashion (Whitelaw, par. [0030]). Regarding claim 31, Staton teaches all the limitations of claims 27 and 30. Staton further teaches wherein said generating comprises: determining, for a target area in the 3D surface of the face, a material to be applied (Staton, see at least Figs. 7, 8, par.
[0039], “At step S142, the processor 120 accesses the facial-structure information 1010 and the baseline makeup application information 1220, and look-up tables or logic causes the processor 120 to determine that the contour (makeup 20) should be applied to certain facial areas based on the oval face shape and the contour of the eyes, nose, cheeks, jaw, chin, and skin. At step S143, the processor maps the facial areas determined in step S142 onto the specific facial structure of the face 10; an example mapping is illustrated in FIG. 8, showing contour makeup 20a and highlighting makeup 20c mapped to the face 10”), an application property of an application of the material (Staton, see at least Fig. 7, par. [0039], “And at step S144, the processor creates an instruction file for controlling the carriage actuators 160 and the valves 175, 195 to achieve the specific layout of the makeup 20 mapped onto the face 10 in step S143”), and an application orientation from which the material is to be applied on the target area, in order to achieve the desired look in the target area (Staton, see at least Fig. 7, par. [0033, 0038-0039], “The instruction file created in step S144 may be keyed to a particular facial position. Thus, as the facial-position information 1050 indicates that the facial position has changed, the processor 120 may adjust the actual operation of the carriage actuators 160 and the valves 175, 195 accordingly in steps S160, S170”); and generating one or more instructions that are configured to cause the automatic makeup applicator to apply the material from the application orientation on the target area using the application property (Staton, see at least Figs. 6, 7, par. [0033, 0038-0039]; “And at step S144, the processor creates an instruction file for controlling the carriage actuators 160 and the valves 175, 195 to achieve the specific layout of the makeup 20 mapped onto the face 10 in step S143.
The instruction file created in step S144 may be keyed to a particular facial position. Thus, as the facial-position information 1050 indicates that the facial position has changed, the processor 120 may adjust the actual operation of the carriage actuators 160 and the valves 175, 195 accordingly in steps S160, S170”), wherein the application property comprises application pressure to be used when applying the material on the target area (Staton, see at least Figs. 5, 7, par. [0034-0036], “And using the custom session file 1300 and the then-current facial position information 1050, the processor 120 … causes the valves 175, 195 to selectively allow the makeup 20 to flow through the supply passage 174 and the output nozzle 172 to apply the makeup 20 to the face 10”). Staton fails to explicitly teach an application distance. Whitelaw teaches a method for automated makeup application allowing a user to select and apply desired makeup styles to the user's face, wherein said generating comprises: determining, for a target area in the 3D surface of the face, an application distance from which the material is to be applied on the target area in order to achieve the desired look in the target area (Whitelaw, see at least Figs. 7A-B, par. [0030], “… In more detail, microcontroller 34 signals camera 56 to begin live tracking of the user's face and signals the robotic arm 40 to move to its initial position in front of the user's face …”); and generating one or more instructions that are configured to cause the automatic makeup applicator to apply the material from the application distance on the target area using the application property (Whitelaw, see at least Figs. 7A-B, par. [0030], “In step 736, the camera 56 begins live tracking the user's face … In step 738, the airbrush begins spraying the user's face. Specifically, microcontroller 34 signals plug triggers 70 to release the first airbrush nozzle 54 and begin spraying the user's face with formula.
The system continues to track the location of robotic arm 40 in relation to the user's face, keeping the correct distance away from the face as robotic arm 40 moves, mimicking a pre-programmed human hand fashion …”). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Staton to incorporate the teachings of Whitelaw and provide a method for automated makeup application allowing a user to select and apply desired makeup styles to the user's face, wherein said generating comprises: determining, for a target area in the 3D surface of the face, an application distance from which the material is to be applied on the target area in order to achieve the desired look in the target area; and generating one or more instructions that are configured to cause the automatic makeup applicator to apply the material from the application distance on the target area using the application property. This modification would keep an appropriate distance from the user's face as the automatic makeup applicator moves, mimicking a pre-programmed human hand fashion (Whitelaw, par. [0030]). Regarding claim 32, the combination of Staton and Whitelaw teaches all the limitations of claim 31. The combination of Staton and Whitelaw further teaches wherein said generating is performed with respect to a first target area and a second target area, wherein the application distance at the first target area is different than the application distance at the second target area (Whitelaw, see at least Figs. 7A-B, par. [0030], “In step 736, the camera 56 begins live tracking the user's face. In more detail, microcontroller 34 signals camera 56 to begin live tracking of the user's face and signals the robotic arm 40 to move to its initial position in front of the user's face. In step 738, the airbrush begins spraying the user's face.
Specifically, microcontroller 34 signals plug triggers 70 to release the first airbrush nozzle 54 and begin spraying the user's face with formula. The system continues to track the location of robotic arm 40 in relation to the user's face, keeping the correct distance away from the face as robotic arm 40 moves, mimicking a pre-programmed human hand fashion”). Regarding claim 33, the combination of Staton and Whitelaw teaches all the limitations of claim 31. The combination of Staton and Whitelaw further teaches wherein said generating is performed with respect to a first target area and a second target area (Staton, see at least Figs. 7, 8, par. [0039], “At step S142, the processor 120 accesses the facial-structure information 1010 and the baseline makeup application information 1220, and look-up tables or logic causes the processor 120 to determine that the contour (makeup 20) should be applied to certain facial areas based on the oval face shape and the contour of the eyes, nose, cheeks, jaw, chin, and skin. At step S143, the processor maps the facial areas determined in step S142 onto the specific facial structure of the face 10; an example mapping is illustrated in FIG. 8, showing contour makeup 20a and highlighting makeup 20c mapped to the face 10”), wherein the application property is a pressure to be used by automatic makeup applicator (Staton, see at least Fig. 3, par. [0035], “A pressurized-air source 190, such as an air tank housing compressed air or a compressor for compressing air, is in communication with the cosmetic airbrush 171 (e.g., through tubing 194)”), wherein the application distance at the first target area is equal to the application distance at the second target area (Whitelaw, see at least Figs. 7A-B, par. [0030], “In step 736, the camera 56 begins live tracking the user's face.
In more detail, microcontroller 34 signals camera 56 to begin live tracking of the user's face and signals the robotic arm 40 to move to its initial position in front of the user's face. In step 738, the airbrush begins spraying the user's face. Specifically, microcontroller 34 signals plug triggers 70 to release the first airbrush nozzle 54 and begin spraying the user's face with formula. The system continues to track the location of robotic arm 40 in relation to the user's face, keeping the correct distance away from the face as robotic arm 40 moves, mimicking a pre-programmed human hand fashion”), wherein the pressure to be used by the automatic makeup applicator at the first target area is different than the pressure at the second target area (Staton, see at least Figs. 7, 8, par. [0039], “At step S143, the processor maps the facial areas determined in step S142 onto the specific facial structure of the face 10; an example mapping is illustrated in FIG. 8, showing contour makeup 20a and highlighting makeup 20c mapped to the face 10 … And at step S144, the processor creates an instruction file for controlling … the valves 175, 195 to achieve the specific layout of the makeup 20 mapped onto the face 10 in step S143”). Claims 41 and 43 are rejected under 35 U.S.C. 103 as being unpatentable over Staton et al. (US 20190053608 A1, hereinafter “Staton”) as applied to claim 40 above, and further in view of Pettersson et al. (US 20120219699 A1, hereinafter “Pettersson”). Regarding claim 41, Staton teaches all the limitations of claim 40. Staton further discloses that the apparatus further comprises an air compressor, said air compressor is configured to cause application of material via said airbrush (Staton, see at least Fig. 5, par.
[0035], “A pressurized-air source 190, such as an air tank housing compressed air or a compressor for compressing air, is in communication with the cosmetic airbrush 171 (e.g., through tubing 194)”), wherein said control unit is configured to instruct said air compressor in accordance with the makeup application plan (Staton, see at least Figs. 5, 10, par. [0035], “And another valve 195 in data communication with the processor 120 controls passage of the pressurized air”). Staton fails to explicitly teach providing different air pressure levels. Pettersson teaches a spattering device 9, e.g. an airbrush, comprising at least one nozzle 1 for expelling material to a surface of a subject (Pettersson, see at least Figs. 1, 2, par. [0051, 0108, 0159-0164]); wherein a control unit 8 (Pettersson, see at least Figs. 1, 2, par. [0132]) is configured to instruct to provide different air pressure levels in accordance with an application plan (Pettersson, see at least par. [0053, 0061, 0131]). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Staton to incorporate the teachings of Pettersson and provide a method, wherein said control unit is configured to instruct said air compressor to provide different air pressure levels in accordance with the makeup application plan. This modification would allow spraying material in accordance with the application plan (Pettersson, par. [0061]). Regarding claim 43, Staton teaches all the limitations of claim 40 as discussed above. Staton further teaches wherein said airbrush is attachable to multiple alternative nozzles (Staton, see at least Fig. 10, par. [0043], the makeup applicator 200 has multiple output nozzles 272), wherein said machine is configured to automatically attach and detach nozzles from said airbrush (Staton, see at least Fig. 10, par.
[0044], the processor 220 causes the valves to selectively allow makeup flow through the supply passage and the output nozzle). Staton fails to explicitly teach nozzles having different sizes and shapes, thereby enabling different app…
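For readers less familiar with the prior art's mechanics: the scan-and-apply cycle the examiner cites repeatedly from Staton (Fig. 6, steps S150-S180) amounts to a simple closed loop: re-read the face position, move the carriage, open the valves, and repeat until the session file is fully executed. Below is a minimal sketch of that loop, assuming a session file is just an ordered list of (region, makeup) steps; every name here is illustrative and appears in neither patent.

```python
# Illustrative sketch only: this code appears in neither patent. It models
# the scan-and-apply cycle the Office Action cites from Staton (Fig. 6,
# steps S150-S180).

def run_session(session_steps, read_face_position, move_carriage, spray):
    """Execute each step of a session file, re-reading the subject's
    face position before every application so the plan tracks movement."""
    for region, makeup in session_steps:      # repeat until fully executed (S180)
        pose = read_face_position()           # updated facial-position info (S150)
        move_carriage(region, pose)           # actuate the carriage (S160)
        spray(makeup)                         # open the valves to apply (S170)

# Dry run with logging stand-ins for the scanner and hardware:
log = []
run_session(
    [("cheek", "contour"), ("brow", "highlight")],
    read_face_position=lambda: (0.0, 0.0, 0.3),   # fixed pose for the demo
    move_carriage=lambda region, pose: log.append(("move", region)),
    spray=lambda makeup: log.append(("spray", makeup)),
)
```

On the examiner's reading, the "dynamically updated makeup application plan" of claims 40 and 46 corresponds to re-keying the fixed instruction file to the then-current facial position on each pass of such a loop.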

Prosecution Timeline

Sep 27, 2024
Application Filed
Dec 09, 2025
Non-Final Rejection — §102, §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12576884
RIGHT-OF-WAY-BASED SEMANTIC COVERAGE AND AUTOMATIC LABELING FOR TRAJECTORY GENERATION IN AUTONOMOUS SYSTEMS
2y 5m to grant · Granted Mar 17, 2026
Patent 12559074
AIRCRAFT SYSTEM
2y 5m to grant · Granted Feb 24, 2026
Patent 12493302
LONGITUDINAL TRIM CONTROL MOVEMENT DURING TAKEOFF ROTATION
2y 5m to grant · Granted Dec 09, 2025
Patent 12461529
ROBOT PATH PLANNING APPARATUS AND METHOD THEREOF
2y 5m to grant · Granted Nov 04, 2025
Patent 12429878
Systems and Methods for Dynamic Object Removal from Three-Dimensional Data
2y 5m to grant · Granted Sep 30, 2025
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 44%
Grant Probability With Interview: 75% (+30.7%)
Median Time to Grant: 3y 3m
PTA Risk: Low
Based on 36 resolved cases by this examiner. Grant probability derived from career allow rate.
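The "with interview" figure is consistent with simply adding the examiner's observed interview lift to the base grant probability (44% + 30.7% ≈ 75%). The sketch below shows that arithmetic under an assumed plain additive model; the tool's actual method is not disclosed, and `interview_adjusted` is a hypothetical name.

```python
# Hypothetical reconstruction of the projection arithmetic shown above; the
# tool's actual model is not disclosed. Assumes the interview-adjusted grant
# probability is simply the base rate plus the examiner's observed lift.

def interview_adjusted(base_rate, interview_lift, cap=1.0):
    """Additively apply the interview lift, capping at 100%."""
    return min(base_rate + interview_lift, cap)

adjusted = interview_adjusted(0.44, 0.307)   # 44% base rate, +30.7% lift
print(f"{adjusted:.1%}")                     # 74.7%, displayed as 75%
```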
