DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Continued Examination Under 37 CFR 1.114
A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 12/04/2025 has been entered.
Response to Arguments
Applicant’s arguments, see Remarks page 8, filed 12/04/2025, with respect to the rejections of claims 16-23, 25-32, and 34-35 under 35 U.S.C. 112(a) have been fully considered and are persuasive. The rejections of claims 16-23, 25-32, and 34-35 have been withdrawn.
Applicant’s arguments, see Remarks page 8, filed 12/04/2025, with respect to the rejections of claims 25 and 34 under 35 U.S.C. 112(b) have been fully considered and are persuasive. The rejections of claims 25 and 34 have been withdrawn.
Applicant’s arguments, see Remarks page 9, filed 12/04/2025, with respect to the rejections of claims 16-23, 25-32, and 34-35 under 35 U.S.C. 101 have been fully considered and are persuasive. The rejections of claims 16-23, 25-32, and 34-35 have been withdrawn.
Applicant’s arguments, see Remarks pages 9-10, filed 12/04/2025, with respect to the rejection of amended claims 16, 27, and 35 under 35 U.S.C. 102(a)(1) have been fully considered and are moot in view of the new grounds of rejection (detailed in the rejections below) necessitated by Applicant’s amendment to the claims.
Claim Interpretation
Note that according to the Federal Circuit’s decision in SuperGuide Corp. v. DirecTV Enterprises, Inc., 358 F.3d 870 (Fed. Cir. 2004), “at least one of … and …” requires at least one instance of each and every item listed.
Claim 19 recites “wherein the fixed face point comprises at least one of a bridge of a nose and an outer edge of a nostril”.
Claim 20 recites “wherein the variable face point comprises at least one of an outer edge of an eyelid and a corner of a mouth”.
Claim 31 recites “wherein the fixed face point comprises at least one of a bridge of a nose and an outer edge of a nostril, and wherein the variable face point comprises at least one of an outer edge of an eyelid and a corner of a mouth”.
If Applicant intends for only one of the listed items to be required, Applicant can amend the claim language to instead recite “at least one of … or …”. In SuperGuide, the Federal Circuit held that the plain meaning of “at least one of A, B, and C” is conjunctive: at least one of A, at least one of B, and at least one of C. The Court held that if the applicant had intended “at least one of A, B, and C” to mean A, B, or C, it should have used “or.” For the purposes of examination, the limitations of claims 19, 20, and 31 are interpreted as disjunctive, such that each limitation requires only one of the listed items, consistent with the disclosure at Page 9, lines 1-8 of the Specification.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 16-17, 21-22, 26-29, 32, and 34-37 are rejected under 35 U.S.C. 103 as being unpatentable over Hu et al. (US 20190214127 A1), hereinafter referenced as Hu, in view of Viklund et al. (WO 2019070763 A1), hereinafter referenced as Viklund.
Regarding claim 16, Hu discloses: A non-transitory computer-readable medium storing machine-readable instructions that when executed by a processor, causes the processor to (Hu: 0089-0090):
control a camera to capture a reference image of a user's face and a subsequent image of the user's face (Hu: 0077: “In 504, client device 132 captures one or more images of a face or other body part of the user over a period of time.”; 0078: “In 512, image comparison module 320 determines a time series of images previously captured over a predetermined period of time to compare to the most recently received image.”; Wherein the reference image was previously captured and the subsequent image was captured in step 504);
determine reference facial properties of the user from the reference image, and subsequent facial properties of the user from the subsequent image (Hu: 0078: “…recognition module 318 within application 107 of server 104 extracts one or more facial features (or other body features) from the image(s) after receiving the image(s)… image comparison module 320 determines a time series of images previously captured over a predetermined period of time to compare to the most recently received image.”; Wherein the subsequent image’s facial features are extracted and compared to the features of the reference image);
determine any differences between the reference facial properties and the subsequent facial properties (Hu: 0078: “In 512, image comparison module 320 determines a time series of images previously captured over a predetermined period of time to compare to the most recently received image. In 514, image comparison module 320 compares the image series using a knowledge base to determine differences between one or more facial features (or other body features) that are indicative of a sub-optimal health condition of the user.”);
generate a record of the reference and the subsequent facial properties (Hu: 0024: “the one or more databases store the time series of images as well as health information regarding the user's health information. In particular embodiments, the user's health information includes quantitative measurements for determining differences between time-series image data (e.g., “selfies” taken over a period of time).”; Wherein the health information includes the facial properties);
generate a warning when the differences between the reference facial properties and the subsequent facial properties are determined (Hu: Figure 5; 0078: “In 516, image sub-health detection module 322 determines whether a difference in the facial or body features in the series of images exceed a predetermined threshold value.”; 0079: “In 520, alert/notification module 324 creates an alert for close monitoring of the health condition and sends the alert to the user.”; Wherein an alert is created when the differences exceed a threshold value), wherein the warning includes a user health survey (Hu: 0030: “If over time any problem area reaches a threshold level, the system (e.g., the server) generates an alert. In particular embodiments, the system is configured to instruct users to take an image of a particular part of body to have a closer and more precise monitoring. For example, if dark circles are developing under the eyes of the user, the system instructs the user to take detailed photos of the eyes on a regular basis with greater resolution to improve the accuracy of the image comparison and the recommendation.”;
0036-0039: “the system acquires information from wearable devices to analyze the user's activities to facilitate determining a reason for a detected health concern…the system connects with a user's calendar to detect whether the user is under pressure and working too hard which might lead to sub-healthy status of the user. In another particular embodiment, the system acquires information from a personal health tracking system (e.g., a smart watch or fitness tracker) with the user's permission to reevaluate activity goals if some anomalies are detected.”; Wherein the requests for further imaging of a particular body part and for personal health tracking system information constitute a health survey.);
control a display to display the warning including the user health survey (Hu: 0028: “the system generates an alert to the user such as creating a notification on a mobile phone, sending an email, sending a text message or some other preferred methods of contact that includes an indication to a user that an image should be taken.”; Wherein the alert via notification, email, or text message constitutes an alert being displayed.);
control a communication unit to transmit the user health survey to an external device when the user health survey is completed (Hu: 0029-0030: “ a smart phone is used as the client device for taking daily photos of a user…the system retains taken images of a user, analyzes the images by image recognition, and tags changes in areas of features identified from one image in a sequence image in the series as potential problem area. In the embodiment, the tagged area is checked in each new image to determine if the problem increases or worsens. If over time any problem area reaches a threshold level, the system (e.g., the server) generates an alert. In particular embodiments, the system is configured to instruct users to take an image of a particular part of body to have a closer and more precise monitoring.”; 0036-0039: “the system acquires information from wearable devices to analyze the user's activities to facilitate determining a reason for a detected health concern…the system connects with a user's calendar to detect whether the user is under pressure and working too hard which might lead to sub-healthy status of the user. In another particular embodiment, the system acquires information from a personal health tracking system (e.g., a smart watch or fitness tracker) with the user's permission to reevaluate activity goals if some anomalies are detected.”);
and store the record in a memory (Hu: 0024: “the one or more databases store the time series of images as well as health information regarding the user's health information.”; 0054: “Database(s) 109, such as a user profile database, an image database, and/or a feature/symptom correlation database may be stored in storage 108 as shown or supplied by another source (not shown).”),
wherein the user health survey comprises a request for further information relevant to a condition indicated by the determined differences between the reference facial properties and the subsequent facial properties (Hu: 0030: “If over time any problem area reaches a threshold level…the system is configured to instruct users to take an image of a particular part of body to have a closer and more precise monitoring. For example, if dark circles are developing under the eyes of the user, the system instructs the user to take detailed photos of the eyes on a regular basis with greater resolution to improve the accuracy of the image comparison and the recommendation.”;
0036-0039: “the system acquires information from wearable devices to analyze the user's activities to facilitate determining a reason for a detected health concern…the system connects with a user's calendar to detect whether the user is under pressure and working too hard which might lead to sub-healthy status of the user. In another particular embodiment, the system acquires information from a personal health tracking system (e.g., a smart watch or fitness tracker) with the user's permission to reevaluate activity goals if some anomalies are detected.”) and thereby provides an early detection system for changes in patient health and compliance with a treatment regimen (Hu: 0031-0033: “a system database or cloud storage system records possible problems which can be noticed on the skin, the face or the body of a user…and compares a knowledge base including an image database for certain health issues available and provided by a medical agency with the user's daily images to provide further information to the user about the user's health status…
In another example, some red spots detected on the skin might be an early symptom for a skin cancer if not treated early enough…In other particular example, the system tracks the development of an injury, wound that is healing, skin condition, or other health condition on various parts of the body to determine through the time progression of images if the status is improving, deteriorating, or remains the same…
the system receives health monitoring information from a user's health monitoring devices such as a smart watch or fitness tracker device to detect changes of life style of the user and learn a correlation between detected facial and body changes and changes of living style or habit.”; Wherein all the collected information is processed by the system in order to detect the user’s current health status allowing for the early detection of health issues and the tracking of whether health conditions, such as wounds, are improving.).
Hu does not disclose expressly: wherein the user health survey comprises one or more questions requesting further information relevant to a condition indicated by the determined differences between the reference facial properties and the subsequent facial properties.
Viklund discloses: a system for analyzing the activities of a user for the purposes of diagnosing the user’s health issues and reporting activity deviations determined to be concerning, wherein, for the process of reporting health deviations, a health survey, used to determine whether the activity deviation is urgent, is generated that comprises one or more questions requesting further information relevant to a condition indicated by the determined differences between a reference activity and a subsequently received activity (Viklund: Figure 11; 0250: “FIG. 11 illustrates methods of generating an alert based on a dynamic threshold…The dynamic threshold is used to determine if a deviation from expected activity is sufficient to generate an alert.”;
0252-0254: “In a Receive Activity Step 1110 an activity level of a user is received…In a Receive Expected Activity Step 1115 an expected activity is received...In a Determine Deviation Step 1120 it is determined that activity level of the user received in Receive Activity Step 1110 represents a deviation from the expected activity of the user as received in Receive Expected Activity Step 1115.”;
0256: “In a Determine Threshold Step 1125 a threshold for the deviation is determined…Determination of a threshold is optionally responsive to answers to questions selected using Question Logic 195. For example, an answer to a question may explain a deviation or indicate that a deviation is likely to indicate a health problem. In an illustrative example, if a monitored user is detected getting up several times at night, then a question about how well the user slept may be selected. An answer to that question of "there was a party next door" may be indicative that getting up is not the result of an undesirable health state, while an answer "I keep feeling like I have to pee, but cannot" may be indicative of the likelihood of a health problem that warrants an alert be sent. Thresholds determined in Determine Threshold Step 1125 may, therefore, be based on responses to selected questions.”), wherein the user activity also includes user properties extracted from captured images (Viklund: 0194: “All or part of Machine Learning System 735 is optionally disposed on members of Monitored Devices 110. For example, acceleration data generated by Sensor 715A on Monitored Device 110A may be processed by Machine Learning System 735 to produce a preliminary result indicative of an activity…Machine Learning System 735, Machine Learning System 745, and/or Rule Logic are optionally configured to analyze images or a series of images. For example, Machine Learning System 735 may be configured to analyze an image of a wound for signs of infection or to analyze one or more images for indications of skin cancer, and/or other uses of image analysis discussed herein.”).
Before the effective filing date of the claimed invention, it would have been obvious to a person of ordinary skill in the art to implement the known technique of providing questions to a user based on detected deviations, as disclosed by Viklund, by sending questions to a user based on the detected anomalies disclosed by Hu. The suggestion/motivation for doing so would have been “an answer to a question may explain a deviation or indicate that a deviation is likely to indicate a health problem… if a monitored user is detected getting up several times at night, then a question about how well the user slept may be selected. An answer to that question of "there was a party next door" may be indicative that getting up is not the result of an undesirable health state, while an answer "I keep feeling like I have to pee, but cannot" may be indicative of the likelihood of a health problem that warrants an alert be sent” (Viklund: 0255). Further, one skilled in the art could have combined the elements as described above by known methods with no change in their respective functions, and the combination would have yielded nothing more than predictable results. Therefore, it would have been obvious to combine Hu with Viklund to obtain the invention as specified in claim 16.
Regarding claim 17, Hu in view of Viklund discloses: The non-transitory computer-readable medium according to claim 16, wherein each of the reference facial properties and the subsequent facial properties relate to at least one of eyes, skin, hair, or facial impression of the user as depicted in a respective one of the reference image and the subsequent image (Hu: 0036: “…tracked facial characteristics include, but are not limited to, dark circles around the eyes, eyes that bulge out, dull complexion, unhealthy color of teeth, dark spots on the skin, skin inflammations, change of shape of face, changes in eyeball shape/color, and acne.”).
Regarding claim 21, Hu in view of Viklund discloses: The non-transitory computer-readable medium according to claim 17, wherein the instructions cause the processor to determine at least one of a respective color or clarity of an eye from each of the reference image and the subsequent image (Hu: 0036: “tracked facial characteristics include, but are not limited to, dark circles around the eyes, eyes that bulge out, dull complexion, unhealthy color of teeth, dark spots on the skin, skin inflammations, change of shape of face, changes in eyeball shape/color, and acne.”).
Regarding claim 22, Hu in view of Viklund discloses: The non-transitory computer-readable medium according to claim 17, wherein the instructions cause the processor to determine at least one of a respective skin color, skin tone, or skin moisture from each of the reference image and the subsequent image (Hu: 0036: “tracked facial characteristics include, but are not limited to, dark circles around the eyes, eyes that bulge out, dull complexion, unhealthy color of teeth, dark spots on the skin, skin inflammations, change of shape of face, changes in eyeball shape/color, and acne.”).
Regarding claim 26, Hu in view of Viklund discloses: The non-transitory computer-readable medium according to claim 16, wherein prior to capturing the reference image and/or the subsequent image, the instructions cause the processor to: generate an input window requesting the reference image or the subsequent image to be taken, and control a display to display the input window (The recited “and/or” is interpreted as “or”) (Hu: 0028: “…capturing an image on an individual day could be skipped and if too many days are skipped in a row, the system generates an alert to the user such as creating a notification on a mobile phone, sending an email, sending a text message or some other preferred methods of contact that includes an indication to a user that an image should be taken.”; Wherein the alert/notification to the user requesting for them to take an image constitutes the input window).
As per claim 27, the arguments made in rejecting claim 16 are analogous.
Regarding claim 28, Hu in view of Viklund discloses: The apparatus according to claim 27, wherein the apparatus is a mobile device (Hu: 0024: “In particular embodiments, the client device includes a mobile phone or a camera or other imaging device installed at a user location.”).
As per claim 29, the arguments made in rejecting claim 17 are analogous.
Regarding claim 32, Hu in view of Viklund discloses: The apparatus according to claim 27, wherein the processor is configured to determine at least one of a respective skin color, skin tone, skin moisture, hair distribution, hair volume, eye color, and/or clarity of an eye from each of the reference image and the subsequent image (The recited “and/or” is interpreted as “or”) (Hu: 0036: “tracked facial characteristics include, but are not limited to, dark circles around the eyes, eyes that bulge out, dull complexion, unhealthy color of teeth, dark spots on the skin, skin inflammations, change of shape of face, changes in eyeball shape/color, and acne.”).
Regarding claim 34, Hu in view of Viklund discloses: The apparatus according to claim 27, wherein the user interface is further configured to receive user inputs for the user health survey (Hu: 0033: “a user's health profile stored in the database is used to provide information regarding the user's family medical history and personal medical history to determine if the person has predisposition to certain conditions…the system receives health monitoring information from a user's health monitoring devices such as a smart watch or fitness tracker device to detect changes of life style of the user and learn a correlation between detected facial and body changes and changes of living style or habit.”)
(Viklund: 0101: “Caregivers can report/provide answers to the questions via Caregiver Interface 158 via, typing, speaking into a microphone, or checking boxes. The answers can be provided by the user in response to the caregiver asking the user a question, or may be based on observations of the user by the caregiver. ”;
0255: “Determination of a threshold is optionally responsive to answers to questions selected using Question Logic 195. For example, an answer to a question may explain a deviation or indicate that a deviation is likely to indicate a health problem. In an illustrative example, if a monitored user is detected getting up several times at night, then a question about how well the user slept may be selected. An answer to that question of "there was a party next door" may be indicative that getting up is not the result of an undesirable health state, while an answer "I keep feeling like I have to pee, but cannot" may be indicative of the likelihood of a health problem that warrants an alert be sent.”).
As per claim 35, the arguments made in rejecting claim 16 are analogous.
Regarding claim 36, Hu in view of Viklund discloses: The non-transitory computer-readable medium according to claim 16, wherein to determine any differences between the reference facial properties and the subsequent facial properties, the instructions cause the processor to analyze characteristics of the user's face against pre-stored information detailing pre-defined facial characteristics (Hu: 0031: “In particular embodiments, a system database or cloud storage system records possible problems which can be noticed on the skin, the face or the body of a user. In a particular example, the system keeps records of various images of melanoma progression over time, such as dark or red spots on the skin, and compares a knowledge base including an image database for certain health issues available and provided by a medical agency with the user's daily images to provide further information to the user about the user's health status.”; 0036: “a system tracks a history of user images taken over a period of time, conducts facial recognition to compare historical images to characteristics of a healthy face or from a medical image database for a specific health concern, detects a health concern based upon the comparison, and alerts the user based upon the detected health concern.”).
Regarding claim 37, Hu in view of Viklund discloses: The non-transitory computer-readable medium according to claim 36, wherein the pre-defined facial characteristics represent facial indicators that are indicative of diseases, disorders, or drug side-effects (Hu: 0031: “a system database or cloud storage system records possible problems which can be noticed on the skin, the face or the body of a user. In a particular example, the system keeps records of various images of melanoma progression over time, such as dark or red spots on the skin, and compares a knowledge base including an image database for certain health issues available and provided by a medical agency with the user's daily images to provide further information to the user about the user's health status.”).
Claims 18-20 and 30-31 are rejected under 35 U.S.C. 103 as being unpatentable over Hu in view of Viklund, and further in view of Mursel et al. (TR 201723564 A2), hereinafter referenced as Mursel.
Regarding claim 18, Hu in view of Viklund discloses: The non-transitory computer-readable medium according to claim 17.
Hu in view of Viklund does not disclose expressly: wherein the instructions cause the processor to determine a respective facial impression from each of the reference image and the subsequent image based on a respective distance measured between a fixed face point and a variable face point depicted in the corresponding image.
Mursel discloses: determination of a facial impression based on a respective distance measured between a fixed face point and a variable face point depicted in an image (Mursel: Pages 3-4, Paragraph 5: “The information acquired is processed into the database of the processor (3) automated by the deep learning system. such as mouth distance, depth of eye, nose, distance and position relation between cue points”; Wherein the distance between cue points constitutes the distance measured between a fixed face point and a variable face point.).
Before the effective filing date of the claimed invention, it would have been obvious to a person of ordinary skill in the art to incorporate the known technique taught by Mursel of measuring the distance between cue points into Hu in view of Viklund by measuring the distances between such points on the reference and subsequent images. The suggestion/motivation for doing so would have been “…the automated processor (3) with the deep learning system provides a list of possible physical and mental illnesses where these measures and characteristics match.” (Mursel: Page 4, Paragraph 1). Further, one skilled in the art could have combined the elements as described above by known methods with no change in their respective functions, and the combination would have yielded nothing more than predictable results. Therefore, it would have been obvious to combine Hu in view of Viklund with Mursel to obtain the invention as specified in claim 18.
Regarding claim 19, Hu in view of Viklund and Mursel discloses: The non-transitory computer-readable medium according to claim 18, wherein the fixed face point comprises at least one of a bridge of a nose and an outer edge of a nostril (Claim limitation is interpreted according to the SuperGuide Claim Interpretation set forth above.) (Mursel: Pages 3-4, Paragraph 5: “The information acquired is processed into the database of the processor (3) automated by the deep learning system. such as mouth distance, depth of eye, nose, distance and position relation between cue points”; Wherein the nose constitutes the bridge and the outer edge of a nostril).
Regarding claim 20, Hu in view of Viklund and Mursel discloses: The non-transitory computer-readable medium according to claim 18, wherein the variable face point comprises at least one of an outer edge of an eyelid and a corner of a mouth (Claim limitation is interpreted according to the SuperGuide Claim Interpretation set forth above.) (Mursel: Pages 3-4, Paragraph 5: “The information acquired is processed into the database of the processor (3) automated by the deep learning system. such as mouth distance, depth of eye, nose, distance and position relation between cue points”; Wherein the eye comprises at least one of an outer edge of an eyelid).
As per claim 30, the arguments made in rejecting claim 18 are analogous.
Regarding claim 31, Hu in view of Viklund and Mursel discloses: The apparatus according to claim 30, wherein the fixed face point comprises at least one of a bridge of a nose and an outer edge of a nostril (Claim limitation is interpreted according to the SuperGuide Claim Interpretation set forth above.) (Mursel: Pages 3-4, Paragraph 5: “The information acquired is processed into the database of the processor (3) automated by the deep learning system. such as mouth distance, depth of eye, nose, distance and position relation between cue points”; Wherein the nose constitutes the bridge and the outer edge of a nostril), and wherein the variable face point comprises at least one of an outer edge of an eyelid and a corner of a mouth (Mursel: Pages 3-4, Paragraph 5: “The information acquired is processed into the database of the processor (3) automated by the deep learning system. such as mouth distance, depth of eye, nose, distance and position relation between cue points”; Wherein the eye comprises at least one of an outer edge of an eyelid).
Claim 23 is rejected under 35 U.S.C. 103 as being unpatentable over Hu in view of Viklund, and further in view of Bhalotia (US 20200312455 A1).
Regarding claim 23, Hu in view of Viklund discloses: The non-transitory computer-readable medium according to claim 16.
Hu in view of Viklund does not disclose expressly: wherein the instructions cause the processor to determine at least one of a respective hair distribution, or hair volume from each of the reference and the subsequent images.
Bhalotia discloses: determining at least one of a respective hair distribution, or hair volume from an image (Bhalotia: 0043: “The processing unit 102 may be configured to process at least an image to determine different patterns/colors/shades/texture of eyes, hair, nails, or the skin of face or other body parts.”).
Before the effective filing date of the claimed invention, it would have been obvious to a person of ordinary skill in the art to incorporate the algorithms for analyzing hair visual markers disclosed by Bhalotia into the feature extraction process disclosed by Hu in view of Viklund. The suggestion/motivation for doing so would have been “At step 306, the captured image of visual markers is analyzed by the processing unit 102 to determine the health condition of the individual. The determination of the health condition of an individual comprises determination of one or more of a constitution type, a disease type…symptoms of a disease, susceptibility to a disease, tendency to develop illness...” (Bhalotia: 0058; Wherein more visual features allow for more accurate determinations.). Further, one skilled in the art could have combined the elements as described above by known methods with no change in their respective functions, and the combination would have yielded nothing more than predictable results. Therefore, it would have been obvious to combine Hu in view of Viklund with Bhalotia to obtain the invention as specified in claim 23.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to ANTHONY J RODRIGUEZ whose telephone number is (703)756-5821. The examiner can normally be reached Monday-Friday 10am-7pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Sumati Lefkowitz can be reached at (571) 272-3638. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/ANTHONY J RODRIGUEZ/
Examiner, Art Unit 2672
/SUMATI LEFKOWITZ/Supervisory Patent Examiner, Art Unit 2672