DETAILED ACTION
This Office Action is sent in response to Applicant's Response received 01/14/2026 for 18510478. Claims 1-30 are pending.
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
Continued Examination Under 37 CFR 1.114
A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 01/14/2026 has been entered.
Response to Arguments
Applicant's arguments with respect to the 101 rejection of claims 1-30 continue to essentially consist of asserting that the claims are directed to statutory subject matter, which cannot take the place of evidence in the record. See In re De Blauwe, 736 F.2d 699, 705, 222 USPQ 191, 196 (Fed. Cir. 1984); In re Schulze, 346 F.2d 600, 602, 145 USPQ 716, 718 (CCPA 1965); In re Geisler, 116 F.3d 1465, 43 USPQ2d 1362 (Fed. Cir. 1997) [MPEP 2145(I)].
As noted below, the Office Action presents evidence that stands in direct contrast to Applicant's assertions that the claims are directed to statutory subject matter. Claims 1-30 remain rejected.
Applicant's arguments with respect to the 103 rejection of claim 1 have been fully considered but are not persuasive in view of the new and/or updated citations used in the current rejection of record under Dalal in response to the newly amended limitations.
Applicant's arguments essentially consist of reciting the claim language, copying portions of each reference, and asserting each reference does not disclose the recited claim language, which are not separate arguments for patentability of the claims and amount to mere allegation that the cited prior art references are deficient. A general allegation that the claims define a patentable invention without specifically pointing out how the language of the claims patentably distinguishes them from the references neither "distinctly and specifically points out the supposed errors in the examiner’s action" nor "present[s] arguments pointing out the specific distinctions believed to render the claims, including any newly presented claims, patentable over any applied references" as required in Applicant's reply [see 37 C.F.R. § 1.111(b)].
As noted below, the Office Action presents evidence that stands in direct contrast to Applicant's arguments that Dalal, alone or in combination, does not teach or suggest the newly amended limitation to "determine, using a mapping model based on the first hand gesture and the control gesture as inputs, a first control option of the plurality of control options of the first settings list based on a combination of the control gesture and the first hand gesture". Claim 1 remains rejected.
Claim 19 recites limitations similar to those recited in claim 1 and remains rejected on a similar basis, as stated above.
Dependent claims 2-18 and 20-30 remain rejected at least based on their dependence from independent claims 1 and 19.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1-30 are rejected under 35 U.S.C. 101 because the claimed invention is directed to non-statutory subject matter. The claims do not fall within at least one of the four categories of patent eligible subject matter because the claimed invention is directed to a judicial exception (i.e., a law of nature, a natural phenomenon, or an abstract idea) without significantly more.
Claim 1 is rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.
The claim recites detect, based on sensor data from one or more sensors, a first hand gesture of a plurality of first hand gestures of a user; determine a first settings list of a plurality of settings lists based on the first hand gesture, wherein the first settings list comprises a plurality of control options; detect a control gesture of a plurality of control gestures by the user, wherein each control gesture of the plurality of control gestures is associated with a different control option of the plurality of control options; determine a first control option of the plurality of control options of the first settings list based on the control gesture; and enable the first control option to control the apparatus.
The limitation of detecting a first hand gesture of a plurality of first hand gestures of a user, as drafted, is a process that, under its broadest reasonable interpretation, covers performance of the limitation in the mind but for the recitation of generic computer components. That is, nothing in the claim element precludes the step from practically being performed in the mind. For example, "detecting" in the context of this claim encompasses a user observation or evaluation that the first hand of a user is performing a gesture from known gestures.
The limitation of determining a first settings list of a plurality of settings lists based on the first hand gesture, as drafted, is a process that, under its broadest reasonable interpretation, covers performance of the limitation in the mind but for the recitation of generic computer components. That is, nothing in the claim element precludes the step from practically being performed in the mind. For example, "determining" in the context of this claim encompasses a user evaluation of a first settings list from known setting lists based on the observed hand gesture.
The limitation of detecting a control gesture of a plurality of control gestures by the user, as drafted, is a process that, under its broadest reasonable interpretation, covers performance of the limitation in the mind but for the recitation of generic computer components. That is, nothing in the claim element precludes the step from practically being performed in the mind. For example, "detecting" in the context of this claim encompasses a user observation or evaluation of a user control gesture from known control gestures.
The limitation of determining, using a mapping model based on the first hand gesture and the control gesture as inputs, a first control option of the plurality of control options of the first settings list based on a combination of the control gesture and the first hand gesture, as drafted, is a process that, under its broadest reasonable interpretation, covers performance of the limitation in the mind but for the recitation of generic computer components. That is, nothing in the claim element precludes the step from practically being performed in the mind. For example, "determining" in the context of this claim encompasses a user evaluation of a control option among known control options based on a mental evaluation mapping the observed control gesture and observed first hand gesture.
If a claim limitation, under its broadest reasonable interpretation, covers performance of the limitation in the mind but for the recitation of generic computer components, then it falls within the "Mental Processes" grouping of abstract ideas. Accordingly, the claim recites an abstract idea.
This judicial exception is not integrated into a practical application. In particular, the claim recites an apparatus, memory, processor, sensor data, sensors, control options, and enabling control of the apparatus. The apparatus, memory, processor, sensor data, sensors, control options, and apparatus control are recited at a high level of generality and recited so generically that they represent no more than mere instructions to apply the judicial exception on a computer [MPEP 2106.05(f)]. Additionally, these limitations can also be viewed as nothing more than an attempt to generally link the use of the judicial exception to the technological environment of computer execution [MPEP 2106.05(h)].
The sensor represents mere data gathering that is necessary for use of the recited judicial exception, as the obtained sensor data is used in the abstract mental process of observing and evaluating. The sensor is recited at a high level of generality and is therefore insignificant extra-solution activity [MPEP 2106.05(g)]. Even when viewed in combination, the additional elements in this claim do no more than automate the mental processes that a user performs, using the computer components as a tool.
Accordingly, these additional elements do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea. The claim is directed to an abstract idea.
The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above with respect to integration of the abstract idea into a practical application, the additional elements of an apparatus, memory, processor, sensor data, sensors, and control options amount to no more than mere instructions to apply the exception using generic computer components. Mere instructions to apply an exception using generic computer components cannot provide an inventive concept.
The sensor, as discussed above, represents mere data gathering and is insignificant extra-solution activity. Further, these elements are well-understood, routine, and conventional.
With respect to the "sensors" for obtaining sensor data, the courts have found limitations directed to obtaining information electronically, recited at a high level of generality, to be well-understood, routine, and conventional [MPEP 2106.05(d)(II), "electronic recordkeeping," and "storing and retrieving information in memory"].
Considering the additional elements individually and in combination and the claim as a whole, the additional elements do not provide significantly more than the abstract idea. The claim is not patent eligible.
Claim 19 recites method steps substantially similar to those recited in claim 1 and likewise recites an abstract idea. While the claims recite additional elements of a device, sensors, and processors, the elements are recited at a high level of generality and recited so generically that they represent no more than mere instructions to apply the judicial exception on a computer [MPEP 2106.05(f)] and do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea.
The claims do not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above with respect to integration of the abstract idea into a practical application, the additional elements of a device, sensors, and processors amount to no more than mere instructions to apply the exception using generic computer components. Mere instructions to apply an exception using generic computer components cannot provide an inventive concept. Considering the additional elements individually and in combination and the claims as a whole, the additional elements do not provide significantly more than the abstract idea. The claim is not patent eligible.
The dependent claims also recite limitations of detecting the touch gesture of the user (claims 2, 20); detecting the second hand gesture, of a plurality of second hand gestures, of one of a hand performing the first hand gesture or another hand of the user (claims 4, 21); determining a first number associated with the first hand gesture (claims 8, 25); determining the first settings list of the plurality of settings lists based on the first number (claims 9, 26); and detecting the control gesture and the first hand gesture simultaneously (claims 10, 27). These limitations cover performance in the mind but for the recitation of generic computer components, encompassing user observation or evaluation of the touch gesture of the user; observation or evaluation of a first hand or another hand performing the second hand gesture; observation or evaluation of a first number associated with the first hand gesture; evaluation of a settings list from known settings lists based on the observed number; and observation or evaluation of the control gesture and the first hand gesture being performed simultaneously. These limitations thus fall within the "Mental Processes" grouping of abstract ideas.
This judicial exception is not integrated into a practical application. The dependent claims recite additional limitations including a touch gesture of a plurality of touch gestures, a touch pad (claims 2, 20); a single tap operation, a double tap operation, a press and hold operation, a swipe forward operation, or a swipe backward operation (claim 3); a pause operation, a select operation, a next operation, a voice assist operation, a volume up operation, a volume down operation, an increase brightness operation, a decrease brightness operation, a launch camera operation, a zoom in operation, or a zoom out operation (claims 5, 22); wherein each settings list of the plurality of settings lists is associated with a respective hand gesture (claims 6, 23); a gesture of one or more fingers indicating a number, a fist gesture, a hand waving gesture, a pinching gesture, or a hand movement gesture to form a shape (claims 7, 24); each settings list of the plurality of settings lists is associated with a respective number (claims 9, 26); a red, green, blue (RGB) camera sensor or a monochrome camera sensor (claim 11); wherein each sensor of the one or more sensors is implemented within one of the apparatus or an additional device (claims 12, 28); wherein the apparatus and the additional device communicate with each other via a wireless communication protocol (claim 13); a Bluetooth protocol (claim 14); a mobile phone (claim 15); an extended reality device (claims 16, 29); a head-mounted-display (claim 17); sensors (claim 18); another device (claim 30) that are recited at a high level of generality and recited so generically that they represent no more than mere instructions to apply the judicial exception on a computer [MPEP 2106.05(f)] and generally link the use of the judicial exception to the technological environment of executing computer systems [MPEP 2106.05(h)] and do not impose any meaningful limits on practicing the abstract idea. 
The dependent claims also recite additional limitations of detecting the touch gesture of the user via a touch pad (claims 2, 20) that represent insignificant extra-solution activity including nominal or tangential additions to the claim, amounting to mere data collection [MPEP 2106.05(g)]. Accordingly, the additional elements do not integrate the abstract idea into a practical application because they do not impose meaningful limits on practicing the abstract idea. The claims are directed to an abstract idea.
The dependent claims do not include additional elements that are sufficient to amount to significantly more than the judicial exception. The additional elements of collecting data are recited at a high level of generality and are well-understood, routine, or conventional activities [MPEP 2106.05(d)(II), "storing and retrieving information in memory"], and remain insignificant extra-solution activity even upon reconsideration [MPEP 2106.05(g)]. Mere instructions to apply an exception using generic computer components, linking the use of an exception to a technological field of use, and insignificant extra-solution activity cannot provide an inventive concept. The claims are not patent eligible.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries set forth in Graham v. John Deere Co., 383 U.S. 1, 148 USPQ 459 (1966), that are applied for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 1, 3-13, 15-16, 18-19, and 21-30 are rejected under 35 U.S.C. 103 as being unpatentable over Dalal et al. (US 20140157209 A1).
As to claim 1, Dalal discloses an apparatus for enabling one or more control options [para 0036, system with application control], the apparatus comprising: at least one memory; and at least one processor coupled to the at least one memory and configured [para 0033, 0059, system includes memory and processor] to:
detect, based on sensor data from one or more sensors, a first hand gesture of a plurality of first hand gestures of a user [para 0021, 0030, 0047, determine input gesture from images (read: sensor data) captured by camera (read: sensor) including user hand gesture (read: first hand gesture) among various input gesture patterns (read: first hand gestures)]; …
detect a control gesture of a plurality of control gestures by the user, wherein each control gesture of the plurality of control gestures is associated with a different control option of the plurality of control options [para 0042-0043, detect gesture input (read: control gesture) made by user of detectable gestures (read: plurality of control gestures), where detectable gestures may be mapped to (read: associated with) application action (read: different control option) of application actions];
determine, using a mapping model based on the first hand gesture and the control gesture as inputs, a first control option of the plurality of control options of the first settings list based on a combination of the control gesture and the first hand gesture [para 0037-0038, 0043-0044, 0047, detect changed application model (read: mapping model) determined from captured input gesture and using detected gesture input and map detected gesture input to application action (read: first control option) of application actions with application model]; and
enable the first control option to control the apparatus [para 0036, 0045, device performs application action].
As stated above, Dalal teaches detecting, based on sensor data from one or more sensors, a first hand gesture of a plurality of first hand gestures of a user [para 0021, 0030] but not explicitly determine a first settings list of a plurality of settings lists based on the first hand gesture, wherein the first settings list comprises a plurality of control options.
However, Dalal teaches determining a first settings list of a plurality of settings lists based on detecting a selection changing applications, wherein the first settings list comprises a plurality of control options [para 0030, 0036-0038, update hierarchy model for application (read: first settings list) including application actions (read: control options) part of hierarchy models for multiple applications (read: settings lists) based on detected application change selecting application] and that gestures to perform a selection action include a hand gesture [para 0030, 0047, 0054-0055, gestures include specific hand orientation to select between applications].
Dalal is analogous art to the claimed invention, being from a similar field of endeavor of user interface input systems. Thus, it would have been obvious to one skilled in the art before the effective filing date of the claimed invention to apply Dalal's teaching of determining a first settings list of a plurality of settings lists based on a selection changing applications to Dalal's teaching of a selection gesture performed with a hand gesture, with a reasonable expectation of success, to result in determining a first settings list of a plurality of settings lists based on the first hand gesture, wherein the first settings list comprises a plurality of control options [see MPEP 2143].
One of ordinary skill in the art would be motivated to apply this teaching to Dalal to increase the intuitive nature of applying gestures [Dalal, para 0047].
As to claim 3, Dalal discloses the apparatus of claim 2, wherein each touch gesture of the plurality of touch gestures is one of [para 0051, 0055, swipe gesture, note strikethrough indicates non-selected alternative],
As to claim 4, Dalal discloses the apparatus of claim 1, wherein:
the control gesture is a second hand gesture of a plurality of second hand gestures [para 0026-0028, 0030, 0042, determine input gesture based on detecting second gesture object including hand configuration (read: second hand gesture) of detectable object configurations (read: second hand gestures)]; and
to detect the control gesture, the at least one processor is configured to detect the second hand gesture of one of a hand performing the first hand gesture [para 0028, 0030, determine input gesture based on detecting second gesture object of same physical hand object in different configuration].
As to claim 5, Dalal discloses the apparatus of claim 1, wherein each control option of the plurality of control options is one of a pause operation [para 0043, 0046-0048, application actions include pause function, also note other actions include confirmation, next, volume up and down, and zoom in and out actions],
As to claim 6, Dalal discloses the apparatus of claim 1, wherein each settings list of the plurality of settings lists is associated with a respective hand gesture [Figs. 12-14, para 0035, 0038-0039, hierarchy model for application part of hierarchy models for multiple applications reuse gesture inputs including hand gesture].
As to claim 7, Dalal discloses the apparatus of claim 1, wherein each hand gesture of the plurality of first hand gestures is one of [para 0030, 0047, 0052, 0054-0055, various detected input gesture patterns include waving gesture; also note other detected gestures include holding up a number of fingers, closing a hand, pinch gesture, and mimicking brand icons],
As to claim 8, Dalal discloses the apparatus of claim 1, wherein the at least one processor is configured to determine a first number associated with the first hand gesture [para 0033, 0035, 0037-0038, device includes processing unit managing application priority level (read: first number) as changed with detected hand gesture changing application].
As to claim 9, Dalal discloses the apparatus of claim 8, wherein:
the at least one processor is configured to determine the first settings list of the plurality of settings lists based on the first number [para 0033, 0035, 0037-0038, organize hierarchy model for application (read: first settings list) including application actions (read: control options) part of hierarchy models for multiple applications (read: settings lists) based on application priority level]; and
each settings list of the plurality of settings lists is associated with a respective number [Figs. 12-14, para 0035, 0037-0039, hierarchy models for multiple applications are prioritized based on application priority level].
As to claim 10, Dalal discloses the apparatus of claim 1, wherein the at least one processor is configured to detect the control gesture and the first hand gesture simultaneously [para 0028, detect cooperative gesture input including first gesture performed by first hand and second gesture performed by second hand].
As to claim 11, Dalal discloses the apparatus of claim 1, wherein each sensor of the one or more sensors is one of a red, green, blue (RGB) camera sensor [para 0020-0021, imaging sensor unit includes RGB camera]
As to claim 12, Dalal discloses the apparatus of claim 1, wherein each sensor of the one or more sensors is implemented within one of [para 0019-0020, 0033, imaging sensor unit integrated with controllable device (read: additional device)].
As to claim 13, Dalal discloses the apparatus of claim 12, wherein the apparatus and the additional device communicate with each other via a wireless communication protocol [para 0021, 0033, capture device transfers operation to connected device over wi-fi (read: wireless communication protocol)].
As to claim 15, Dalal discloses the apparatus of claim 12, wherein the additional device is a mobile phone [para 0021, 0033, capture device includes smart phone].
As to claim 16, Dalal discloses the apparatus of claim 1, wherein the apparatus is an extended reality (XR) device [para 0037, 0041, 0049, device displays user interface elements, where device displaying elements including at least virtual button is consistent with examples of extended reality including virtual reality as described in Applicant's specification (0002)].
As to claim 18, Dalal discloses the apparatus of claim 1, further comprising the one or more sensors [para 0020-0021, sensors].
As to claim 19, Dalal, combined at least for the reasons above, discloses a method for controlling a device, the method comprising: using one or more sensors and/or one or more processors [para 0020, 0033, device includes sensor and central processing unit performing operations] to perform limitations substantially similar to those recited in claim 1 and is rejected under similar rationale.
As to claims 21-29, Dalal, combined at least for the reasons above, discloses the method of claim 19 comprising limitations substantially similar to those recited in claims 2, 4-10, 12, and 16, respectively, and is rejected under similar rationale.
As to claim 30, Dalal discloses the method of claim 19, wherein each processor of the one or more processors is implemented within one of the device [para 0033, device includes central processing unit].
Claims 2, 14, 17, and 20 are rejected under 35 U.S.C. 103 as being unpatentable over Dalal as applied to claims 1 and 19 above, and further in view of Canberk (US 20210405761 A1).
As to claim 2, Dalal discloses the apparatus of claim 1, wherein:
the control gesture is a touch gesture of a plurality of touch gestures [para 0030, 0042, gesture input includes user hand gestures among various input gesture patterns detected through touch device]; and
to detect the control gesture, the at least one processor is configured to detect, via a touch [device] of the apparatus, the touch gesture by the user [para 0042, gesture input detected through touch device].
However, Dalal does not specifically disclose wherein "a touch [device]" is "a touch pad".
Canberk discloses a touch pad [para 0029-0031, detect input on touchpad].
Dalal and Canberk are analogous art to the claimed invention being from a similar field of endeavor of user interface input systems. Thus, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify the touch device as disclosed by Dalal with the integrated touch pad as disclosed by Canberk with a reasonable expectation of success.
One of ordinary skill in the art would be motivated to modify Dalal as described above to allow user input in an intuitive manner [Canberk, para 0030].
As to claim 14, Dalal discloses the apparatus of claim 13.
However, Dalal does not specifically disclose wherein the wireless communication protocol is a Bluetooth protocol.
Canberk discloses wherein the wireless communication protocol is a Bluetooth protocol [para 0042, 0065, 0083, wireless network communication via Bluetooth].
Dalal and Canberk are analogous art to the claimed invention being from a similar field of endeavor of user interface input systems. Thus, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify the wireless communication protocol as disclosed by Dalal with the Bluetooth wireless communication protocol as disclosed by Canberk with a reasonable expectation of success.
One of ordinary skill in the art would be motivated to modify Dalal as described above to generate more accurate location coordinates [Canberk, para 0083].
As to claim 17, Dalal discloses the apparatus of claim 16.
However, Dalal does not specifically disclose wherein the XR device is a head-mounted display (HMD) apparatus.
Canberk discloses wherein the XR device is a head-mounted display (HMD) apparatus [Figs. 1A, 2A, 4, para 0029, 0045, 0058, augmented reality system includes eyewear device, note augmented reality display is consistent with examples of extended reality including augmented reality as described in Applicant's specification (0002)].
Dalal and Canberk are analogous art to the claimed invention being from a similar field of endeavor of user interface input systems. Thus, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify the extended reality device as disclosed by Dalal with the head-mounted display apparatus as disclosed by Canberk with a reasonable expectation of success.
One of ordinary skill in the art would be motivated to modify Dalal as described above to enhance and simplify the user experience [Canberk, para 0030].
As to claim 20, Dalal and Canberk, combined at least for the reasons above, discloses the method of claim 19 comprising limitations substantially similar to those recited in claim 2 and is rejected under similar rationale.
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
Christie (US 20110239155 A1) generally discloses classifying detected features and associating feature groups to respective user interface elements.
Erivantcev et al. (US 20220253146 A1) and Kies et al. (US 20140282272 A1) generally disclose determining application context and invoking commands based on determined application context.
Kramer et al. (US 20140195988 A1) generally discloses determining hand gestures with fingers indicating a number.
Nie et al. (US 20240094819 A1) generally discloses distinguishing control options based on a detected control gesture.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to LINDA HUYNH whose telephone number is (571)272-5240 and email is linda.huynh@uspto.gov. The examiner can normally be reached M-F between 9am-5pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Adam Queler can be reached at (571) 272-4140. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/LINDA HUYNH/Primary Examiner, Art Unit 2172