DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Priority
Acknowledgment is made of Applicant’s claim of priority to National Stage Application No. PCT/IB2021/059184, filed October 6, 2021, and to Provisional Application No. 63/091961, filed October 15, 2020.
Status of Claims
The present Office action is responsive to the Application filed on April 4, 2023. Claims 1-17 are presently pending in this application.
Claim Objections
Claims 1-17 are objected to because of the following informalities:
Claim 1 recites, “at least a first portion of the facepiece or body,” which Examiner suggests amending to read --at least a portion of the facepiece or the body-- for consistency and clarity.
Claims 2-15 are objected to by virtue of dependency on claim 1.
Claim 6 recites, “the facepiece or body” in ln 2, which Examiner suggests amending to read --the facepiece or the body--.
Claim 10 recites, “wherein the touch pattern includes” in ln 1, which Examiner suggests amending to read --wherein the touch pattern further comprises--.
Claim 16 recites, “at least a first portion of the facepiece or body” in ln 4, which Examiner suggests amending to read --at least a portion of the facepiece or the body--.
Claim 16 recites, “at least the portion of the facepiece or body” in ln 6, which Examiner suggests amending to read --at least the portion of the facepiece or the body--.
Claim 17 is objected to by virtue of dependency on claim 16.
Appropriate correction is required.
Claim Rejections - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claims 1-17 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.
Claim 1 recites, “a body (or face blank or frame)” in ln 2, which renders the claim indefinite because it is unclear whether the limitations within the parentheses are part of the claimed invention. See MPEP § 2173.05(d).
For examination purposes, it is interpreted as --a body--.
Claims 2-15 are rejected by virtue of dependency on claim 1.
Claim 8 recites, “wherein a first sensing unit,” which renders the claim indefinite because it is unclear whether the “first sensing unit” is further limiting the haptic sensing unit of claim 1 or adding an additional sensing unit to the respiratory device.
For examination purposes, it is interpreted as --wherein the haptic sensing unit comprises--.
Further, claim 8 recites, “the body (or face blank or frame)” in ln 3, which renders the claim indefinite because it is unclear whether the limitations within the parentheses are part of the claimed invention. See MPEP § 2173.05(d).
For examination purposes, it is interpreted as --the body--.
Claims 12 and 13 recite, “wherein a first haptic sensor,” which, similar to claim 8, renders the claims indefinite because it is unclear whether the “first haptic sensor” is further limiting the haptic sensing unit of claim 1 or adding an additional sensor to the respiratory device.
For examination purposes, it is interpreted as --wherein the haptic sensing unit--.
Claim 16 recites, “triggering the use of at least one of the electrical function components” in ln 7, which renders the claim indefinite because it is unclear whether the claim is newly introducing electrical function components in addition to the “electrical function component” in ln 1 or referring to the same “electrical function component” in ln 1.
For examination purposes, it is interpreted as --triggering the use of the electrical function component--.
Claim 17 is rejected by virtue of dependency on claim 16.
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.
Claims 1-7 and 15-17 are rejected under 35 U.S.C. 102(a)(2) as being anticipated by Wallace (US 20200306567 A1; cited on IDS filed on 05/31/2023).
Regarding claim 1, Wallace discloses, a respiratory device (10, Fig 1), comprising:
a body (30, Fig 1), at least a portion of the body defining a volume (a space within the haptic device as shown in Fig 1);
a facepiece (12, Fig 1) coupled to the body (¶0026, “The haptic device 18 may be permanently retained or removably coupled to the facemask 12“); and
a haptic sensing unit (18, Fig 1) to sense a touch pattern occurring on at least a portion of the body (¶0026-34, “may manually trigger the transmission of a request for assistance, or conversely, a signal indicating a positive status, by touching a predetermined portion of the haptic device 18 or by speaking a command into the haptic device 18. In such configurations, the haptic device 18 may include a microphone (not shown) or a touch sensor (not shown) in communication with the processor 22”).
Regarding claim 2, Wallace discloses, the respiratory device of claim 1 as discussed above.
Wallace further discloses, wherein the haptic sensing unit comprises at least one haptic sensor (¶0026-34).
Regarding claim 3, Wallace discloses, the respiratory device of claim 2 as discussed above.
Wallace further discloses, wherein the at least one haptic sensor comprises at least one of an accelerometer, and a touch sensor (¶0026-34).
Regarding claim 4, Wallace discloses, the respiratory device of claim 1 as discussed above.
Wallace further discloses, wherein the haptic sensor unit is programmed to execute a user-definable command (¶0026-34).
Regarding claim 5, Wallace discloses, the respiratory device of claim 4 as discussed above.
Wallace further discloses, wherein the user definable command includes one or more of the following commands: Feature on/off functionality (¶0007, “emergency request for assistance”; ¶0026-34, “emergency request”, “by touching a predetermined portion of the haptic device 18 or by speaking a command into the haptic device 18”).
Regarding claim 6, Wallace discloses, the respiratory device of claim 1 as discussed above.
Wallace further discloses, the haptic sensor unit comprises a plurality of haptic sensors (24a and 24b as shown in Fig 1), wherein each sensor is disposed in a different portion of the body (24a and 24b are located in different locations as shown in Fig 1).
Regarding claim 7, Wallace discloses, the respiratory device of claim 1 as discussed above.
Wallace further discloses, wherein the haptic sensor unit is disposed in a Mask Communication Unit (MCU) (22, Fig 1; ¶0026) at least partially located within the volume.
Regarding claim 15, Wallace discloses, the respiratory device of claim 7 as discussed above.
Wallace further discloses, at least one electrical function component (20, Fig 1; ¶0026, “the substrate 20 includes a printed circuit board in communication with the processor or controller 22 having processing circuitry configured to process the various signals sent to and/or received from the haptic device 18”) in communication with the MCU; and a rechargeable power source (26, Fig 1; ¶0029, “the power source 26 may be rechargeable”) at least partially located within the volume, the rechargeable power source providing power to each of the at least one electrical function components (¶0029, “The power source 26, such as a battery, may be in electrical communication with the processor 22 to provide power to the various components of the haptic device 18 and may be disposed on one side of the processor 22”).
Regarding claim 16, Wallace discloses, a method (abstract, “A method and system for communication with a user”) of commanding an electrical function component of a respiratory device (10, Fig 1) having a body (18, Fig 1) and a facepiece (12, Fig 1) coupled to the body (¶0026, “The haptic device 18 may be permanently retained or removably coupled to the facemask 12”), comprising:
providing a haptic sensing unit (24a and 24b of 18, Fig 1) having a haptic sensor configured to sense a touch pattern occurring on at least a first portion of the facepiece or body (¶0026-34, “may manually trigger the transmission of a request for assistance, or conversely, a signal indicating a positive status, by touching a predetermined portion of the haptic device 18 or by speaking a command into the haptic device 18. In such configurations, the haptic device 18 may include a microphone (not shown) or a touch sensor (not shown) in communication with the processor 22”);
executing a user definable command when the haptic sensor senses a touch pattern occurring on at least a portion of the body (¶0026-34); and
triggering the use of the electrical function components based on the touch pattern sensed (¶0026-34).
Regarding claim 17, Wallace discloses, the method of claim 16 as discussed above.
Wallace further discloses, wherein the user definable command includes one or more of the following commands: Feature on/off functionality (¶0007, “emergency request for assistance”; ¶0026-34, “emergency request”, “by touching a predetermined portion of the haptic device 18 or by speaking a command into the haptic device 18”).
Claims 1-3, 12, 14, and 16 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Wu (machine translation of WO 2018223942 A1).
Regarding claim 1, Wu discloses a respiratory device (an assembly of Fig 1-3), comprising:
a body (a welding mask 100, Fig 2), at least a portion of the body defining a volume (a space within the mask for wearing on an operator’s head as shown in Figs 1-2; PG 4, ln 8-9);
a facepiece (auto dimmer filter 101, Figs 1-2, and 4) coupled to the body; and
a haptic sensing unit (interface area 101b, Fig 4) to sense a touch pattern occurring on at least a portion of the facepiece (101b, Fig 4; PG 5, ln 8-16, “The operation interface display area 101b itself may be a touch type liquid crystal display panel so that the operator can conveniently operate the parameters of the automatic dimming filter 101 with the fingers”).
Regarding claim 2, Wu discloses, the device of claim 1 as discussed above.
Wu further discloses, wherein the haptic sensing unit comprises at least one haptic sensor (101b, Fig 4; PG 5, ln 8-16, “The operation interface display area 101b itself may be a touch type liquid crystal display panel so that the operator can conveniently operate the parameters of the automatic dimming filter 101 with the fingers”).
Regarding claim 3, Wu discloses, the device of claim 2 as discussed above.
Wu further discloses, wherein the at least one haptic sensor comprises a touch sensor (101b, Fig 4; PG 5, ln 8-16, “The operation interface display area 101b itself may be a touch type liquid crystal display panel so that the operator can conveniently operate the parameters of the automatic dimming filter 101 with the fingers”).
Regarding claim 12, Wu discloses, the device of claim 1 as discussed above.
Wu further discloses, wherein a first haptic sensor is mounted on a centerline of the facepiece (101, Figs 1-2, and 4; Examiner notes that 101b is a touch type liquid crystal display panel which is located at the centerline of the facepiece).
Regarding claim 14, Wu discloses, the device of claim 1 as discussed above.
Wu further discloses, further comprising an In-Mask Display (101b and 101c, Fig 4; PG 5, ln 8-16, “auto-dimmer filter 101 includes a liquid crystal display area 101a, an operation interface display area 101b, and a respirator parameter display area 101c”).
Regarding claim 16, Wu discloses a method of commanding an electrical function component (abstract; PG 2, ln 28-37) of a respiratory device (an assembly of Figs 1-2) having a body (a welding mask 100, Fig 2), and a facepiece (auto dimmer filter 101, Figs 1-2, and 4) coupled to the body, comprising:
providing a haptic sensing unit (interface area 101b, Fig 4) having a haptic sensor configured to sense a touch pattern occurring on at least a portion of the facepiece;
executing a user definable command when the haptic sensor senses a touch pattern occurring on at least a first portion of the facepiece or body (101b, Fig 4; PG 5, ln 8-16, “The operation interface display area 101b itself may be a touch type liquid crystal display panel so that the operator can conveniently operate the parameters of the automatic dimming filter 101 with the fingers”); and
triggering the use of the electrical function components based on the touch pattern sensed (PG 2, ln 1-37; PG 5, ln 8-16, which implies that the dimming filter can be controlled via the touch interface 101b).
Claims 1, 4, 6-8, and 16 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Hu et al. (machine translation of CN 111558181 A1).
Regarding claim 1, Hu et al. discloses, a respiratory device (Fig 1), comprising:
a body (7, Fig 1), at least a portion of the body defining a volume (PG 8, ln 19-28, implies that there is a space within the shell 7 where the modules are placed);
a facepiece (mask 1 which is connected to the shell 7 via cable 100 as shown in Fig 1) coupled to the body; and
a haptic sensing unit to sense a touch pattern (Examiner interprets the haptic sensing unit as any structure that can interact with any touch pattern, such as a press, drag, tap, gesture input, etc., under the broadest reasonable interpretation, as no further structure is defined for the unit) occurring on at least a portion of the facepiece (101 of the mask 1, Fig 1; PG 8, ln 36 - PG 9, ln 17, which implies that the display module 101 can be touch operated or button controlled to control the display settings) or the body (8, Fig 1; PG 8, ln 29-35, “a display switching button… on-off signal of the display switching button… controlling the augmented reality display module”).
Regarding claim 4, Hu et al. discloses the device of claim 1 as discussed above.
Hu et al. further discloses, wherein the haptic sensor unit is programmed to execute a user-definable command (PG 8, ln 29- PG 9, ln 17).
Regarding claim 6, Hu et al. discloses the device of claim 1 as discussed above.
Hu et al. further discloses, wherein the haptic sensor unit comprises a plurality of haptic sensors, wherein each sensor is disposed in a different portion of the facepiece (101 of the mask 1, Fig 1; PG 8, ln 36- PG 9, ln 17) or the body (8, Fig 1; PG 8, ln 29-35).
Regarding claim 7, Hu et al. discloses the device of claim 1 as discussed above.
Hu et al. further discloses, wherein the haptic sensor unit is disposed in a Mask Communication Unit (MCU) (PG 8, ln 19-28, “the power supply module 4, the main control module 5 and the wireless communication module 6 is set in the main machine shell 7”) at least partially located within the volume.
Regarding claim 8, Hu et al. discloses the device of claim 1 as discussed above.
Hu et al. further discloses, wherein the haptic sensing unit comprises a first haptic sensor that senses a touch pattern performed on the facepiece (101 of the mask 1, Fig 1; PG 8, ln 36- PG 9, ln 17) and a second haptic sensor that senses a touch pattern performed on the body (8, Fig 1; PG 8, ln 29-35).
Regarding claim 16, Hu et al. discloses, a method of commanding an electrical function component (abstract) of a respiratory device (Fig 1) having a body (7, Fig 1), and a facepiece (mask 1 which is connected to the shell 7 via cable 100 as shown in Fig 1) coupled to the body, comprising:
providing a haptic sensing unit (Examiner interprets the haptic sensing unit as any structure that can interact with any touch pattern, such as a press, drag, tap, gesture input, etc., under the broadest reasonable interpretation, as no further structure is defined for the unit) having a haptic sensor configured to sense a touch pattern occurring on at least a first portion of the facepiece (101 of the mask 1, Fig 1; PG 8, ln 36 - PG 9, ln 17, which implies that the display module 101 can be touch operated or button controlled to control the display settings) or the body (8, Fig 1; PG 8, ln 29-35, “a display switching button… on-off signal of the display switching button… controlling the augmented reality display module”);
executing a user definable command when the haptic sensor senses a touch pattern occurring on at least a portion of the facepiece or the body (PG 8, ln 29- PG 9, ln 17); and triggering the use of the electrical function components based on the touch pattern sensed (PG 8, ln 29- PG 9, ln 17).
Claims 1, 4-5, 14, and 16-17 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Yu et al. (US 20150130945 A1).
Regarding claim 1, Yu et al. discloses, a respiratory device (100, Fig 1), comprising:
a body (a helmet shell 10, Fig 1), at least a portion of the body defining a volume (an internal cavity 11, Fig 1);
a facepiece (a visor 20, Fig 1) coupled to the body; and
a haptic sensing unit (touch panel 110, Fig 1) to sense a touch pattern occurring on at least a portion of the body (¶0017, “The touch panel 110 is electronically coupled to the controller 40, and configured to input commands to the controller 40. For example, the touch panel 110 can input a command for activating/deactivating the projector 30”).
Examiner notes that the limitation, “a respiratory device” is broad since it does not require any of the components related to the features of the respiratory device such as a fan, a pump, a filter, an airway, etc.
Regarding claim 4, Yu et al. discloses the device of claim 1 as discussed above.
Yu et al. further discloses, wherein the haptic sensor unit is programmed to execute a user-definable command (¶0017).
Regarding claim 5, Yu et al. discloses the device of claim 4 as discussed above.
Yu et al. further discloses, wherein the user definable command includes one or more of the following commands: Feature on/off functionality, Display (on/off) (¶0017).
Regarding claim 14, Yu et al. discloses the device of claim 1 as discussed above.
Yu et al. further discloses, an In-Mask Display (a projector 30, Fig 1).
Regarding claim 16, Yu et al. discloses, a method of commanding an electrical function component (abstract; ¶0006) of a respiratory device (100, Fig 1) having a body (10, Fig 1), and a facepiece (20, Fig 1) coupled to the body, comprising:
providing a haptic sensing unit (110, Fig 1) having a haptic sensor configured to sense a touch pattern occurring on at least a portion of the body (¶0017);
executing a user definable command when the haptic sensor senses a touch pattern occurring on at least a portion of the body (¶0017); and triggering the use of the electrical function components based on the touch pattern sensed (¶0017).
Regarding claim 17, Yu et al. discloses the method of claim 16 as discussed above.
Yu et al. further discloses, wherein the user definable command includes one or more of the following commands: Feature on/off functionality, Display (on/off) (¶0017).
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 9-10, and 13 are rejected under 35 U.S.C. 103 as being unpatentable over Wu (machine translation of WO 2018223942 A1) as applied to claim 1 above.
Regarding claim 9, Wu discloses, the device of claim 1 as discussed above.
While Wu discloses having the operation interface display area, which is a touch type liquid crystal display panel so that the operator can conveniently operate the parameters of the automatic dimming filter 101 with the fingers (PG 5, ln 8-16), and having different portions of contact as shown in Fig 4, Wu is silent as to the touch pattern including a first double touch.
However, one of ordinary skill in the art would have recognized that the interface of Wu is capable of a first double touch in order to adjust the parameters of the automatic dimming filter, such as by tapping the up-arrow area or the down-arrow area shown in Fig 4 to increase or decrease the parameter values.
Therefore, it would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to modify the device of Wu such that the touch pattern includes a first double touch in order to adjust the parameters of the automatic dimming filter.
Regarding claim 10, Wu discloses, the device of claim 9 as discussed above.
While Wu discloses having the operation interface display area, which is a touch type liquid crystal display panel so that the operator can conveniently operate the parameters of the automatic dimming filter 101 with the fingers (PG 5, ln 8-16), and having different portions of contact as shown in Fig 4, Wu is silent as to the touch pattern including a second double touch.
However, one of ordinary skill in the art would have recognized that the interface of Wu is capable of a second double touch in order to adjust the parameters of the automatic dimming filter, such as by tapping the up-arrow area or the down-arrow area shown in Fig 4 to increase or decrease the parameter values.
Therefore, it would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to modify the device of Wu such that the touch pattern includes a second double touch in order to adjust the parameters of the automatic dimming filter.
Regarding claim 13, Wu discloses, the device of claim 1 as discussed above.
While Wu discloses having the operation interface display area, which is a touch type liquid crystal display panel so that the operator can conveniently operate the parameters of the automatic dimming filter 101 with the fingers (PG 5, ln 8-16), and having different portions of contact as shown in Fig 4, Wu is silent as to the haptic sensing unit being configured to distinguish between a first touch pattern executed on a first portion of the facepiece and a second touch pattern executed on a second portion of the facepiece.
However, one of ordinary skill in the art would have recognized that the interface of Wu is capable of distinguishing between touch patterns in order to adjust the parameters of the automatic dimming filter, such as by tapping/touching the up-arrow area or the down-arrow area shown in Fig 4 to increase or decrease the parameter values.
Therefore, it would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to modify the device of Wu such that the haptic sensing unit is configured to distinguish between a first touch pattern executed on a first portion of the facepiece and a second touch pattern executed on a second portion of the facepiece in order to adjust the parameters of the automatic dimming filter.
Claim 11 is rejected under 35 U.S.C. 103 as being unpatentable over Yu et al. (US 20150130945 A1) as applied to claim 1 above, and further in view of Brice et al. (US 20180213873 A1).
Regarding claim 11, Yu et al. discloses the device of claim 1 as discussed above.
While Yu et al. discloses wherein a user display toggles between a first display setting and a second display setting (¶0017, implying that the display can be turned on/off, which can be a first display setting and a second display setting, respectively) by input commands through the touch panel, Yu et al. does not specifically disclose that the input commands are a double touch executed within a defined timer interval.
However, Brice et al., which is analogous art to the device of Yu et al., teaches a method of controlling a system, such as turning it on and off, by using a predetermined pattern of motion detected by a sensor, such as tapping on a helmet (¶0029), and teaches that the input commands are a double touch executed within a defined timer interval (¶0048, “if the sensor detects a certain pattern of tapping on a helmet, that can be used to turn on or off the system… user may tap the side of the helmets twice within a short period of time”) in order to control the system without removing the gloves or as a safety precaution (¶0048).
Therefore, it would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to modify the device of Yu et al. to include the input method including a double touch within a defined timer interval as taught by Brice et al. in order to activate/deactivate the system without removing the gloves or as a safety precaution (¶0048).
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
Dykes (US Patent 10,420,965) is cited for a method of two sharp taps to the SCBA face shield within a predefined time period, e.g., 1 or 1.5 seconds, to transition to active mode (Col 16, ln 28-65).
Guelzow et al. (US 20080023002 A1) is cited for a heads-up display component (Fig 7) and a control unit portion including one or more buttons configured to activate or select between a number of display options (¶0029).
Teetzel (US 20150217145 A1) is cited for disclosing that the button may comprise an on/off button for toggling the HUD display between the powered on and powered off states (¶0022).
Calilung et al. (US 20180303190 A1) is cited for capacitive and/or resistive sensors designed to detect contact with a user's finger, with which the user can use different gestures to modify parameters of the helmet, wherein the gestures include a frontward swipe, a rearward swipe, an upward swipe, a downward swipe, one or more taps such as a double or triple tap, pressing the screen for a specific duration of time, and a multiple position tap (¶0123).
Any inquiry concerning this communication or earlier communications from the examiner should be directed to JAEICK JANG whose telephone number is (703)756-4569. The examiner can normally be reached M-F 8:30 - 4:30.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Kendra D Carter can be reached at (571) 272-9034. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/J.J./Examiner, Art Unit 3785
/JOSEPH D. BOECKER/Primary Examiner, Art Unit 3785