DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Information Disclosure Statement
The information disclosure statement (IDS) submitted on 09/06/2023 has been received. Accordingly, the information disclosure statement is being considered by the examiner.
Priority
Acknowledgment is made of applicant’s claim for foreign priority under 35 U.S.C. 119(a)-(d).
Claim Interpretation
The following is a quotation of 35 U.S.C. 112(f):
(f) Element in Claim for a Combination. – An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.
The following is a quotation of pre-AIA 35 U.S.C. 112, sixth paragraph:
An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.
Claims 1-7 are interpreted to invoke 35 U.S.C. 112(f).
The claims in this application are given their broadest reasonable interpretation using the plain meaning of the claim language in light of the specification as it would be understood by one of ordinary skill in the art. The broadest reasonable interpretation of a claim element (also commonly referred to as a claim limitation) is limited by the description in the specification when 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is invoked.
As explained in MPEP § 2181, subsection I, claim limitations that meet the following three-prong test will be interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph:
(A) the claim limitation uses the term “means” or “step” or a term used as a substitute for “means” that is a generic placeholder (also called a nonce term or a non-structural term having no specific structural meaning) for performing the claimed function;
(B) the term “means” or “step” or the generic placeholder is modified by functional language, typically, but not always linked by the transition word “for” (e.g., “means for”) or another linking word or phrase, such as “configured to” or “so that”; and
(C) the term “means” or “step” or the generic placeholder is not modified by sufficient structure, material, or acts for performing the claimed function.
Use of the word “means” (or “step”) in a claim with functional language creates a rebuttable presumption that the claim limitation is to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites sufficient structure, material, or acts to entirely perform the recited function.
Absence of the word “means” (or “step”) in a claim creates a rebuttable presumption that the claim limitation is not to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is not interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites function without reciting sufficient structure, material or acts to entirely perform the recited function.
Claim limitations in this application that use the word “means” (or “step”) are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action. Conversely, claim limitations in this application that do not use the word “means” (or “step”) are not being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action.
This application includes one or more claim limitations that do not use the word “means,” but are nonetheless being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, because the claim limitation(s) uses a generic placeholder that is coupled with functional language without reciting sufficient structure to perform the recited function and the generic placeholder is not preceded by a structural modifier. Such claim limitation(s) is/are: “photographing part,” “image analysis part,” and “virtual gauge visualization part” in claims 1-7.
Because this/these claim limitation(s) is/are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, it/they is/are being interpreted to cover the corresponding structure described in the specification as performing the claimed function, and equivalents thereof. The original specification at paragraphs [0046]-[0047], [0051], [0053], and [0068] and Fig. 3 discloses: “a virtual-gauge-based plant monitoring system 100 according to an embodiment of the present disclosure includes a photographing part 110, which may be a CCTV pre-installed in the plant 10; an image analysis part 120, which may be provided as the supervisory server 20 or realized as a program installed on the supervisory server 20; a virtual gauge visualization part 130, which may be provided as the monitoring server 30 or realized as a program installed on the monitoring server 30; and a monitoring part 140. Herein, the virtual-gauge-based plant monitoring system 100 according to the embodiment of the present disclosure may be used for monitoring a plant 10 in which plant facilities for manufacturing, assembly, storage, and management are built.” This disclosure provides the hardware structure, and equivalents thereof, corresponding to each part for executing the claimed functions.
If applicant does not intend to have this/these limitation(s) interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, applicant may: (1) amend the claim limitation(s) to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph (e.g., by reciting sufficient structure to perform the claimed function); or (2) present a sufficient showing that the claim limitation(s) recite(s) sufficient structure to perform the claimed function so as to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-2, 4-9 and 11-14 are rejected under 35 U.S.C. 103 as being unpatentable over Kazunari et al. (JP H03162624 A, English translation) in view of Masaru et al. (JP 2010003267 A, English translation).
Regarding Claim 1,
Kazunari discloses a virtual-gauge-based plant monitoring system, comprising: a photographing part provided at a plurality of locations inside a plant and configured to photograph at least one plant facility; (Kazunari, conventional technology, discloses a plant data display device that selects and inputs a monitoring item from a selection input board and displays the data of the input monitoring item. The selection processing means checks whether the data exists in the record file and, if so, determines whether the data of the current and previous monitoring items are different; if they differ, the selected address of the registration file corresponding to the selected monitoring item is read by the selected address reading means. The plant data of the read selected address is taken in, measurement data is created by the measurement data conversion means, and the created measurement data is displayed on the display device through the display output means; the plant facility is monitored and the data is displayed on the display screen using captured images.)
an image analysis part configured to receive image data obtained by the photographing part, identify a measurement device of the plant facility from the received image data to extract a measurement value, and generate a measurement data packet transmittable to a non-secure area, the measurement data packet including the extracted measurement value; (Kazunari, embodiment of the invention, Fig. 1, discloses a plant data display device that selects and displays plant data corresponding to a specified identification code on a display according to an identification code registered in a registration file. In Fig. 1, 11 is a selection input board as means for inputting selection of monitoring items, 12 is a code input section for selection input information, 13 is selection processing means for selecting monitoring items, 14 is a measurement data switching section, 15 is selected address reading means for searching and reading the selected address of the monitoring item, 16 is measurement data converting means for converting the collected measurement data from analog values to digital values, 17 is plant data input means, 18 is display output means for outputting data to the display, and 19 is a display device. During program execution, selection input information for monitoring items is input from the selection input board 11 (step ST21); the selection input information is code-converted by the code input section 12 (step ST22) and input to the selection processing means 13. The selection processing means 13 determines whether the selection input information is in the registered file (step ST23). If the selected monitoring item code is on the registration file, it is determined whether the monitoring item code input this time is different from the monitoring item code input last time (step ST24).
As a result of this determination, if it is determined that there is a difference, that is, that there has been registration, the plant data to be measured is determined from the registration file of the selected monitoring item and the previously registered monitoring item, and the address of the monitoring item to be collected is searched for (step ST25). The measured data switching unit 14 reads this address and collects the necessary plant data (step ST26); the measured data is transferred as file data to other areas of the facility that might not be secure) and
display the measurement value in real time on the visualized virtual gauge. (Kazunari, embodiments of the invention, Figs. 1-2, discloses the components 11-19 identified above, where Fig. 2 is a flowchart showing the procedure for processing display data in the plant data display device of Fig. 1. The plant data of the monitoring items collected in this way is converted into a predetermined display format by the measurement data conversion means 16 and written into a predetermined area on the assigned display 19 via the display output means 18 (step ST27); measurement data obtained from the monitoring of analog data is converted to digital data as virtual gauge data and displayed on the screen.)
Kazunari does not explicitly disclose a virtual gauge visualization part configured to receive the measurement data packet in the non-secure area, visualize a virtual gauge corresponding to the measurement device of the plant facility on the basis of the measurement data packet.
Masaru discloses a virtual gauge visualization part configured to receive the measurement data packet in the non-secure area, visualize a virtual gauge corresponding to the measurement device of the plant facility on the basis of the measurement data packet (Masaru, first embodiment, discloses that the system shown in the figure includes a plant 10, a plurality of sensors 1, 2, ..., n, and a plant monitoring device 100; the plant 10 is, for example, a power plant, and includes various devices including a feed water heater. The sensors 1, 2, ..., n are installed in various parts of the equipment constituting the plant 10 and continuously measure various process data, such as the temperature, pressure, and flow rate of steam and water; they correspond to temperature sensors, pressure sensors, flow rate sensors, or the like. The values measured by the sensors 1, 2, ..., n are sent to the plant monitoring apparatus 100 in real time; the plant monitoring device 100 displays information indicating the operating state of the plant 10 on a display device based on the measured values of the various process data detected by the sensors 1, 2, ..., n. The plant monitoring apparatus 100 has a computer function capable of executing a predetermined program and includes a display processing device 11, an input device 12, a plant database 13, a calculated value generation device 14, and a display device 15. All of these elements 11 to 15 may be housed in a single housing, or some elements may be externally attached. Further, the function of the display processing device 11 and the function of the calculated value generation device 14 may be realized in the form of a group of programs; the display processing device 11 sequentially receives a plurality of measured values sent from the sensors 1, 2, ..., n, adjusts the display units, and stores the values in the plant database 13.
In addition, the display device 15 has a function of displaying these values together with past measurement values in the form of a graph or the like; the display device (virtual gauge) is updated according to the measurement values obtained from the sensors (measurement devices) and displayed according to different sensor types, including sensors 1 through n (measurement device types)).
Both Kazunari and Masaru are directed to updating measurement values obtained from different types of measurement devices, such as sensors, at the plant facility and displaying the updated values in digital form. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Kazunari, which displays measurement values obtained from plant locations, with the teachings of Masaru, which updates and displays measurement values obtained from different types of sensors, in order to visually improve attention to changes in a specific type of data obtained from a specific type of measurement device, in applications including manufacturing plants, by graphically representing the specific type of sensor whose measurement values are displayed at the moment.
Regarding Claim 2,
The combination of Kazunari and Masaru discloses a monitoring part configured to monitor the measurement value displayed on the virtual gauge and output an alarm when the measurement value is out of a preset normal range. (Masaru, first embodiment, discloses that the display processing device 11 displays, on the display device 15, a measured value of a certain sensor (hereinafter referred to as the first sensor) designated by the operator via the input device 12, together with a calculated value corresponding to the measured value of the first sensor, calculated from each measured value of the sensor group other than the first sensor (hereinafter referred to as the second sensor group) using a model formula. If the deviation between the calculated value obtained from the model formula and the actual measured value exceeds a certain value, it is estimated that some problem (such as a pipe defect in a device or a sensor abnormality) has occurred in the plant 10. In such a case, the display processing device 11 issues an alarm by display or sound; a change in the displayed value is notified using the alarm system.) Additionally, the rationale and motivation to combine the references Kazunari and Masaru as applied in the rejection of claim 1 apply to this claim.
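As an illustrative aside, the monitoring behavior mapped above (outputting an alarm when a measurement value leaves a preset normal range) can be sketched in a few lines of Python; the function name and range parameters are hypothetical and are not drawn from either reference:

```python
def check_measurement(value, low, high):
    """Return an alarm string when value falls outside the preset
    normal range [low, high]; return None when the value is normal."""
    if not (low <= value <= high):
        return f"ALARM: value {value} outside normal range [{low}, {high}]"
    return None
```

A monitoring loop would call this on each new measurement and route any non-None result to a display or audible alarm, mirroring Masaru's display-or-sound notification.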
Regarding Claim 4,
Kazunari further discloses wherein the image analysis part is configured to: in generating the measurement data packet. (Kazunari, means to solve the problem and effect, discloses a plant data display device that selects and displays plant data corresponding to a specified identification code on a display according to an identification code registered in a registration file. The device selects and inputs a monitoring item from a selection input board and displays the data of the input monitoring item. The selection processing means checks whether the data exists in the record file and, if so, determines whether the data of the current and previous monitoring items are different; if they differ, the selected address of the registration file corresponding to the selected monitoring item is read by the selected address reading means. The plant data of the read selected address is taken in, measurement data is created by the measurement data conversion means, and the created measurement data is displayed on the display device through the display output means. In Fig. 1, 11 is a selection input board as means for inputting selection of monitoring items, 12 is a code input section for selection input information, 13 is selection processing means for selecting monitoring items, 14 is a measurement data switching section, 15 is selected address reading means for searching and reading the selected address of the monitoring item, 16 is measurement data converting means for converting the collected measurement data from analog values to digital values, 17 is plant data input means, 18 is display output means for outputting data to the display, and 19 is a display device.
The registration file (measurement data packet) is transferred from the device; the type of device is determined based on the value, the identification of the measurement device is determined by the number, and the selected address is determined based on the monitoring item (measurement values).)
Kazunari does not explicitly disclose identify a type of the measurement device from an image of a detected object to extract the measurement value, and insert identification information on the type of the measurement device and information on the measurement value
Masaru discloses identify a type of the measurement device from an image of a detected object to extract the measurement value, and insert identification information on the type of the measurement device and information on the measurement value (Masaru, description, discloses a plant monitoring apparatus that displays, on a display device, information indicating an operating state of the plant based on measured values of a plurality of process data detected by a plurality of sensors installed in various places of the equipment constituting the plant, and that includes display processing means for displaying a plurality of measured values obtained from the plurality of sensors on the display device, the display processing means having means to display together, on the display device, the calculated value calculated from each measured value of a first sensor group using a first mathematical model and the calculated value calculated from each measured value of a second sensor group, other than the first sensor group, using a second mathematical model; measurement values from the first, second, or any other sensor are identified and determined to be from a specific sensor (measurement device identification) and displayed on the gauge accordingly.) Additionally, the rationale and motivation to combine the references Kazunari and Masaru as applied in the rejection of claim 1 apply to this claim.
Regarding Claim 5,
The combination of Kazunari and Masaru further discloses identify a type of the measurement device on the basis of identification information included in the measurement data packet, and load and visualize a 3D model corresponding to the identified measurement device as the virtual gauge. (Kazunari, effect of the invention, discloses that, according to the present invention, a monitoring item is selected and input from the selection input board, and the selection processing means checks whether the input monitoring item exists in the data registration file. It is determined whether the data of the previous monitoring item is different and, if it is different, the selected address of the registration file corresponding to the selected and input monitoring item is read by the selected address reading means. The system imports the plant data of the read selected address, creates measurement data using the measurement data conversion means, and displays the created measurement data on the display through the display output means. In addition to being able to arbitrarily and easily display measurement data (plant data) corresponding to a number of monitoring items on the display unit, internal processing can search the data files in which monitoring items are registered, so monitoring items can be changed or added just by changing the file, solving the economic and installation-space inconveniences that conventionally required adding display units to add items; the display is changed according to the monitoring item (the monitored measurement value transferred as a registration file).) (Masaru, description, discloses that the display device 15 displays a plurality of measured values obtained by the display processing device 11 from the sensors 1, 2, ..., n, a plurality of calculated values generated by the calculated value generating device 14, their time-series data, etc., in the form of a graph or the like; the displayed value is shown according to the sensor type (measurement device type) and rendered in any of the selected dimensional values.)
Regarding Claim 6,
The combination of Kazunari and Masaru discloses wherein the virtual gauge visualization part is configured to: change, each time the measurement data packet is received and the measurement value is updated, a pointer on the virtual gauge and an object corresponding to a gauge board to locations or numerical values according to the measurement value of the measurement data packet, and display the locations or the numerical values. (Masaru, description, discloses that the screen of Fig. 8 shows how the calculated values Q1^m and Q2^m of the heat amount change with time. In this screen, both values show the same change, but if the time zone in which the difference between the two values is large is long, it is difficult to detect a minute change. Therefore, in order to bring both values as close as possible, adjusted calculated values Q1^c and Q2^c may be used instead of the calculated values Q1^m and Q2^m; in that case, the adjusted calculated values Q1^c and Q2^c can be expressed using the calculated values Q1^m and Q2^m; changes to the measurement values are updated on the display screen (virtual gauge) as they occur in real time.) Additionally, the rationale and motivation to combine the references Kazunari and Masaru as applied in the rejection of claim 1 apply to this claim.
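As an illustrative aside, the pointer-update behavior mapped above (moving a gauge pointer to a location corresponding to the latest measurement value) can be sketched as a linear mapping of the value onto the gauge's sweep angle; the function name, the clamping behavior, and the 270-degree sweep are assumptions for illustration, not details taken from the references:

```python
def pointer_angle(value, vmin, vmax, sweep_deg=270.0):
    """Map a measurement value onto a pointer angle within the gauge's
    sweep, clamping out-of-range values to the gauge ends."""
    clamped = max(vmin, min(vmax, value))
    return (clamped - vmin) / (vmax - vmin) * sweep_deg
```

Each incoming measurement data packet would trigger a recomputation of the angle and a redraw of the pointer, which is the real-time update behavior the claim recites.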
Regarding Claim 7,
The combination of Kazunari and Masaru discloses wherein in the measurement data packet, location information of the plant facility or the photographing part and identification information of the measurement device are inserted in a header area, and information on time when the plant facility is photographed and information on the measurement value are inserted in a body area. (Kazunari, effect of the invention, discloses that, according to the present invention, a monitoring item is selected and input from the selection input board, and the selection processing means checks whether the input monitoring item exists in the data registration file. It is determined whether the data of the previous monitoring item is different and, if it is different, the selected address of the registration file corresponding to the selected and input monitoring item is read by the selected address reading means. The system imports the plant data of the read selected address, creates measurement data using the measurement data conversion means, and displays the created measurement data on the display through the display output means. In addition to being able to arbitrarily and easily display measurement data (plant data) corresponding to a number of monitoring items on the display unit, internal processing can search the data files in which monitoring items are registered, so monitoring items can be changed or added just by changing the file, solving the economic and installation-space inconveniences that conventionally required adding display units to add items; the registration data file transfers the coded (measurement data packet) file including the monitoring item (measurement device identification and its information) according to the area.)
(Masaru, description, discloses a plant monitoring apparatus that displays, on a display device, information indicating an operating state of the plant based on measured values of a plurality of process data detected by a plurality of sensors installed in various places of the equipment constituting the plant, and that includes display processing means for displaying a plurality of measured values obtained from the plurality of sensors on the display device, the display processing means having means to display together, on the display device, the calculated value calculated from each measured value of a first sensor group using a first mathematical model and the calculated value calculated from each measured value of a second sensor group, other than the first sensor group, using a second mathematical model; measurement values from the first, second, or any other sensor are identified and determined to be from a specific sensor (measurement device identification) and displayed on the gauge accordingly.) Additionally, the rationale and motivation to combine the references Kazunari and Masaru as applied in the rejection of claim 1 apply to this claim.
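As an illustrative aside, the claimed packet layout (identification information in a header area; capture time and measurement value in a body area) can be sketched as follows; the field names, identifier strings, and JSON encoding are hypothetical choices for illustration only and appear in neither the claims nor the references:

```python
import json

def build_packet(location_id, device_id, captured_at, value):
    """Assemble a measurement data packet: location and measurement-device
    identification go in the header area; the photographing time and the
    extracted measurement value go in the body area."""
    return json.dumps({
        "header": {"location": location_id, "device": device_id},
        "body": {"captured_at": captured_at, "value": value},
    })
```

A receiver in the non-secure area would parse the header first to identify the measurement device, then read the timestamped value from the body, matching the header/body division the claim recites.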
Claims 8-9 and 11-14 recite a method with steps corresponding to the system elements recited in Claims 1-2 and 4-7, respectively. Therefore, the recited elements of method Claims 8-9 and 11-14 are mapped to the proposed combination in the same manner as the corresponding elements of Claims 1-2 and 4-7, respectively. Additionally, the rationale and motivation to combine the Kazunari and Masaru references presented in the rejection of Claim 1 apply to these claims.
Claims 3 and 10 are rejected under 35 U.S.C. 103 as being unpatentable over Kazunari as modified by Masaru and further in view of Yu et al. (US Pub No. 20170193310). The teachings of Kazunari and Masaru have been discussed previously.
Regarding Claim 3,
The combination of Kazunari and Masaru does not explicitly disclose wherein the image analysis part is configured to: convert the image data to a hue saturation value (HSV) image, set a region of interest (ROI) on the basis of at least one structural feature selected from a group of a color, a pattern, a size, and a shape, and detect an object corresponding to the measurement device within the region of interest.
Yu discloses wherein the image analysis part is configured to: convert the image data to a hue saturation value (HSV) image, set a region of interest (ROI) on the basis of at least one structural feature selected from a group of a color, a pattern, a size, and a shape, and detect an object corresponding to the measurement device within the region of interest. (Yu, [0037]-[0038], discloses that the image information may also comprise color information. In an embodiment, in the RGB color space, the RGB information of the image frame may be acquired as the color information. In another embodiment, in the HSV color space, the HSV information of the image frame may be acquired as the color information. Those skilled in the art should understand that the above is only an example; depending on the color space adopted, corresponding color information may be acquired in the method according to the embodiment of the present disclosure. In addition, when infrared imaging is used, the image information may also comprise infrared information. At step S120, the object in the image frame may be detected based on the depth information and one of the color information and the infrared information to obtain the object detection result for the image frame, in various ways such as edge detection, a classifier, or the like; the image is converted to an HSV image and color is selected as the structural feature to determine the region of interest from the image.)
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Kazunari as modified by Masaru with Yu, which processes specific feature values including shape or color, in order to improve the detection and processing of objects of interest in images, in applications including manufacturing plants where images of various sensors are obtained for measurement value evaluation.
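As an illustrative aside, the HSV-conversion and color-based region-of-interest step described in Yu's mapping can be sketched with the Python standard library alone; a real system would use an image-processing library, and the pixel-grid representation, function names, and thresholds below are assumptions for illustration only:

```python
import colorsys

def to_hsv(image_rgb):
    """Convert a grid of (R, G, B) pixels (0-255 each) to
    (h, s, v) tuples with each component in the range 0-1."""
    return [[colorsys.rgb_to_hsv(r / 255, g / 255, b / 255) for (r, g, b) in row]
            for row in image_rgb]

def roi_by_hue(image_hsv, hue_lo, hue_hi, sat_min=0.3):
    """Return (row, col) coordinates whose hue falls in the target band
    and whose saturation is high enough -- a crude color-based ROI."""
    return [(y, x) for y, row in enumerate(image_hsv)
            for x, (h, s, v) in enumerate(row)
            if hue_lo <= h <= hue_hi and s >= sat_min]
```

Object detection (by edge detection, a classifier, or the like, per Yu) would then run only within the selected coordinates rather than over the whole frame.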
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure:
KR 102139582 B1 (The communication network 110 may include an access point. The access point here includes small base stations, such as femto or pico base stations, which are often installed in buildings. Femto and pico base stations are classified among small base stations according to, for example, the maximum number of connections to the photographing apparatus 100. The access point may include a short-range communication module for performing short-range communication, such as Zigbee and Wi-Fi, with the photographing apparatus 100, and can use TCP/IP or RTSP (Real-Time Streaming Protocol) for wireless communication. The short-range communication may follow various specifications such as Bluetooth, ZigBee, infrared, UHF (Ultra High Frequency), VHF (Very High Frequency), RF (Radio Frequency), and UWB (Ultra Wideband) communication. Accordingly, the access point extracts the location of the data packet, designates the best communication path for the extracted location, and transmits the data packet along the designated communication path to the next device, such as the video analysis device 120 or the control device 130. An access point can share multiple lines in a typical network environment, including routers and repeaters, for example)
US-20180365835-A1(A system and method for actively selecting and labeling images for semantic segmentation are disclosed. A particular embodiment includes: receiving image data from an image generating device; performing semantic segmentation or other object detection on the received image data to identify and label objects in the image data and produce semantic label image data; determining the quality of the semantic label image data based on prediction probabilities associated with regions or portions of the image; and identifying a region or portion of the image for manual labeling if an associated prediction probability is below a pre-determined threshold)
JP 2010075343 A (Description: A VRAM 433 is connected to the VDP 431, and a decorative symbol display device 110 is connected via a scaler and a transmitter (not shown). The scaler enlarges the image generated by the VDP 431 according to the number of pixels of the decorative symbol display device 110, and the transmitter converts the digital image data into analog R (red), G (green), and B (blue) signals and outputs them to the decorative symbol display device 110. Note that a luminance adjustment signal, enabling the CPU 404 to adjust the luminance of the display screen of the decorative symbol display device 110, is input to the decorative symbol display device 110)
Any inquiry concerning this communication or earlier communications from the examiner should be directed to PINALBEN V PATEL whose telephone number is (571)270-5872. The examiner can normally be reached M-F: 10am - 8pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Chineyere Wills-Burns can be reached at 571-272-9752. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/Pinalben Patel/Examiner, Art Unit 2673