DETAILED ACTION
Notice of Pre-AIA or AIA Status
1. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Objections
2. Claim 1 is objected to because of the following informalities: claim 1 recites “… automatically launching the AR viewer with an AR view of the RTL link over laying a view of the specific asset.” The phrase “the AR viewer” should read “an AR viewer,” and “over laying” is a typographical error for “overlaying”; the limitation should read “… automatically launching an AR viewer with an AR view of the RTL link overlaying a view of the specific asset.” Appropriate correction is required.
3. Claim 9 is objected to because of the following informalities: claim 9 recites “… automatically launch an AR viewer with an AR view of the RTL link over laying a view of the specific asset.” The term “over laying” is a typographical error for “overlaying”; the limitation should read “… automatically launch an AR viewer with an AR view of the RTL link overlaying a view of the specific asset.” Appropriate correction is required.
4. Claim 16 is objected to because of the following informalities: claim 16 does not end with a period. Appropriate correction is required.
Claim Rejections - 35 USC § 103
5. In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
6. The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
7. The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
8. Claims 1-16 are rejected under 35 U.S.C. 103 as being unpatentable over Paul (US 2020/0110934 A1) in view of Todeschini et al. (US 2016/0171777 A1).
9. With reference to claim 1, Paul teaches A computerized method for augmented-reality tracking of asset trackers comprising: integrating an asset tracker with a specific asset; (“the present disclosure relates to an AR system integrated with an indoor positioning system (IPS) that supports and provides mechanisms to dynamically track assets within an environment covered by the IPS and to further visualize the assets in an AR device registered or otherwise associated with the AR system. In some aspects, the methods and systems of the present disclosure provide a visualization of the current, actual relative location of an asset tracked within an IPS in a current FOV of an AR device.” [0018] “system 100 includes a plurality of location beacons 110, 115, 120, and 125. Each location beacon is at a known, fixed location. The location of each location beacon is known with respect to a predetermined coordinate system. The fixed locations may correspond to different infrastructure positions of an environment such as, for example, a wall, a pillar, a support, a door, and other fixtures. System 100 further includes a tag 105. Tag 105 is associated with an asset to be tracked within the environment of system 100. In some aspects, tag 105 is located adjacent to (e.g., physically proximal to, in or on) asset 130. In this manner, knowing a location and orientation of tag 105 equates to knowing the location and orientation of asset 130 itself.” [0020]) Paul also teaches configuring the asset tracker to communicate a location of the asset tracker to an Augmented Reality (AR) tracking system; (“a tag 105 associated with an asset 130 herein may operate to communicate with the location beacons 110, 115, 120, and 125 in its environment (e.g., hospital, warehouse, factory, school, etc.), calculate its own position based on its relative distances from the plurality of location beacons, and further communicate its position to a server (e.g., 145) which can relay such information to the AR device 150 when the server is queried by the AR device.” [0029]) Paul further teaches the search area comprises the specific asset; (“server 145 may maintain and manage a searchable database including, for example, of all the tags of system 100, as well as their associated assets and the locations of those assets where the location for each location may be continually updated.” [0028]) Paul teaches the AR tracking system comprises an RTL (Real-Time Location) system, (“Input 335 includes information indicative of a location of an AR device in a system herein. The location of the AR device may be referenced to the image marker of input 330, as discussed hereinabove. In some embodiments, input 335 may be continually (or at least periodically) updated so as to provide a current or real-time location for an AR device herein.” [0039]) Paul also teaches with the AR tracking system software: detecting an RTL link of the RTL system is within a range of the specific asset, and automatically launching the AR viewer with an AR view of the RTL link over laying a view of the specific asset. (“since the image marker coordinate system 140 is aligned with the IPS coordinate system by virtue of the image marker being positioned at the origin of the IPS' known coordinate system, the AR system's coordinate system is also aligned with the IPS' coordinate system. That is, the AR system's and the IPS' coordinate systems may be aligned with each other when configured as disclosed herein. 
With both the AR system and the IPS being referenced to the same coordinate system, the locations of the tag(s) 105 calculated by the UWB IPS of system 100 may be aligned and coordinated with the AR system of system 100 that includes AR device 150. Accordingly, the locations of tag(s) 105 and the asset(s) 130 associated with and affixed adjacent to the tag(s) herein may be accurately represented and included in a field of view of AR device 150. In this manner, a consolidated FOV might be displayed by AR device 150 that includes the AR device's actual, current FOV and the assets (e.g., 130) tracked in the environment of system 100 as they come into view within the current FOV.” [0027] “after an asset's location is determined, the AR device's display presents visual cues overlaid on the display of an actual object(s) within the FOV of the user. In the instance the object is outside the FOV of the AR device/user, then the system may provide visual cues and/or directions from the AR device/user' current location to the location of the asset on the AR device (e.g., a visor, headset, helmet, glasses, etc.) guide the user to the asset.” [0030] “Input 325 includes (UWB) information indicative of a distance of a tag (e.g., 105, 205) relative to the one or more location beacons referenced in input 320. The relative distance of each tag from multiple different location beacons can be used to accurately determine the location of the tag, and by association, an asset located coincident with a particular tag. In some embodiments, input 325 may be continually (or at least periodically) updated so as to provide an accurate basis for the calculation of a current or real-time location for an asset associated with a tag herein.” [0037])
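As a purely illustrative aside (not part of Paul's disclosure or the claim language), the tag-side position calculation Paul describes in [0029], in which a tag computes its own position from its distances to several fixed beacons, is commonly implemented as a least-squares multilateration. The sketch below assumes 2D coordinates and at least three beacons; all names and values are hypothetical:

```python
# Hypothetical sketch of the multilateration implied by Paul [0029]: a tag
# estimates its own 2D position from measured distances to fixed UWB
# beacons at known coordinates. Names and values are illustrative only.
import numpy as np

def trilaterate(beacons: np.ndarray, distances: np.ndarray) -> np.ndarray:
    """Least-squares position estimate from three or more beacon ranges.

    beacons   -- (N, 2) array of known beacon coordinates
    distances -- (N,) array of measured tag-to-beacon distances
    """
    # Subtracting the first range equation from the rest cancels the
    # quadratic term, leaving a linear system A @ p = b.
    x0, d0 = beacons[0], distances[0]
    A = 2.0 * (beacons[1:] - x0)
    b = (d0**2 - distances[1:]**2
         + np.sum(beacons[1:]**2, axis=1) - np.sum(x0**2))
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos

beacons = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
ranges = np.array([5.0, 8.06, 6.71, 9.22])  # consistent with a tag near (3, 4)
print(trilaterate(beacons, ranges))          # approx. [3. 4.]
```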
Paul does not explicitly teach overlaying, with an AR visualization system, digital location data with a real-time camera view of a search area. This is what Todeschini teaches (“The target device 130 being searched for may transmit a data packet via BLE on a regular interval. The AR system 100 may use BLE to listen for this data packet. Once a BLE signal is read, the AR system will examine the device hardware identifier to determine if it is the device being searched for. If so, the current location of the AR system may be used to calculate the distance to the target device. The sensors 135 may also be used to determine the system's current orientation. This will inform the person which direction it is facing and how it is rotated. When the device orientation is in line with the target device, a bouncing arrow may be rendered above the target device so that the user can visible see where it is located. The size of the arrow may change depending on the distance between the AR system and the target device.” [0020] “FIG. 4 is a flowchart illustrating a method 400 of using an augmented reality system to provide a view of an area with an indication of the location of a missing target device. Method 400 includes receiving a signal from a device to be located, a server, and/or a surveillance system at 410. The signal identifies the device with some form of ID in order to determine if the device is the device being searched for. At 420, location information is determined from the received signal. … a graphic representative of the location of the device is created. The graphic may be an arrow or other symbol that points to or otherwise identifies the location of the target device, or a circle having a radius corresponding to the identified distance from the target device. At 440, an augmented reality system view of an area where the device is located is created, with the graphic overlaid on the view to provide user visible location information corresponding to the device. … The computing device may also include other mobile or wearable devices such as smart glasses (e.g., Google Glass®), other near-to-eye devices, and head-mounted displays. Thus, the computing device may be any device with a see through or a camera assisted device that can overlay or is capable of overlaying the AR features on the real images displayed on the output 504 (e.g., display).” [0022-0024]) Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Todeschini into Paul, in order to provide user visible location information corresponding to the device.
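For illustration only, the detect-and-overlay behavior Todeschini describes in [0020] (listen for a BLE advertisement, match the hardware identifier, then render a distance-scaled arrow when the viewer faces the target) might be sketched as follows; the packet fields, render call, and thresholds are hypothetical, and the BLE and rendering back ends are stubbed out:

```python
# Hypothetical sketch of Todeschini's overlay behavior [0020]: listen for
# a BLE advertisement, match the hardware identifier, and draw an arrow
# sized by distance when the viewer faces the target. All names, fields,
# and thresholds are illustrative; BLE I/O and rendering are stubbed out.
import math

TARGET_ID = "AA:BB:CC:DD:EE:FF"  # hypothetical identifier of the lost device

def draw_arrow(bearing_rad: float, size_px: int) -> None:
    # Stand-in for a real AR render call.
    print(f"arrow at bearing {bearing_rad:+.2f} rad, size {size_px}px")

def on_ble_advertisement(packet: dict, ar_pose: dict) -> None:
    if packet["id"] != TARGET_ID:        # examine the hardware identifier
        return
    dx = packet["x"] - ar_pose["x"]      # target vs. AR-system location
    dy = packet["y"] - ar_pose["y"]
    distance = math.hypot(dx, dy)
    bearing = math.atan2(dy, dx) - ar_pose["heading"]
    if abs(bearing) < math.radians(30):  # viewer roughly facing the target
        # Arrow shrinks as the viewer closes in on the device.
        draw_arrow(bearing, max(8, int(200 / (1 + distance))))

on_ble_advertisement({"id": TARGET_ID, "x": 5.0, "y": 2.0},
                     {"x": 0.0, "y": 0.0, "heading": 0.3})
```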
10. With reference to claim 2, Paul teaches the AR tracking system is integrated into a smartphone, a smart glasses system, or a tablet computer. (“the present disclosure relates to an AR system integrated with an indoor positioning system (IPS) that supports and provides mechanisms to dynamically track assets within an environment covered by the IPS and to further visualize the assets in an AR device registered or otherwise associated with the AR system. In some aspects, the methods and systems of the present disclosure provide a visualization of the current, actual relative location of an asset tracked within an IPS in a current FOV of an AR device.” [0018] “In the instance the object is outside the FOV of the AR device/user, then the system may provide visual cues and/or directions from the AR device/user' current location to the location of the asset on the AR device (e.g., a visor, headset, helmet, glasses, etc.) guide the user to the asset.” [0030] “FIG. 6 includes an illustrative depiction of an AR device embodying a portable/mobile tablet 600 that might be carried or otherwise transported by a person/user. Tablet 600 includes display on which visualizations may be presented, including a FOV 605 of the user at a current location of the AR device 600 as acquired by one or more cameras on tablet 600 (not shown in FIG. 6). FIG. 7 includes an illustrative depiction of an AR device including a mobile phone 700 that may be carried by a person or user of the mobile phone. Phone 700 includes a display screen 705 in which visualizations may be presented. The screen depicted in FIG. 7 includes a FOV 710 generated based on images and/or video acquired by one or more cameras (not shown in FIG. 7) of the mobile phone at a current location of the phone.” [0050-0051])
11. With reference to claim 3, Paul teaches the RTL system is integrated with a system. (“the present disclosure relates to an AR system integrated with an indoor positioning system (IPS) that supports and provides mechanisms to dynamically track assets within an environment covered by the IPS and to further visualize the assets in an AR device registered or otherwise associated with the AR system. In some aspects, the methods and systems of the present disclosure provide a visualization of the current, actual relative location of an asset tracked within an IPS in a current FOV of an AR device.” [0018] “Input 325 includes (UWB) information indicative of a distance of a tag (e.g., 105, 205) relative to the one or more location beacons referenced in input 320. The relative distance of each tag from multiple different location beacons can be used to accurately determine the location of the tag, and by association, an asset located coincident with a particular tag. In some embodiments, input 325 may be continually (or at least periodically) updated so as to provide an accurate basis for the calculation of a current or real-time location for an asset associated with a tag herein.” [0037] “Input 335 includes information indicative of a location of an AR device in a system herein. The location of the AR device may be referenced to the image marker of input 330, as discussed hereinabove. In some embodiments, input 335 may be continually (or at least periodically) updated so as to provide a current or real-time location for an AR device herein.” [0039])
Paul does not explicitly teach BLUETOOTH. This is what Todeschini teaches (“The target device 130 to be located transmits a signal that uniquely identifies itself. In one embodiment, this signal comprises a low energy RF signal, and the position is an indoor positioning signal. One example of a low energy RF signal having low cost and low power usage is a Bluetooth Low Energy (BLE) beacon in one embodiment that identifies the device serial number, MAC address, or other form of identifier, such as global unique identifier (GUID). In another embodiment, the target device transmits the uniquely identifying signal using Wi-Fi technology. As the person scans their environment for their missing device, graphics may be overlaid on the physical world, indicating where their device is located.” [0017]) Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Todeschini into Paul, in order to provide user visible location information corresponding to the device.
12. With reference to claim 4, Paul teaches the RTL system is integrated with a system. (“the present disclosure relates to an AR system integrated with an indoor positioning system (IPS) that supports and provides mechanisms to dynamically track assets within an environment covered by the IPS and to further visualize the assets in an AR device registered or otherwise associated with the AR system. In some aspects, the methods and systems of the present disclosure provide a visualization of the current, actual relative location of an asset tracked within an IPS in a current FOV of an AR device.” [0018] “Input 325 includes (UWB) information indicative of a distance of a tag (e.g., 105, 205) relative to the one or more location beacons referenced in input 320. The relative distance of each tag from multiple different location beacons can be used to accurately determine the location of the tag, and by association, an asset located coincident with a particular tag. In some embodiments, input 325 may be continually (or at least periodically) updated so as to provide an accurate basis for the calculation of a current or real-time location for an asset associated with a tag herein.” [0037] “Input 335 includes information indicative of a location of an AR device in a system herein. The location of the AR device may be referenced to the image marker of input 330, as discussed hereinabove. In some embodiments, input 335 may be continually (or at least periodically) updated so as to provide a current or real-time location for an AR device herein.” [0039])
Paul does not explicitly teach Wi-Fi. This is what Todeschini teaches (“The target device 130 to be located transmits a signal that uniquely identifies itself. In one embodiment, this signal comprises a low energy RF signal, and the position is an indoor positioning signal. One example of a low energy RF signal having low cost and low power usage is a Bluetooth Low Energy (BLE) beacon in one embodiment that identifies the device serial number, MAC address, or other form of identifier, such as global unique identifier (GUID). In another embodiment, the target device transmits the uniquely identifying signal using Wi-Fi technology. As the person scans their environment for their missing device, graphics may be overlaid on the physical world, indicating where their device is located.” [0017]) Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Todeschini into Paul, in order to provide user visible location information corresponding to the device.
13. With reference to claim 5, Paul teaches the RTL system is integrated with an Ultrawide-band system. (“the present disclosure relates to an AR system integrated with an indoor positioning system (IPS) that supports and provides mechanisms to dynamically track assets within an environment covered by the IPS and to further visualize the assets in an AR device registered or otherwise associated with the AR system. In some aspects, the methods and systems of the present disclosure provide a visualization of the current, actual relative location of an asset tracked within an IPS in a current FOV of an AR device.” [0018] “Communication between the plurality of location beacons 110, 115, 120, and 125 and tag 105 may be via a wireless communication protocol. In some embodiments, the communication technology protocol used by location beacons 110, 115, 120, and 125 and tag 105 may be an ultra-wide band (UWB) technology.” [0021] “Input 325 includes (UWB) information indicative of a distance of a tag (e.g., 105, 205) relative to the one or more location beacons referenced in input 320. The relative distance of each tag from multiple different location beacons can be used to accurately determine the location of the tag, and by association, an asset located coincident with a particular tag. In some embodiments, input 325 may be continually (or at least periodically) updated so as to provide an accurate basis for the calculation of a current or real-time location for an asset associated with a tag herein.” [0037] “Input 335 includes information indicative of a location of an AR device in a system herein. The location of the AR device may be referenced to the image marker of input 330, as discussed hereinabove. In some embodiments, input 335 may be continually (or at least periodically) updated so as to provide a current or real-time location for an AR device herein.” [0039])
14. With reference to claim 6, Paul teaches the RTL system is combined with positioning systems. (“the present disclosure relates to an AR system integrated with an indoor positioning system (IPS) that supports and provides mechanisms to dynamically track assets within an environment covered by the IPS and to further visualize the assets in an AR device registered or otherwise associated with the AR system. In some aspects, the methods and systems of the present disclosure provide a visualization of the current, actual relative location of an asset tracked within an IPS in a current FOV of an AR device.” [0018] “system 300 includes a number of inputs 305 from one or more location beacons, tags, image marker(s), and AR devices that may be processed by a processor-based inference engine 310 to produce a number of outputs 315 that may be displayed on a AR device 350. In one embodiment, system 300 produces or otherwise generates a consolidated FOV including a combination of visualizations of a location of an asset being tracked and contextual directions to the asset. … Input 320 is received from one or more (UWB) location beacons, and includes information indicative of a location of the beacon's location. In some aspects, this location information can include 3D coordinate information for each beacon, where the beacons are each located at a fixed position. Input 325 includes (UWB) information indicative of a distance of a tag (e.g., 105, 205) relative to the one or more location beacons referenced in input 320. The relative distance of each tag from multiple different location beacons can be used to accurately determine the location of the tag, and by association, an asset located coincident with a particular tag. In some embodiments, input 325 may be continually (or at least periodically) updated so as to provide an accurate basis for the calculation of a current or real-time location for an asset associated with a tag herein. Input 330 includes information indicative of a location of an image marker. …Input 335 includes information indicative of a location of an AR device in a system herein. The location of the AR device may be referenced to the image marker of input 330, as discussed hereinabove. In some embodiments, input 335 may be continually (or at least periodically) updated so as to provide a current or real-time location for an AR device herein.” [0035-0039])
Paul does not explicitly teach another local positioning system. This is what Todeschini teaches (“The target device 130 to be located transmits a signal that uniquely identifies itself. In one embodiment, this signal comprises a low energy RF signal, and the position is an indoor positioning signal. One example of a low energy RF signal having low cost and low power usage is a Bluetooth Low Energy (BLE) beacon in one embodiment that identifies the device serial number, MAC address, or other form of identifier, such as global unique identifier (GUID). In another embodiment, the target device transmits the uniquely identifying signal using Wi-Fi technology. As the person scans their environment for their missing device, graphics may be overlaid on the physical world, indicating where their device is located. … The AR system 100 may also include location sensing ability, such as a GPS module to identify its location. The sensors 135 may be used to update the location of the system 100 and adjust the display accordingly. In a further embodiment, the missing device/equipment may have a way to locate itself in its environment via BLE indoor positioning, GPS or other location technology.” [0017-0018]) Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Todeschini into Paul, in order to provide user visible location information corresponding to the device.
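As a hypothetical sketch of how an RTL fix might be combined with other local positioning sources, as Todeschini suggests in [0017]-[0018], a system could simply prefer whichever available technology has the smallest nominal error; the accuracy figures and names below are illustrative assumptions, not taken from either reference:

```python
# Illustrative (not from either reference): choose among UWB, BLE, Wi-Fi,
# and GPS fixes by preferring the technology with the smallest nominal
# error, mirroring Todeschini's list of alternatives in [0017]-[0018].
from typing import Optional, Tuple

# Hypothetical nominal accuracies in meters for each technology.
ACCURACY = {"uwb": 0.3, "ble": 3.0, "wifi": 10.0, "gps": 15.0}

def best_fix(fixes: dict) -> Optional[Tuple[str, tuple]]:
    """fixes maps technology name -> (x, y) position, or None if unavailable."""
    available = [(ACCURACY[tech], tech, pos)
                 for tech, pos in fixes.items() if pos is not None]
    if not available:
        return None
    _, tech, pos = min(available)   # smallest nominal error wins
    return tech, pos

# Indoors the UWB RTL fix may be missing; BLE then outranks GPS.
print(best_fix({"uwb": None, "ble": (4.2, 7.9), "gps": (4.0, 8.1)}))
```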
15. With reference to claim 7, Paul teaches a positioning system (“the present disclosure relates to an AR system integrated with an indoor positioning system (IPS) that supports and provides mechanisms to dynamically track assets within an environment covered by the IPS and to further visualize the assets in an AR device registered or otherwise associated with the AR system. In some aspects, the methods and systems of the present disclosure provide a visualization of the current, actual relative location of an asset tracked within an IPS in a current FOV of an AR device.” [0018])
Paul does not explicitly teach that the other local positioning system comprises a Global Positioning System (GPS), a mobile phone tracking system, or a Wi-Fi tracking system. This is what Todeschini teaches (“The target device 130 to be located transmits a signal that uniquely identifies itself. In one embodiment, this signal comprises a low energy RF signal, and the position is an indoor positioning signal. One example of a low energy RF signal having low cost and low power usage is a Bluetooth Low Energy (BLE) beacon in one embodiment that identifies the device serial number, MAC address, or other form of identifier, such as global unique identifier (GUID). In another embodiment, the target device transmits the uniquely identifying signal using Wi-Fi technology. As the person scans their environment for their missing device, graphics may be overlaid on the physical world, indicating where their device is located. … The AR system 100 may also include location sensing ability, such as a GPS module to identify its location. The sensors 135 may be used to update the location of the system 100 and adjust the display accordingly. In a further embodiment, the missing device/equipment may have a way to locate itself in its environment via BLE indoor positioning, GPS or other location technology.” [0017-0018]) Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Todeschini into Paul, in order to provide user visible location information corresponding to the device.
16. With reference to claim 8, Paul teaches using a series of location information to include a speed, a direction, and a spatial orientation of the tracked asset into a display in the AR tracking system of the specific asset. (“Tag 105 is associated with an asset to be tracked within the environment of system 100. In some aspects, tag 105 is located adjacent to (e.g., physically proximal to, in or on) asset 130. In this manner, knowing a location and orientation of tag 105 equates to knowing the location and orientation of asset 130 itself.” [0020] “an orientation of the image marker, a location of the image marker, a reference to other information, and other data.” [0025] “system 300 includes a number of inputs 305 from one or more location beacons, tags, image marker(s), and AR devices that may be processed by a processor-based inference engine 310 to produce a number of outputs 315 that may be displayed on a AR device 350. In one embodiment, system 300 produces or otherwise generates a consolidated FOV including a combination of visualizations of a location of an asset being tracked and contextual directions to the asset. … Input 320 is received from one or more (UWB) location beacons, and includes information indicative of a location of the beacon's location. In some aspects, this location information can include 3D coordinate information for each beacon, where the beacons are each located at a fixed position. Input 325 includes (UWB) information indicative of a distance of a tag (e.g., 105, 205) relative to the one or more location beacons referenced in input 320. The relative distance of each tag from multiple different location beacons can be used to accurately determine the location of the tag, and by association, an asset located coincident with a particular tag. In some embodiments, input 325 may be continually (or at least periodically) updated so as to provide an accurate basis for the calculation of a current or real-time location for an asset associated with a tag herein. Input 330 includes information indicative of a location of an image marker. …Input 335 includes information indicative of a location of an AR device in a system herein. The location of the AR device may be referenced to the image marker of input 330, as discussed hereinabove. In some embodiments, input 335 may be continually (or at least periodically) updated so as to provide a current or real-time location for an AR device herein.” [0035-0039])
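For illustration only, the claim 8 notion of deriving a speed and a direction from a series of location reports can be sketched by differencing consecutive timestamped fixes; neither reference supplies this code, and the names below are hypothetical:

```python
# Hypothetical derivation of the claim 8 quantities from a series of
# timestamped tag locations; neither reference supplies this code.
import math

def motion_from_track(track: list) -> tuple:
    """track: list of (t_seconds, x, y) samples, oldest first."""
    (t0, x0, y0), (t1, x1, y1) = track[-2], track[-1]
    dx, dy, dt = x1 - x0, y1 - y0, t1 - t0
    speed = math.hypot(dx, dy) / dt                    # meters per second
    heading = math.degrees(math.atan2(dx, dy)) % 360   # 0 deg along +y
    return speed, heading

# 5 m traveled in 2 s gives 2.5 m/s at roughly 37 degrees from the +y axis.
print(motion_from_track([(0.0, 0.0, 0.0), (2.0, 3.0, 4.0)]))
```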
17. Claim 9 is similar in scope to claim 1, and thus is rejected under similar rationale. Paul additionally teaches A system for augmented-reality tracking of asset trackers comprising: a plurality of asset trackers, wherein each asset tracker tracks one or more IoT assets and obtains a set of IoT data; one or more communications hubs; a base station; one or more communication networks; wherein each asset tracker is configured to be in communication with a base station and one or more of the communications hubs; wherein in the one or more communications hubs are configured to be in communication with one or more of the mobile units, and the one or more network; wherein the base station is configured to be in communication with the plurality of asset trackers; a server computing device configured to be in communication with the one or more networks, wherein the server computing device is further configured to implement the following logic: (“a system to track the AR device within the IPS mapped environment, a data management system (e.g., a server, a database, and combinations thereof) to store asset-related location information, and a communication network to facilitate and support communication between the IPS, the AR device, and the data management system.” [0019] “FIG. 3 is an illustrative schematic diagram 300 of an example AR system integrated with an IPS tracking assets, in accordance with some embodiments herein. In some aspects, system 300 includes a number of inputs 305 from one or more location beacons, tags, image marker(s), and AR devices that may be processed by a processor-based inference engine 310 to produce a number of outputs 315 that may be displayed on a AR device 350. In one embodiment, system 300 produces or otherwise generates a consolidated FOV including a combination of visualizations of a location of an asset being tracked and contextual directions to the asset.” [0035] “the inputs 320, 325, 330, and 335 may be transmitted to inference engine 310. In some embodiments, inference engine 310 might include artificial intelligence (AI) aspects, such as machine learning, artificial neural networks, statistical learning, and other techniques. In some aspects, inference engine 310 may use one or more predetermined rules, lookup tables, and other stored data references to process inputs 305 to produce outputs 315. In some instances, inference engine 310 might use a combination of AI aspects and one or more other techniques in processing inputs 305. Operationally, inference engine 310 processes inputs 305 to determine and generate outputs 315.” [0040] “inference engine 310 can be implemented by a remotely located central or distributed server system or integrated with the AR device 350. In some embodiments including a remotely located inference engine, the inputs from a location beacon and/or tag may be communicated wirelessly (e.g., UWB) to a central (or distributed) computing and database server.” [0043] “System 1300 includes processor(s) 1310 operatively coupled to communication device 1320, data storage device 1330, one or more input devices 1340, one or more output devices 1350, and memory 1360. Communication device 1320 may facilitate communication with external devices, such as a data server and other data sources. Input device(s) 1340 may comprise, for example, a keyboard, a keypad, a mouse or other pointing device, a microphone, knob or a switch, an infrared (IR) port, a docking station, and/or a touch screen.” [0058])
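As a non-limiting sketch of the report-and-query data flow Paul describes in [0028]-[0029] (tags report their computed positions to a server, which relays a location when queried by an AR device), consider the following hypothetical illustration; all names are assumptions, not drawn from either reference:

```python
# Hypothetical sketch of the report/query flow in Paul [0028]-[0029]:
# tags report self-computed positions to a server, and an AR device
# queries the server for a specific asset. Names are illustrative only.
class AssetServer:
    def __init__(self) -> None:
        self._locations: dict = {}      # tag_id -> latest (x, y) fix

    def report(self, tag_id: str, position: tuple) -> None:
        """Called when a tag communicates its computed position."""
        self._locations[tag_id] = position

    def query(self, tag_id: str):
        """Called by an AR device searching for a tracked asset."""
        return self._locations.get(tag_id)

server = AssetServer()
server.report("tag-105", (3.0, 4.0))    # tag self-locates and reports
print(server.query("tag-105"))          # AR device retrieves (3.0, 4.0)
```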
18. Claims 10-16 are similar in scope to claims 2-8, and they are rejected under similar rationale.
Conclusion
19. Any inquiry concerning this communication or earlier communications from the examiner should be directed to Michelle Chin whose telephone number is (571)270-3697. The examiner can normally be reached on Monday-Friday 8:00 AM-4:30 PM.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Kent Chang can be reached on (571)272-7667. The fax phone number for the organization where this application or proceeding is assigned is (571)273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/MICHELLE CHIN/
Primary Examiner, Art Unit 2614