Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Amendment
The Amendment filed December 25, 2025 has been entered. Claims 1-13 and 15-20 remain pending in the application. Applicant's amendments to the Claims, Specification, and Drawings have overcome each and every objection previously set forth in the Non-Final Office Action mailed October 28, 2025.
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.
Claims 1, 4, 6-8, 11-13, 15-18, and 20 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Jasiobedzki et al. (United States Patent Application Publication 20140062772 A1), hereinafter Jasiobedzki.
Regarding claim 1, Jasiobedzki teaches a system, comprising: a scanning or measuring device for obtaining information about a cavity surrounding a user ([0043] Referring to FIG. 1 and the block diagram of FIG. 2, the hand-held unit 14 comprises a pose camera (PC) 16, an imaging camera (IC) 20, rangefinder (LRF) 22),
the scanning or measuring device comprising a localization and mapping device, adapted to generate a point cloud representing an outline of at least a part of the cavity ([0044] The pose camera 16 may be providing 2D images or 3D (range) images or both. Algorithms for detecting the target 18 will typically rely on their visual appearance or shape.; [0046] The imaging camera 20 may also be a 3D camera that provides range images (or point clouds) representing geometry of the observed scene.),
wherein the scanning or measuring device is adapted to register a first location of the scanning or measuring device relative to the cavity after the system is stationary for at least a predetermined period of time, said registration being associated with a first time stamp ([0040] This data includes the absolute positions of objects of interest in GNSS coordinates, images of the objects, measurements obtained with wearable or hand-held detectors during the investigation and a recorded and/or transcribed record of the investigation made by the investigator.; [0064] t: time stamp of a measurement);
a device for providing location information ([0040] GNSS data is used for location determination);
at least one first processor ([0048] one or more processors 50) adapted to:
obtain an absolute location of the system from the device for providing location information when the system is stationary for at least the predetermined period of time ([0040] This data includes the absolute positions of objects of interest in GNSS coordinates, images of the objects, measurements obtained with wearable or hand-held detectors during the investigation and a recorded and/or transcribed record of the investigation made by the investigator.); and
register the absolute location in association with a second time stamp ([0064] t: time stamp of a measurement [0065] t.sub.-1 and t.sub.+1: time stamps of the PLU data before and after t);
at least one second processor ([0055] The computer control system 26) adapted to:
receive the point cloud collected by the scanning or measuring device and generate a three dimensional model of the cavity based on the information; and ([0055] The computer control system 26 analyses the images from pose camera 16 and computes the relative position and orientation (pose) of the visual target 18 with respect to the hand-held unit 14; [0071] The data may be overlayed on a map of the investigated scene, if such is available, showing the paths and locations. FIG. 6 is an illustration of such a display.); and
match the absolute location from the device for providing the location services with the location of the scanning or measuring device, based at least on the first time stamp and the second time stamp ([0071] Crossed triangles indicate the camera locations and bearing of the captured images.); and
a display device adapted to display to the user a three dimensional representation of the model of the cavity, the representation comprising an indication of the first location with the associated absolute location ([0043] a user interface 28; [0071] Complete data including paths followed by the investigator, locations and images of objects of interest, measurements from additional detectors can be displayed on the User Interface or transferred to an external computer.; [0074] The pose camera 16 may be either a 2D camera that captures 2D images or a 3D camera providing range images or point-clouds. Depending on the type of camera, different algorithms will be used to detect the Target and estimate its position. Visual targets, such as Space Vision Marker System [4] or AR markers [5] can be used.).
Regarding claim 4, Jasiobedzki teaches the system of claim 1, wherein the representation further comprises an indication of at least one second location within the model, and associated absolute locations ([Fig. 7]; [0071] The data may be overlayed on a map of the investigated scene, if such is available, showing the paths and locations. FIG. 6 is an illustration of such a display. Continuous lines overlayed on a floor map of a building show the paths taken by the investigator with the arrows indicating locations where he stopped.).
Regarding claim 6, Jasiobedzki teaches the system of claim 1, wherein the predetermined period of time is between about 0.5 seconds and about 30 seconds ([0064] t: time stamp of a measurement [0065] t.sub.-1 and t.sub.+1: time stamps of the PLU data before and after t [0066] P.sub.IC.sup.PLU0|.sub.t.sub.-1 and P.sub.IC.sup.PLU0|.sub.t.sub.+1: the recorded poses of the Imaging Camera with respect to the initial pose of the PLU corresponding to time t.sub.-1 and t.sub.+1).
Regarding claim 7, Jasiobedzki teaches the system of claim 1, wherein the display device is further adapted to receive from the user an indication to an area of the cavity, and display an image captured by the image capture device depicting the area ([Fig. 7]; [0071] The data may be overlayed on a map of the investigated scene, if such is available, showing the paths and locations. FIG. 6 is an illustration of such a display. Continuous lines overlayed on a floor map of a building show the paths taken by the investigator with the arrows indicating locations where he stopped.).
Regarding claim 8, Jasiobedzki teaches the system of claim 1, wherein the scanning or measuring device and the device providing location information are adapted to be attached to at least one wearable item, thereby enabling the user to advance with free hands within the cavity ([0040] The wearable object locator and imaging system and method disclosed herein provides an efficient means of both collecting investigative data).
Regarding claim 11, Jasiobedzki teaches the system of claim 1, further comprising a data storage device for storing data output by the scanning or measuring device and images captured by the image capture device ([0048] FIG. 7 provides an exemplary, non-limiting implementation of computer control system. Computer control system 26, forming part of the command and control system, may include one or more processors 50 (for example, a CPU/microprocessor), bus 52, memory 56, which may include random access memory (RAM) and/or read only memory (ROM), one or more internal storage devices 60 (e.g. a hard disk drive, compact disk drive or internal flash memory),).
Regarding claim 12, Jasiobedzki teaches the system of claim 11, wherein the data storage device is adapted to be attached to an at least one wearable item ([0047] Optional wearable detectors 34 may be interfaced with computer control system 26 using physical cables or wireless means).
Regarding claim 13, Jasiobedzki teaches the system of claim 1, further comprising a communication module for transmitting information collected by the scanning or measuring device to a remote computing device ([0047] The body mounted PLU 12 is interfaced with the computer control system 26 via cables or wireless means. Optional wearable detectors 34 may be interfaced with computer control system 26 using physical cables or wireless means.).
Regarding claim 15, Jasiobedzki teaches the system of claim 1, wherein the information is three dimensional, and further comprising a processor for determining a projection of the three dimensional information onto a two dimensional map ([0046] The imaging camera 20 may also be a 3D camera that provides range images (or point clouds) representing geometry of the observed scene. [0071] Complete data including paths followed by the investigator, locations and images of objects of interest, measurements from additional detectors can be displayed on the User Interface or transferred to an external computer.).
Regarding claim 16, Jasiobedzki teaches the system of claim 1, further comprising a frame wherein the at least the scanning or measuring device, the image capture device and the device providing location information are attached to the frame ([0043] Referring to FIG. 1 and the block diagram of FIG. 2, the hand-held unit 14 comprises a pose camera (PC) 16, an imaging camera (IC) 20, rangefinder (LRF) 22 (preferably a laser rangefinder) and a computer control system 26 with a user interface 28. PC 16, IC 20 and LRF 22 are coupled together and their spatial relationship with each other is known.).
Regarding claim 17, Jasiobedzki teaches the system of claim 1, wherein matching the absolute location with the location of the scanning or measuring device is based also on a relation between the absolute location and the location ([0062] The body worn PLU 12 typically operates in an incremental mode and provides location relative to the initial location where the unit was started or reset. This initial location and heading may be assigned to a point on a global map in the scene manually or automatically using another global localisation system.).
Regarding claim 18, Jasiobedzki teaches an apparatus, comprising: a base device of a system for providing location information ([0055] The computer control system 26 analyses the images from pose camera 16 and computes the relative position and orientation (pose) of the visual target 18 with respect to the hand-held unit 14;);
a moving platform ([0043] the hand-held unit 14) having installed thereon:
a scanning or measuring device for obtaining information about a cavity surrounding the moving platform ([0043] Referring to FIG. 1 and the block diagram of FIG. 2, the hand-held unit 14 comprises a pose camera (PC) 16, an imaging camera (IC) 20, rangefinder (LRF) 22),
the scanning or measuring device comprising a localization and mapping device, adapted to generate a point cloud representing an outline of at least a part of the cavity ([0044] The pose camera 16 may be providing 2D images or 3D (range) images or both. Algorithms for detecting the target 18 will typically rely on their visual appearance or shape.; [0046] The imaging camera 20 may also be a 3D camera that provides range images (or point clouds) representing geometry of the observed scene.),
wherein the scanning or measuring device is adapted to register a first location of the scanning or measuring device relative to the cavity after the system is stationary for at least a predetermined period of time, said registration being associated with a first time stamp ([0040] This data includes the absolute positions of objects of interest in GNSS coordinates, images of the objects, measurements obtained with wearable or hand-held detectors during the investigation and a recorded and/or transcribed record of the investigation made by the investigator.; [0064] t: time stamp of a measurement);
a rover device in communication with the base device ([0047] The body mounted PLU 12 is interfaced with the computer control system 26 via cables or wireless means. Optional wearable detectors 34 may be interfaced with computer control system 26 using physical cables or wireless means.);
at least one first processor ([0048] one or more processors 50) adapted to:
obtain an absolute location of the system from the rover device when the system is stationary for at least the predetermined period of time ([0040] This data includes the absolute positions of objects of interest in GNSS coordinates, images of the objects, measurements obtained with wearable or hand-held detectors during the investigation and a recorded and/or transcribed record of the investigation made by the investigator.); and
register the absolute location in association with a second time stamp ([0064] t: time stamp of a measurement [0065] t.sub.-1 and t.sub.+1: time stamps of the PLU data before and after t);
a second platform having installed thereon: a second processor ([0055] The computer control system 26) adapted to:
receive the point cloud collected by the scanning or measuring device and generate a three dimensional model of the cavity based on the point cloud; and ([0055] The computer control system 26 analyses the images from pose camera 16 and computes the relative position and orientation (pose) of the visual target 18 with respect to the hand-held unit 14; [0071] The data may be overlayed on a map of the investigated scene, if such is available, showing the paths and locations. FIG. 6 is an illustration of such a display.); and
match the absolute location received from the rover device with the first location according to the first time stamp and the second time stamp ([0071] Crossed triangles indicate the camera locations and bearing of the captured images.); and
a display device adapted to display to the user a three dimensional representation of the model of the cavity, a route taken by the moving platform within the cavity and an indication of the first location with the associated absolute location. ([0043] a user interface 28; [0071] Complete data including paths followed by the investigator, locations and images of objects of interest, measurements from additional detectors can be displayed on the User Interface or transferred to an external computer.; [0074] The pose camera 16 may be either a 2D camera that captures 2D images or a 3D camera providing range images or point-clouds. Depending on the type of camera, different algorithms will be used to detect the Target and estimate its position. Visual targets, such as Space Vision Marker System [4] or AR markers [5] can be used.).
Regarding claim 20, Jasiobedzki teaches a method comprising: receiving a point cloud from a scanning or measuring device about a cavity, the scanning or measuring device comprising a localization and mapping device adapted to generate a point cloud representing an outline of at least a part of the cavity; ([0044] The pose camera 16 may be providing 2D images or 3D (range) images or both. Algorithms for detecting the target 18 will typically rely on their visual appearance or shape.; [0046] The imaging camera 20 may also be a 3D camera that provides range images (or point clouds) representing geometry of the observed scene.);
receiving from the scanning or measuring device a first location of the scanning or measuring device relative to the cavity after the scanning or measuring device is stationary for at least a predetermined period of time, the first location being associated with a first time stamp ([0040] This data includes the absolute positions of objects of interest in GNSS coordinates, images of the objects, measurements obtained with wearable or hand-held detectors during the investigation and a recorded and/or transcribed record of the investigation made by the investigator.; [0064] t: time stamp of a measurement);
generating a three dimensional model of the cavity based on the point cloud ([0055] The computer control system 26 analyses the images from pose camera 16 and computes the relative position and orientation (pose) of the visual target 18 with respect to the hand-held unit 14; [0071] The data may be overlayed on a map of the investigated scene, if such is available, showing the paths and locations. FIG. 6 is an illustration of such a display.); and
receiving from a device for providing location information absolute coordinates after the device for providing location information is stationary for at least the predetermined period of time, said absolute coordinates associated with a second time stamp ([0040] This data includes the absolute positions of objects of interest in GNSS coordinates, images of the objects, measurements obtained with wearable or hand-held detectors during the investigation and a recorded and/or transcribed record of the investigation made by the investigator.; [0064] t: time stamp of a measurement [0065] t.sub.-1 and t.sub.+1: time stamps of the PLU data before and after t);
matching the first location of the scanning or measuring device with the absolute coordinates according to the first time stamp and the second time stamp ([0071] Crossed triangles indicate the camera locations and bearing of the captured images.); and
displaying to a user a view of the three dimensional model and an indication of the first location with the associated absolute location ([0043] a user interface 28; [0071] Complete data including paths followed by the investigator, locations and images of objects of interest, measurements from additional detectors can be displayed on the User Interface or transferred to an external computer.; [0074] The pose camera 16 may be either a 2D camera that captures 2D images or a 3D camera providing range images or point-clouds. Depending on the type of camera, different algorithms will be used to detect the Target and estimate its position. Visual targets, such as Space Vision Marker System [4] or AR markers [5] can be used.).
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 2-3, 5, and 9 are rejected under 35 U.S.C. 103 as being unpatentable over Jasiobedzki in view of Singer (United States Patent Application Publication 20190094021), hereinafter Singer.
Regarding claim 2, Jasiobedzki teaches the system of claim 1.
Jasiobedzki fails to teach the system wherein the at least one first processor, the at least one second processor, the device providing location information and the scanning or measuring device are located in a housing.
However, Singer teaches the system wherein the at least one first processor, the at least one second processor, the device providing location information and the scanning or measuring device are located in a housing ([0047] The surveying instrument 2 comprises an infrared projector 20 for providing a referencing marker 200 which allows the AR-device 10 to reference itself (i.e. its pose) relative to a reference system. The surveying instrument 2 further comprises a computer 21 and a measuring unit 22.).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the invention of Jasiobedzki to include the shared housing of components taught by Singer, with a reasonable expectation of success. This would have the predictable result of creating a compact measuring unit that is easily portable for scanning remote locations.
Regarding claim 3, Jasiobedzki teaches the system of claim 1.
Jasiobedzki fails to teach the system wherein the at least one first processor, the device providing location information and the scanning or measuring device are located in a housing, and the at least one second processor is located in a mobile operation interface not included in the housing.
However, Singer teaches the system wherein the at least one first processor, the device providing location information and the scanning or measuring device are located in a housing, and the at least one second processor is located in a mobile operation interface not included in the housing ([0042] Other embodiments of the AR-device according to the invention are handheld devices such as smart phones or tablet computers. [0047] The surveying instrument 2 comprises an infrared projector 20 for providing a referencing marker 200 which allows the AR-device 10 to reference itself (i.e. its pose) relative to a reference system. The surveying instrument 2 further comprises a computer 21 and a measuring unit 22.).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the invention of Jasiobedzki to place separate components in separate housings as taught by Singer, with a reasonable expectation of success. This would have the predictable result of remotely communicating data back to an offsite processor and interface that a user need not carry with them.
Regarding claim 5, Jasiobedzki, as modified, teaches the system of claim 2.
Jasiobedzki fails to teach the system wherein the at least one second location is indicated in a less prominent manner than the at least one first location.
However, Singer teaches the system wherein the at least one second location is indicated in a less prominent manner than the at least one first location ([0062] An adaptation has taken place because the field of view of the visual sensor 140 of the AR-device 10 now captures a smaller part of the wall. Would the referencing marker 200 still be as large as shown in FIG. 2, then the AR-device 10 could not be able to catch the marker as a whole. Therefore, the surveying instrument 2 may be configured in such a way that the projector 20 adapts the size of the referencing marker 200).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the invention of Jasiobedzki to vary the prominence of secondary markers as taught by Singer, with a reasonable expectation of success. This would have the predictable result of adapting the visualization of markers to the needs of the user and of indicating the path of the user.
Regarding claim 9, Jasiobedzki teaches the system of claim 8.
Jasiobedzki fails to teach the system wherein the at least one wearable item comprises at least one item to which the display device attaches, the at least one item selected from the group consisting of: a wrist case, a vest, glasses; and a helmet.
However, Singer teaches the system wherein the at least one wearable item comprises at least one item to which the display device attaches, the at least one item selected from the group consisting of: a wrist case, a vest, glasses; and a helmet ([0042] FIGS. 1a and 1b show two embodiments 10/11 of an Augmented Reality (AR)-device according to the invention, i.e. AR-glasses 10 and an AR-helmet 11...The AR-device further comprises a display 120/121 for displaying AR-data, and a computer 110/111 for controlling the visual sensor 100/101 and the display 120/121).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the invention of Jasiobedzki to include a wearable item such as a wrist case, vest, glasses, or helmet as taught by Singer, with a reasonable expectation of success. This would have the predictable result of making the system compact and portable for real-world use.
Claim 10 is rejected under 35 U.S.C. 103 as being unpatentable over Jasiobedzki in view of Santarone et al. (United States Patent Application Publication 20200380178 A1), hereinafter Santarone.
Regarding claim 10, Jasiobedzki teaches the system of claim 8.
Jasiobedzki fails to teach the system wherein the at least one wearable item comprises a vest to which the scanning or measuring device attaches.
However, Santarone teaches the system wherein the at least one wearable item comprises a vest to which the scanning or measuring device attaches ([0062] By way of non-limiting example, Transceivers 105 supported by the Agent 100 may be included in, and/or be in logical communication with, a Smart Device, such as a smart phone, tablet, headgear, ring, arm band, watch, footwear, vest, lab coat, smock, wand, pointer, badge, Tag, Node or other Agent 100 supportable device with a portable Transceiver 105 able to transceive with the Reference Point Transceivers 101-104.).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the invention of Jasiobedzki to include a vest to which the scanning or measuring device attaches as taught by Santarone, with a reasonable expectation of success. This would have the predictable result of freeing the user's hands for better mobility in real-world scanning scenarios.
Claim 19 is rejected under 35 U.S.C. 103 as being unpatentable over Jasiobedzki in view of Santarone et al. (United States Patent Application Publication 20220004672 A1), hereinafter Santarone '672.
Regarding claim 19, Jasiobedzki teaches the apparatus of claim 18.
Jasiobedzki fails to teach the apparatus wherein the moving platform is remotely stirred in accordance with the information as received at the remote computing platform.
However, Santarone '672 teaches the apparatus wherein the moving platform is remotely stirred in accordance with the information as received at the remote computing platform ([0432] The methods used may be, by means of non-limiting example, one or more of: augmented reality overlays as displayed by heads-up displays and other wearable technologies, augmented reality overlays as displayed on smart devices, virtual reality walkthroughs as shown by wearable technologies or smart devices, direct instruction or remote control; [0585] By way of non-limiting example, a user interface may be one or more of: a smart device application, a virtual reality headset, an augmented reality apparatus, a remote control interface for an unmanned vehicle, etc.).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the invention of Jasiobedzki to include the remotely stirred moving platform taught by Santarone '672, with a reasonable expectation of success. This would have the predictable result of imaging an area without requiring a human to enter the space, making the operation safer and more accessible.
Response to Arguments
Applicant's arguments filed December 25, 2025 have been fully considered but they are not persuasive.
The applicant argues that the prior art of Jasiobedzki fails to teach a three dimensional point cloud model as written in the independent claims of the instant application. However, Jasiobedzki, in the previously cited sections of the rejection as well as the newly cited sections of the same prior art, as necessitated by the amendments, teaches using a 3D camera to generate a point cloud to map an area, which, under the broadest reasonable interpretation to one of ordinary skill in the art, is the same as the claimed three dimensional model. Further, the argument that the prior art fails to teach that this model is then displayed to the user as a representation of the model is likewise not persuasive, as Jasiobedzki shows that this data can be overlaid onto a map with markers that can be represented as three dimensional markers. Thus, the amended claims do not overcome the prior art rejection.
Applicant also argues that the prior art fails to teach the claim limitation that the system be stationary for a predetermined period of time during each scan. However, as each scan is taught to have a distinct time stamp, and each scan has a period of time in which to operate, the prior art, as cited previously and above, shows that the scan would take place during a predetermined period of time, as recited in the independent claims and in dependent claim 6.
Further arguments are made regarding the dependent claims, including that Jasiobedzki fails to teach a representation of a location within the model associated with absolute locations. However, as cited previously and above, the map over which the generated point cloud is overlaid teaches the absolute location reference, under the broadest reasonable interpretation to one of ordinary skill in the art, as written in the claims.
In addition, the argument that the prior art fails to teach the indication given by the user, wherein the display displays an image captured by the image capture device, is found unpersuasive, as the indicated limitation is not written so narrowly that the cited prior art does not read on the claim. As such, the rejections cited above are maintained in this Final Office Action.
Conclusion
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to ROBERT WILLIAM VASQUEZ JR, whose telephone number is (571)272-3745. The examiner can normally be reached Monday through Thursday and on Flex Fridays, 8:00-5:00 PST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, HELAL ALGAHAIM can be reached at (571)270-5227. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/ROBERT W VASQUEZ/Examiner, Art Unit 3645
/HELAL A ALGAHAIM/Supervisory Patent Examiner, Art Unit 3645