DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Arguments
Applicant’s arguments with respect to the rejections of claims 1-15 under 35 U.S.C. § 103 have been fully considered but are not persuasive. As discussed in further detail below, Lund does teach the limitation of obtaining horizontal plane data and vertical plane data in two separate operations.
Information Disclosure Statement
The information disclosure statement (IDS) submitted on 03/19/2026 is in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement is being considered by the examiner.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 1-3 and 11-13 are rejected under 35 U.S.C. 103 as being unpatentable over US 20250067875 A1, with an earliest priority date of 01/31/2022, hereinafter “Sawahashi”, in view of US 20220230550 A1, filed 05/26/2020, hereinafter “Lund”, further in view of US 20190129444 A1, filed 10/31/2017, hereinafter “Wirth”.
Regarding claim 1, Sawahashi teaches An electronic apparatus comprising: memory; at least one sensor; and at least one processor operatively connected with the memory and the at least one sensor, and configured to execute instructions. See at least [0006], [0066], and [0076]-[0077], wherein the disclosure is implemented by an apparatus comprising a CPU and memory that execute a stored program.
wherein the instructions, when executed by the at least one processor individually or collectively, cause the electronic apparatus to: perform first driving of the electronic apparatus for the target space to collect first sensor data obtained through the at least one sensor while moving the electronic apparatus through the target space, and obtain first surface data for a first surface corresponding to a driving direction of the electronic apparatus based on the first sensor data. See at least [0113]-[0114] and figure 16, step S110, wherein sensor data from a first (2D) distance sensor 12 is used to obtain first surface data (2D point cloud). Additionally, see at least [0052], wherein the first sensor 12 captures information in the traveling direction of the mobile body.
perform second driving of the electronic apparatus for the target space to collect second sensor data obtained through the at least one sensor while moving the electronic apparatus through the target space, and obtain second surface data for a second surface different from the first surface based on the second sensor data. See at least [0113]-[0114] and figure 16, step S100, wherein sensor data from a second (3D) distance sensor 11 is used to obtain second surface data (3D point cloud). Additionally, see at least [0053], wherein the second distance sensor 11 obtains information in a traveling direction of the mobile body.
based on completing the driving for the target space, obtain the map based on the first surface data and the second surface data. See at least [0120], [0125], and figure 16, steps S111 and S120, wherein the first surface data and the second surface data are integrated to obtain a map. Additionally, see at least [0120], [0124], and steps S105 and S115, wherein the map obtained in step S120 is only generated on the basis of completing analysis of the collected sensor data.
Sawahashi remains silent on based on receiving a request for a map corresponding to a target space. Additionally, Sawahashi remains silent on performing first and second driving, where the second driving is performed based on completing the first driving. As discussed above, Sawahashi’s collection of first and second sensor data occurs substantially simultaneously as the apparatus travels through the space. Finally, Sawahashi remains silent on identify a driving route to a target location based on the map, and control the electronic apparatus to move to the target location based on the driving route, wherein the first surface is related to a horizontal surface of the target space, and wherein the second surface is related to a vertical surface of the target space.
Lund teaches based on receiving a request for a map corresponding to a target space. See at least [0085] and figure 4A, wherein the apparatus performs map generation based on receiving a mission objective, or request, to map a target survey area.
identify a driving route to a target location based on the map, and control the electronic apparatus to move to the target location based on the driving route. See at least [0103]-[0107] and figure 6, steps 602-608, wherein the occupancy maps generated from the first and second surface data collection (horizontal and vertical occupancy maps) are used to identify a route to a destination, and the apparatus is controlled to move to the destination along the identified route.
wherein the first surface is related to a horizontal surface of the target space, and wherein the second surface is related to a vertical surface of the target space. See at least [0094]-[0097] and figure 5, steps 502-504, wherein the apparatus performs a first movement to collect a set of horizontal planar sensor data, and then a second movement to collect a second set of vertical planar sensor data. See at least [0058], wherein different sets of sensor data correspond to different scans performed by the apparatus.
One having ordinary skill in the art, before the effective filing date of the claimed invention, would have found it obvious to modify Sawahashi with Lund’s technique of receiving a request for a map corresponding to a target space, identifying a driving route to a target location based on the map, and controlling the electronic apparatus to move to the target location based on the driving route, wherein the first surface is related to a horizontal surface of the target space, and wherein the second surface is related to a vertical surface of the target space. It would have been obvious to modify because doing so enables increased reliability when mapping systems perform scans to generate three-dimensional maps, as recognized by Lund (see at least [0023]-[0024]).
Wirth teaches based on completing the first driving. See at least [0047], [0057]-[0060], and figure 2, steps S204-210, wherein a robot is directed to traverse a first route through locations in a predefined area. The robot then traverses a second route, through one or more of the same locations, and collects second sensor data on the second route.
One having ordinary skill in the art, before the effective filing date of the claimed invention, would have found it obvious to further modify Sawahashi with Wirth’s technique of collecting second sensor data on a second driving based on completing the first driving. It would have been obvious to modify because doing so enables robots to collect sensor data for mapping dynamically, allowing robots to navigate changing environments, as recognized by Wirth (see at least [0002]-[0004]).
Regarding claim 2, Sawahashi, Lund, and Wirth in combination teach all of the limitations of claim 1 as discussed above, and Sawahashi additionally teaches wherein the first surface data comprises first information corresponding to the horizontal surface including an x axis and a y axis based on the driving direction of the electronic apparatus. See at least [0055]-[0057] and figure 4A, wherein the first distance sensor 12 collects information corresponding to a horizontal surface parallel to the XY plane in the traveling direction of the vehicle.
and wherein the second surface data comprises second information corresponding to the vertical surface including a z axis based on the driving direction of the electronic apparatus. See at least [0053], [0084], and figure 9, wherein the second distance sensor 11 collects information, including information in a vertical direction of the z axis in the traveling direction of the vehicle.
Regarding claim 3, Sawahashi, Lund, and Wirth in combination teach all of the limitations of claim 1 as discussed above, and Sawahashi additionally teaches wherein the first surface data comprises first spatial information of the first surface and first object information of the first surface. See at least [0055]-[0058] and [0174], wherein the first surface data comprises a point cloud of points representing obstacles, each point containing spatial information in the form of (x, y) coordinates. Additionally, see at least [0069]-[0070], wherein the points from the first surface data and the second surface data contain classification information indicating whether the point is an obstacle or a surface.
Regarding claim 11, Sawahashi teaches A controlling method of an electronic apparatus. See at least [0006], [0066], and [0076]-[0077], wherein the disclosure is implemented by an apparatus comprising a CPU and memory that execute a stored program.
the controlling method comprising: performing first driving of the electronic apparatus for the target space to collect first sensor data obtained through at least one sensor of the electronic apparatus while moving the electronic apparatus through the target space, and obtaining first surface data for a first surface corresponding to a driving direction of the electronic apparatus based on the first sensor data. See at least [0113]-[0114] and figure 16, step S110, wherein sensor data from a first (2D) distance sensor 12 is used to obtain first surface data (2D point cloud). Additionally, see at least [0052], wherein the first sensor 12 captures information in the traveling direction of the mobile body.
performing second driving of the electronic apparatus for the target space to collect second sensor data obtained through the at least one sensor while moving the electronic apparatus through the target space, and obtaining second surface data for a second surface different from the first surface based on the second sensor data. See at least [0113]-[0114] and figure 16, step S100, wherein sensor data from a second (3D) distance sensor 11 is used to obtain second surface data (3D point cloud). Additionally, see at least [0053], wherein the second distance sensor 11 obtains information in a traveling direction of the mobile body.
based on completing the driving for the target space, obtaining the map based on the first surface data and the second surface data. See at least [0120], [0125], and figure 16, steps S111 and S120, wherein the first surface data and the second surface data are integrated to obtain a map.
Sawahashi remains silent on based on receiving a request for a map corresponding to a target space. Additionally, Sawahashi remains silent on performing first and second driving, where the second driving is performed based on completing the first driving. As discussed above, Sawahashi’s collection of first and second sensor data occurs substantially simultaneously as the apparatus travels through the space. Finally, Sawahashi remains silent on identifying a driving route to a target location based on the map, and controlling the electronic apparatus to move to the target location based on the driving route, wherein the first surface is related to a horizontal surface of the target space, and wherein the second surface is related to a vertical surface of the target space.
Lund teaches based on receiving a request for a map corresponding to a target space. See at least [0085] and figure 4A, wherein the apparatus performs map generation based on receiving a mission objective, or request, to map a target survey area.
identifying a driving route to a target location based on the map, and controlling the electronic apparatus to move to the target location based on the driving route. See at least [0103]-[0107] and figure 6, steps 602-608, wherein the occupancy maps generated from the first and second surface data collection (horizontal and vertical occupancy maps) are used to identify a route to a destination, and the apparatus is controlled to move to the destination along the identified route.
wherein the first surface is related to a horizontal surface of the target space, and wherein the second surface is related to a vertical surface of the target space. See at least [0094]-[0097] and figure 5, steps 502-504, wherein the apparatus performs a first movement to collect a set of horizontal planar sensor data, and then a second movement to collect a second set of vertical planar sensor data. See at least [0058], wherein different sets of sensor data correspond to different scans performed by the apparatus.
One having ordinary skill in the art, before the effective filing date of the claimed invention, would have found it obvious to modify Sawahashi with Lund’s technique of receiving a request for a map corresponding to a target space, identifying a driving route to a target location based on the map, and controlling the electronic apparatus to move to the target location based on the driving route, wherein the first surface is related to a horizontal surface of the target space, and wherein the second surface is related to a vertical surface of the target space. It would have been obvious to modify because doing so enables increased reliability when mapping systems perform scans to generate three-dimensional maps, as recognized by Lund (see at least [0023]-[0024]).
Wirth teaches based on completing the first driving. See at least [0047], [0057]-[0060], and figure 2, steps S204-210, wherein a robot is directed to traverse a first route through locations in a predefined area. The robot then traverses a second route, through one or more of the same locations, and collects second sensor data on the second route.
One having ordinary skill in the art, before the effective filing date of the claimed invention, would have found it obvious to further modify Sawahashi with Wirth’s technique of collecting second sensor data on a second driving based on completing the first driving. It would have been obvious to modify because doing so enables robots to collect sensor data for mapping dynamically, allowing robots to navigate changing environments, as recognized by Wirth (see at least [0002]-[0004]).
Regarding claim 12, Sawahashi, Lund, and Wirth in combination teach all of the limitations of claim 11 as discussed above, and Sawahashi additionally teaches wherein the first surface data comprises first information corresponding to the horizontal surface including an x axis and a y axis based on the driving direction of the electronic apparatus. See at least [0055]-[0057] and figure 4A, wherein the first distance sensor 12 collects information corresponding to a horizontal surface parallel to the XY plane in the traveling direction of the vehicle.
and wherein the second surface data comprises second information corresponding to the vertical surface including a z axis based on the driving direction of the electronic apparatus. See at least [0053], [0084], and figure 9, wherein the second distance sensor 11 collects information, including information in a vertical direction of the z axis in the traveling direction of the vehicle.
Regarding claim 13, Sawahashi, Lund, and Wirth in combination teach all of the limitations of claim 11 as discussed above, and Sawahashi additionally teaches wherein the first surface data comprises first spatial information of the first surface and first object information of the first surface. See at least [0055]-[0058] and [0174], wherein the first surface data comprises a point cloud of points representing obstacles, each point containing spatial information in the form of (x, y) coordinates. Additionally, see at least [0069]-[0070], wherein the points from the first surface data and the second surface data contain classification information indicating whether the point is an obstacle or a surface.
Claims 4-10 and 14-15 are rejected under 35 U.S.C. 103 as being unpatentable over Sawahashi, Lund, and Wirth as applied to the claims above, and further in view of US 20210110607 A1, filed 06/04/2019, hereinafter “Coddington”.
Regarding claim 4, Sawahashi, Lund, and Wirth in combination teach all of the limitations of claim 3 as discussed above, and Sawahashi additionally teaches wherein the at least one sensor comprises: a first distance sensor, and an acceleration sensor, wherein the first sensor data comprises first distance data obtained through the first distance sensor and first acceleration data obtained through the acceleration sensor. See at least [0052] and [0162], wherein the sensors include the first distance sensor 12 and an acceleration sensor. Additionally, see at least [0052] and [0055]-[0057], wherein the first sensor 12 captures distance information in the traveling direction of the mobile body.
Sawahashi remains silent on wherein the instructions, when executed by the at least one processor, cause the electronic apparatus to: detect first edge information based on the first distance data, obtain first direction information of the electronic apparatus based on the first acceleration data, and obtain the first surface data based on the first edge information and the first direction information.
Lund teaches obtain direction information of the electronic apparatus based on the acceleration data. See at least [0030], wherein the accelerometer provides linear acceleration data including a direction and magnitude of acceleration of the apparatus.
One having ordinary skill in the art, before the effective filing date of the claimed invention, would have found it obvious to modify Sawahashi with Lund’s technique of obtaining first direction information of the apparatus based on the acceleration data. It would have been obvious to modify because doing so enables increased reliability when mapping systems perform scans to generate three-dimensional maps, as recognized by Lund (see at least [0023]-[0024]).
Coddington teaches wherein the instructions, when executed by the at least one processor, cause the electronic apparatus to: detect edge information based on the distance data. See at least [0033]-[0034] and figures 5-7, wherein the computer with software performs processing to detect intersecting line information based on the measured distance information. The intersecting line information represents edges between intersecting surfaces.
and obtain the surface data based on the edge information and the direction information. See at least [0035], wherein the direction of movement of the apparatus is combined with the determined 3D model. Additionally, see at least [0033]-[0034], wherein the 3D model comprises the detected edges from the distance data.
One having ordinary skill in the art, before the effective filing date of the claimed invention, would have found it obvious to further modify Sawahashi with Coddington’s technique of detecting edge information from distance information, and using the edge information and direction information to generate surface data. It would have been obvious to modify because doing so enables low-cost mapping of interior spaces while maintaining sufficient accuracy, as recognized by Coddington (see at least [0007]-[0008]).
Regarding claim 5, Sawahashi, Lund, Wirth, and Coddington in combination teach all of the limitations of claim 4 as discussed above, and Sawahashi additionally teaches wherein the second surface data comprises second spatial information of the second surface and second object information of the second surface. See at least [0053]-[0054], wherein the second distance sensor 11 provides information on objects detected by the sensor and their spatial information, in the form of (x, y, z) coordinates. Additionally, see at least [0069]-[0070], wherein the points from the first surface data and the second surface data contain classification information indicating whether the point is an obstacle or a surface.
Regarding claim 6, Sawahashi, Lund, Wirth and Coddington in combination teach all of the limitations of claim 5 as discussed above, and Sawahashi additionally teaches wherein the at least one sensor comprises: a second distance sensor, and wherein the first sensor data comprises second distance data obtained through the second distance sensor and second acceleration data obtained through the acceleration sensor. See at least [0052] and [0162], wherein the sensors include the second distance sensor 11 and an acceleration sensor. Additionally, see at least [0052] and [0055]-[0057], wherein the second distance sensor 11 captures distance information in the traveling direction of the mobile body.
Sawahashi remains silent on wherein the instructions, when executed by the at least one processor, cause the electronic apparatus to: detect second edge information based on the first distance data, obtain second direction information of the electronic apparatus based on the second acceleration data, and obtain the second surface data based on the first edge information and the first direction information.
Lund teaches obtain direction information of the electronic apparatus based on the acceleration data. See at least [0030], wherein the accelerometer provides linear acceleration data including a direction and magnitude of acceleration of the apparatus.
One having ordinary skill in the art, before the effective filing date of the claimed invention, would have found it obvious to modify Sawahashi with Lund’s technique of obtaining direction information of the apparatus based on the acceleration data. It would have been obvious to modify because doing so enables increased reliability when mapping systems perform scans to generate three-dimensional maps, as recognized by Lund (see at least [0023]-[0024]).
Coddington teaches wherein the instructions, when executed by the at least one processor, cause the electronic apparatus to: detect edge information based on the distance data. See at least [0033]-[0034] and figures 5-7, wherein the computer with software performs processing to detect intersecting line information based on the measured distance information. The intersecting line information represents edges between intersecting surfaces.
and obtain the surface data based on the edge information and the direction information. See at least [0035], wherein the direction of movement of the apparatus is combined with the determined 3D model. Additionally, see at least [0033]-[0034], wherein the 3D model comprises the detected edges from the distance data.
One having ordinary skill in the art, before the effective filing date of the claimed invention, would have found it obvious to further modify Sawahashi with Coddington’s technique of detecting edge information from distance information, and using the edge information and direction information to generate surface data. It would have been obvious to modify because doing so enables low-cost mapping of interior spaces while maintaining sufficient accuracy, as recognized by Coddington (see at least [0007]-[0008]).
Regarding claim 7, Sawahashi, Lund, Wirth and Coddington in combination teach all of the limitations of claim 6 as discussed above, and Sawahashi additionally teaches wherein the at least one sensor further comprises a vision sensor, wherein the second sensor data further comprises image data obtained through the vision sensor. See at least [0155]-[0156], wherein the sensors include a camera that acquires captured image information.
and wherein the instructions, when executed by the at least one processor, cause the electronic apparatus to update the second object information based on the image data. See at least [0152] and [0156], wherein the data obtained by the camera is accumulated along with data from the second distance sensor to generate the surface data.
Regarding claim 8, Sawahashi, Lund, Wirth and Coddington in combination teach all of the limitations of claim 6 as discussed above, and Sawahashi additionally teaches wherein the at least one sensor further comprises a tilt sensor. See at least [0162], wherein the sensors include a gyro sensor.
Sawahashi remains silent on wherein the second sensor data further comprises tilt data obtained through the tilt sensor, and wherein the instructions, when executed by the at least one processor, cause the electronic apparatus to: obtain, based on the tilt data, a first tilt angle in a roll direction, a second tilt angle in a pitch direction, and a third tilt angle in a yaw direction, of the electronic apparatus, and update the second spatial information based on the first tilt angle, the second tilt angle, and the third tilt angle.
Lund teaches wherein the second sensor data further comprises tilt data obtained through the tilt sensor and wherein the instructions, when executed by the at least one processor, cause the electronic apparatus to: obtain, based on the tilt data, a first tilt angle in a roll direction, a second tilt angle in a pitch direction, and a third tilt angle in a yaw direction, of the electronic apparatus. See at least [0030], wherein the orientation sensor provides tilt data which comprises a magnitude and direction of pitch, roll, and yaw of the apparatus.
and update the second spatial information based on the first tilt angle, the second tilt angle, and the third tilt angle. See at least [0096] and figure 5, step 504, wherein the map data in the vertical plane is updated based on the determined orientation data of the apparatus.
One having ordinary skill in the art, before the effective filing date of the claimed invention, would have found it obvious to modify Sawahashi with Lund’s technique of obtaining tilt data from a tilt sensor, obtaining tilt angles representing pitch, roll, and yaw, and updating spatial information in a vertical plane based on the obtained pitch, roll, and yaw information. It would have been obvious to modify because doing so enables increased reliability when mapping systems perform scans to generate three-dimensional maps, as recognized by Lund (see at least [0023]-[0024]).
Regarding claim 9, Sawahashi, Lund, Wirth and Coddington in combination teach all of the limitations of claim 8 as discussed above, and Sawahashi additionally teaches wherein the first distance sensor is a Light Detection and Ranging (LiDAR) sensor. See at least [0056], wherein the first distance sensor 12 is a 2D-LiDAR sensor.
the second distance sensor is a Time of Flight (ToF) sensor. See at least [0054], wherein the second distance sensor 11 is a direct Time of Flight (dToF) sensor.
and the tilt sensor is a gyro sensor. See at least [0162], wherein the sensors include a gyro sensor.
Regarding claim 10, Sawahashi, Lund, Wirth and Coddington in combination teach all of the limitations of claim 6 as discussed above, and Sawahashi additionally teaches wherein the instructions, when executed by the at least one processor, cause the electronic apparatus to: obtain, based on the first spatial information and the second spatial information corresponding to a same location, third spatial information by combining the first spatial information and the second spatial information. See at least [0071], [0120], and figure 16, step S111, wherein the first point cloud and the second point cloud, both comprising spatial coordinate information, are combined to obtain a third integrated point cloud.
obtain, based on the first object information and the second object information corresponding to a same location, third object information by combining the first object information and the second object information. See at least [0071]-[0074], [0120], and figure 16, step S111, wherein the first point cloud and the second point cloud, both comprising object point information, are combined to obtain a third integrated point cloud. The obstacle information from the first and second surface data is integrated to form an integrated obstacle point cloud.
and obtain the map, wherein the map comprises the third spatial information and the third object information. See at least [0071]-[0074], wherein the integrated point cloud data and the integrated obstacle point cloud data are used to create an obstacle location map.
Regarding claim 14, Sawahashi, Lund, and Wirth in combination teach all of the limitations of claim 13 as discussed above, and Sawahashi additionally teaches wherein the electronic apparatus comprises a first distance sensor and an acceleration sensor, wherein the first sensor data comprises first distance data obtained through the first distance sensor and first acceleration data obtained through the acceleration sensor. See at least [0052] and [0162], wherein the sensors include the first distance sensor 12 and an acceleration sensor. Additionally, see at least [0052] and [0055]-[0057], wherein the first sensor 12 captures distance information in the traveling direction of the mobile body.
Sawahashi remains silent on wherein the obtaining the first surface data comprises: detecting first edge information based on the first distance data; obtaining first direction information of the electronic apparatus based on the first acceleration data; and obtaining the first surface data based on the first edge information and the first direction information.
Lund teaches obtaining direction information of the electronic apparatus based on the acceleration data. See at least [0030], wherein the accelerometer provides linear acceleration data including a direction and magnitude of acceleration of the apparatus.
One having ordinary skill in the art, before the effective filing date of the claimed invention, would have found it obvious to modify Sawahashi with Lund’s technique of obtaining first direction information of the apparatus based on the acceleration data. It would have been obvious to modify because doing so enables increased reliability when mapping systems perform scans to generate three-dimensional maps, as recognized by Lund (see at least [0023]-[0024]).
Coddington teaches wherein the obtaining the first surface data comprises: detecting edge information based on the distance data. See at least [0033]-[0034] and figures 5-7, wherein the computer with software performs processing to detect intersecting line information based on the measured distance information. The intersecting line information represents edges between intersecting surfaces.
and obtaining the surface data based on the edge information and the direction information. See at least [0035], wherein the direction of movement of the apparatus is combined with the determined 3D model. Additionally, see at least [0033]-[0034], wherein the 3D model comprises the detected edges from the distance data.
One having ordinary skill in the art, before the effective filing date of the claimed invention, would have found it obvious to further modify Sawahashi with Coddington’s technique of detecting edge information from distance information, and using the edge information and direction information to generate surface data. It would have been obvious to modify because doing so enables low-cost mapping of interior spaces while maintaining sufficient accuracy, as recognized by Coddington (see at least [0007]-[0008]).
Regarding claim 15, Sawahashi, Lund, Wirth and Coddington in combination teach all of the limitations of claim 14 as discussed above, and Sawahashi additionally teaches wherein the second surface data comprises second spatial information of the second surface and second object information of the second surface. See at least [0053]-[0054], wherein the second distance sensor 11 provides information on objects detected by the sensor and their spatial information, in the form of (x, y, z) coordinates. Additionally, see at least [0069]-[0070], wherein the points from the first surface data and the second surface data contain classification information indicating whether the point is an obstacle or a surface.
Conclusion
THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to Selena M. Jin whose telephone number is (408)918-7588. The examiner can normally be reached Monday - Thursday and alternate Fridays, 7:30-4:30 PT.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Faris Almatrahi, can be reached at (313) 446-4821. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/S.M.J./ Examiner, Art Unit 3667
/FARIS S ALMATRAHI/ Supervisory Patent Examiner, Art Unit 3667