DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Continued Examination Under 37 CFR 1.114
A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 10/20/2025 has been entered.
Response to Amendment
This Office action is responsive to the amendment filed on 06/30/2025.
Claims 36-43, 45, and 49-55 are currently amended.
Claims 44 and 46-48 are as previously presented.
Response to Arguments
Applicant argues that Li nowhere discloses wherein one or more servers are remote from a second device and are connected to the second device over a network connection, wherein the first image data are received from the first device while each of the first device and the second device is within a same device region, and wherein the second image data are received from the second device while each of the first device and the second device is within the device region. Applicant's argument is respectfully traversed.
Li discloses wherein one or more servers are remote from a second device and are connected to the second device over a network connection (the controller receives image data from an in-vehicle device, wherein the controller is remote from the first device and connected to the first device over a network – see, including but not limited to, paragraphs 0024, 0079-0081, 0092), wherein the first image data are received from the first device while each of the first device and the second device is within a same device region, and wherein the second image data are received from the second device while each of the first device and the second device is within the device region (after identifying another vehicle owner as being at the location, requesting from the other vehicle owner the navigation map data characteristics – see, including but not limited to, paragraphs 0024, 0081, 0119).
For the reasons above, the rejections of claims 36-55 are discussed below.
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claims 36, 41-42, 45-46, 49, and 54-55 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by LI (US 20180306590).
Regarding claim 36, LI discloses a method of automated map generation, comprising (method – see paragraph 0002): receiving, by one or more servers, first image data from a first device (the controller receives image data from the in-vehicle terminal – see paragraph 0008);
wherein the one or more servers are remote from the first device and are connected to the first device over a network connection, wherein the one or more servers are remote from a second device and are connected to the second device over a network connection (the controller receives image data from an in-vehicle device, wherein the controller is remote from the first device and connected to the first device over a network – see, including but not limited to, paragraphs 0024, 0079-0081, 0092), and wherein the first image data are received from the first device while each of the first device and the second device is within a same device region (after identifying another vehicle owner as being at the location, requesting from the other vehicle owner the navigation map data characteristics – see, including but not limited to, paragraphs 0024, 0081, 0119);
identifying one or more first data parameters of the first image data (identify the data characteristics – see paragraph 0087); generating, via the one or more servers using the first image data, map data of the device region (generating, via the controller, map image data of the location of the vehicle – see paragraphs 0071, 0089); transmitting, over the network connection to the first device, first guide data configured to guide the first device in navigating the device region of the first device, wherein the first guide data are obtained based on the map data (transmitting, over the network communication, navigation map data configured to navigate the vehicle at a location, where the navigation map data are obtained based on the map image data – see paragraphs 0002, 0073, 0087); receiving, by the one or more servers, second image data for the device region from the second device (receiving, by the controller, current image data of the vehicle location – see paragraphs 0055-0058, 0071), wherein the second image data are received from the second device while each of the first device and the second device is within the device region (after identifying another vehicle owner as being at the location, requesting from the other vehicle owner the navigation map data characteristics – see, including but not limited to, paragraphs 0024, 0081, 0119); identifying one or more second data parameters of the second image data (identify the characteristics of the second image data – see paragraph 0071); determining that the one or more second data parameters provide higher quality image data for the device region than do the one or more first data parameters (comparing the quality of the current data with the navigation map data, which include image data – see paragraphs 0072-0075); based at least in part on the determining that the second data parameters provide the higher quality image data for the device region, updating, via the one or more servers, the map data of the device region based on the second image data (update the map image data, via the controller of the vehicle, based on the current image data – see paragraphs 0083-0084, 0087); and transmitting, over the network connection to the first device, second guide data configured to guide the first device in the navigating of the device region based on the updated map data (transmitting the navigation map data over the network communication to the vehicle to navigate the vehicle in the location based on the updated map image data – see, including but not limited to, paragraphs 0071-0075).
Regarding claim 41, LI discloses the method of claim 36, wherein a first data parameter of the one or more first data parameters comprises one or more of an image resolution or a framerate associated with the first image data at a time of capture of the first image data, and a second data parameter of the one or more second data parameters comprises one or more of an image resolution or a framerate associated with the second image data at a time of capture of the second image data (the data characteristics comprise the quality of the image – see paragraph 0087).
Regarding claim 42, Li discloses the method of claim 36, further comprising: based at least in part on identifying the second device as being in the same device region as the first device at a current time, requesting from the second device the one or more second data parameters (after identifying another vehicle owner as being at the location, requesting from the other vehicle owner the navigation map data characteristics in real time – see, including but not limited to, paragraphs 0024, 0081, 0109, 0119, 0143).
Regarding claim 45, LI discloses the method of claim 36, wherein the updating of the map of the device region based on the second image data comprises replacing, at least in part, the first image data with the second image data (updating the map of the vehicle location based on the current map image data – see, including but not limited to, paragraph 0075).
Regarding claim 46, LI discloses the method of claim 36, wherein the second image data are received from a second device different from the first device (the map image data are different from the current map image data – see, including but not limited to, paragraphs 0092-0093).
Regarding claim 49, LI discloses a system of automated map generation, comprising (system – see paragraphs 0002, 0071): one or more servers comprising: input/output (I/O) circuitry to receive first image data from a first device over a network connection (the input/output circuitry receives image data from the in-vehicle terminal over the network communication – see paragraphs 0008, 0125); wherein the one or more servers are remote from the first device and are connected to the first device over a network connection, wherein the one or more servers are remote from a second device and are connected to the second device over a network connection (the controller receives image data from an in-vehicle device, wherein the controller is remote from the first device and connected to the first device over a network – see, including but not limited to, paragraphs 0024, 0079-0081, 0092), and wherein the first image data are received from the first device while each of the first device and the second device is within a same device region (after identifying another vehicle owner as being at the location, requesting from the other vehicle owner the navigation map data characteristics – see, including but not limited to, paragraphs 0024, 0081, 0119);
and processing circuitry configured: to identify one or more first data parameters of the first image data (identify the data characteristics – see paragraph 0087); to generate, using the first image data, map data of the device region (generate map image data of the location of the vehicle – see paragraphs 0071, 0089); to transmit, over the network connection to the first device, first guide data configured to guide the first device in navigating the device region of the first device, wherein the first guide data are obtained based on the map data (transmitting navigation map data over the network, configured to navigate the vehicle at a location, where the navigation map data are obtained based on the map image data – see paragraphs 0002, 0073, 0087); to receive second image data for the device region from the second device (receiving, by the controller, current image data of the vehicle location – see paragraphs 0055-0058, 0071), wherein the second image data are received from the second device while each of the first device and the second device is within the device region (after identifying another vehicle owner as being at the location, requesting from the other vehicle owner the navigation map data characteristics – see, including but not limited to, paragraphs 0024, 0081, 0119); to identify one or more second data parameters of the second image data (identify the characteristics of the second image data – see paragraph 0071); to determine that the one or more second data parameters provide higher quality image data for the device region than do the one or more first data parameters (comparing the quality of the current data with the navigation map data, which include image data – see paragraphs 0072-0075); based at least in part on the determining that the second data parameters provide the higher quality image data, to update the map data of the device region based on the second image data (update the map image data of the vehicle based on the current image data – see paragraphs 0083-0084, 0087); and to transmit, over the network connection to the first device, second guide data configured to guide the first device in the navigating of the device region based on the updated map data (transmitting the navigation map data over the network to the vehicle to navigate the vehicle in the location based on the updated map image data – see, including but not limited to, paragraphs 0071-0075, 0079).
Regarding claim 54, LI discloses the system of claim 49, wherein a first data parameter of the one or more first data parameters comprises one or more of an image resolution or a framerate associated with the first image data at a time of capture of the first image data, and a second data parameter of the one or more second data parameters comprises one or more of an image resolution or a framerate associated with the second image data at a time of capture of the second image data (the data characteristics comprise the quality of the image – see paragraph 0087).
Regarding claim 55, Li discloses the system of claim 49, wherein the processing circuitry is configured: to request from the second device, based at least in part on identifying the second device as being in the same device region as the first device at a current time, the one or more second data parameters (after identifying another vehicle owner as being at the location, requesting from the other vehicle owner the navigation map data characteristics in real time – see, including but not limited to, paragraphs 0024, 0081, 0109, 0119, 0143).
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 37-40, 43-44, 47-48, and 50-53 are rejected under 35 U.S.C. 103 as being unpatentable over LI et al. (US 20180306590) in view of Burbank et al. (US 20220047134) and Wu et al. (US 10893310 B1).
Regarding claim 37, LI discloses the method of claim 36, wherein a first data parameter of the one or more first data parameters comprises an image resolution of the first device (the navigation map data include image quality of the in-vehicle terminal – see paragraphs 0079, 0087).
However, LI does not explicitly disclose a first data parameter of the one or more first data parameters comprises a maximum image resolution capacity of a capturing sensor of the first device.
Wu discloses a first data parameter of the one or more first data parameters comprising a maximum image resolution capacity of a capturing sensor of the first device (transmitting the parameters comprising the maximum image resolution capacity of the device – see, including but not limited to, Col. 6, lines 24-26; Col. 10, lines 30-36; Col. 11, lines 5-15).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify LI with the teaching of a first data parameter of the one or more first data parameters comprising a maximum image resolution capacity of a capturing sensor of the first device, as taught by Wu, in order to yield the predictable result of clarifying the capability of the device.
Regarding claim 38, LI discloses the method of claim 36, wherein a first data parameter of the one or more first data parameters comprises the image quality of a capturing sensor of the first device (the navigation map data include the image quality of the in-vehicle terminal – see paragraphs 0079, 0087).
However, LI does not explicitly disclose wherein a first data parameter of the one or more first data parameters comprises a maximum framerate capacity of a capturing sensor of the first device.
Wu discloses a first data parameter of the one or more first data parameters comprising a maximum framerate capacity of a capturing sensor of the first device (transmitting the parameters comprising the maximum framerate capacity of the device – see, including but not limited to, Col. 6, lines 24-26; Col. 10, lines 29-36; Col. 11, lines 5-15).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify LI with the teaching of a first data parameter of the one or more first data parameters comprising a maximum framerate capacity of a capturing sensor of the first device, as taught by Wu, in order to yield the predictable result of clarifying the capability of the device.
Regarding claim 39, LI discloses the method of claim 36, wherein a first data parameter comprises an image data throughput of the first device (see paragraph 0071).
However, LI does not explicitly disclose wherein a first data parameter of the one or more first data parameters comprises a codec image data throughput of the first device.
Wu discloses wherein a first data parameter of the one or more first data parameters comprises a codec image data throughput of the first device (the parameters include codec image data – see Col. 12, lines 31-49).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify LI with the teaching of a first data parameter of the one or more first data parameters comprising a codec image data throughput of the first device, as taught by Wu, in order to yield the predictable result of improving the process of transmitting the obtained data.
Regarding claim 40, LI discloses the method of claim 36, wherein a first data parameter of the one or more first data parameters comprises the quality of the image (see paragraph 0071).
However, LI does not explicitly disclose wherein a first data parameter of the one or more first data parameters comprises a maximum bitrate transmission capacity of the first device.
Wu discloses wherein a first data parameter of the one or more first data parameters comprises a maximum bitrate transmission capacity of the first device (transmitting the parameters comprising the maximum bitrate of the device – see, including but not limited to, Col. 6, lines 11-35; Col. 10, lines 29-36; Col. 11, lines 5-15).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify LI with the teaching of a first data parameter of the one or more first data parameters comprising a maximum bitrate transmission capacity of the first device, as taught by Wu, in order to yield the predictable result of clarifying the capability of the device.
Regarding claim 43, LI discloses the method of claim 36, wherein the generating of the map data and the continuous updating of the map are performed remote from the first device and the second device (see paragraph 0087).
However, LI does not explicitly disclose the generating the map and the updating of the map are performed by a SLAM network edge service processor of the one or more servers.
Burbank discloses that the generating of the map and the updating of the map are performed by a SLAM network edge service processor remote from the first device and the second device (the controller generates and updates the map by performing a SLAM technique remote from the robot, where the SLAM technique includes using processing circuitry – see, including but not limited to, paragraphs 0003, 0052, 0104).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify LI with the teaching of the generating of the map and the updating of the map being performed by a SLAM network edge service processor remote from the first device, wherein the SLAM network edge service processor comprises the processing circuitry, as taught by Burbank, in order to yield the predictable result of better extracting the features of the environment (see paragraph 0001).
Regarding claim 44, LI discloses the method of claim 36, wherein the generating the map and the updating of the map are performed by a device (paragraph 0002).
However, LI does not explicitly disclose the method of claim 36, wherein the generating the map and the updating of the map are performed by a SLAM-enabled device.
Burbank discloses the method of claim 36, wherein the generating of the map and the updating of the map data are performed by a SLAM-enabled device (construct map data using a SLAM-enabled device – see, including but not limited to, paragraphs 0053, 0086, 0115).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify LI with the teaching of the generating of the map and the updating of the map being performed by a SLAM-enabled device, as taught by Burbank, in order to yield the predictable result of better extracting the features of the environment.
Regarding claim 47, LI discloses the method of claim 36.
However, Li does not disclose wherein the first device is a wearable virtual reality device or augmented reality device.
Burbank discloses the first device is a wearable virtual reality device or augmented reality device (the mobile device can be a virtual reality headset – see paragraph 0056).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify LI with the teaching of the first device being a wearable virtual reality device or augmented reality device, as taught by Burbank, in order to yield the predictable result of giving the user a better field of view of the environment.
Regarding claim 48, LI discloses the method of claim 36, wherein the first device is a vehicle.
However, Li does not disclose the method of claim 36, wherein the first device is an autonomous vehicle, a drone, or a robot.
Burbank discloses the first device is an autonomous vehicle, a drone, or a robot (the mobile device can be a robot – see paragraph 0026).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify LI with the teaching of the first device being an autonomous vehicle, a drone, or a robot, as taught by Burbank, in order to yield the predictable result of further implementing the autonomous capability of the device.
Regarding claim 50, LI discloses the system of claim 49, wherein a first data parameter of the one or more first data parameters comprises an image resolution of the first device (the navigation map data include the image quality of the in-vehicle terminal – see paragraphs 0079, 0087).
However, LI does not explicitly disclose a first data parameter of the one or more first data parameters comprises a maximum image resolution capacity of a capturing sensor of the first device.
Wu discloses a first data parameter of the one or more first data parameters comprising a maximum image resolution capacity of a capturing sensor of the first device (transmitting the parameters comprising the maximum image resolution capacity of the device – see, including but not limited to, Col. 6, lines 24-26; Col. 10, lines 30-36; Col. 11, lines 5-15).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify LI with the teaching of a first data parameter of the one or more first data parameters comprising a maximum image resolution capacity of a capturing sensor of the first device, as taught by Wu, in order to yield the predictable result of clarifying the capability of the device.
Regarding claim 51, LI discloses the system of claim 49, wherein a first data parameter of the one or more first data parameters comprises the image quality of a capturing sensor of the first device (the navigation map data include the image quality of the in-vehicle terminal – see paragraphs 0079, 0087).
However, LI does not explicitly disclose wherein a first data parameter of the one or more first data parameters comprises a maximum framerate capacity of a capturing sensor of the first device.
Wu discloses a first data parameter of the one or more first data parameters comprising a maximum framerate capacity of a capturing sensor of the first device (transmitting the parameters comprising the maximum framerate capacity of the device – see, including but not limited to, Col. 6, lines 24-26; Col. 10, lines 29-36; Col. 11, lines 5-15).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify LI with the teaching of a first data parameter of the one or more first data parameters comprising a maximum framerate capacity of a capturing sensor of the first device, as taught by Wu, in order to yield the predictable result of clarifying the capability of the device.
Regarding claim 52, LI discloses the system of claim 49, wherein a first data parameter of the one or more first data parameters comprises an image data throughput of the first device (see paragraph 0071).
However, LI does not explicitly disclose wherein a first data parameter of the one or more first data parameters comprises a codec image data throughput of the first device.
Wu discloses wherein a first data parameter of the one or more first data parameters comprises a codec image data throughput of the first device (the parameters include codec image data – see Col. 12, lines 31-49).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify LI with the teaching of a first data parameter of the one or more first data parameters comprising a codec image data throughput of the first device, as taught by Wu, in order to yield the predictable result of improving the process of transmitting the obtained data.
Regarding claim 53, LI discloses the system of claim 49, wherein a first data parameter of the one or more first data parameters comprises the quality of the image (see paragraph 0071).
However, LI does not explicitly disclose wherein a first data parameter of the one or more first data parameters comprises a maximum bitrate transmission capacity of the first device.
Wu discloses wherein a first data parameter of the one or more first data parameters comprises a maximum bitrate transmission capacity of the first device (transmitting the parameters comprising the maximum bitrate of the device – see, including but not limited to, Col. 6, lines 11-35; Col. 10, lines 29-36; Col. 11, lines 5-15).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify LI with the teaching of a first data parameter of the one or more first data parameters comprising a maximum bitrate transmission capacity of the first device, as taught by Wu, in order to yield the predictable result of clarifying the capability of the device.
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant’s disclosure:
Romain et al. (US 20210404829) discloses a map system for updating a region of a host map based on sensor data received from a plurality of connected vehicles travelling in the region.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to AI KIM TRAN whose telephone number is (703)756-5911. The examiner can normally be reached Thursday 8:00 am - 5:00 pm. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Christian Chace can be reached on (571) 272-4190. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/A.K.T./Examiner, Art Unit 3665
/CHRISTIAN CHACE/Supervisory Patent Examiner, Art Unit 3665