DETAILED ACTION
Notice of Pre-AIA or AIA Status
1. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Double Patenting
2. The nonstatutory double patenting rejection is based on a judicially created doctrine grounded in public policy (a policy reflected in the statute) so as to prevent the unjustified or improper timewise extension of the “right to exclude” granted by a patent and to prevent possible harassment by multiple assignees. A nonstatutory double patenting rejection is appropriate where the conflicting claims are not identical, but at least one examined application claim is not patentably distinct from the reference claim(s) because the examined application claim is either anticipated by, or would have been obvious over, the reference claim(s). See, e.g., In re Berg, 140 F.3d 1428, 46 USPQ2d 1226 (Fed. Cir. 1998); In re Goodman, 11 F.3d 1046, 29 USPQ2d 2010 (Fed. Cir. 1993); In re Longi, 759 F.2d 887, 225 USPQ 645 (Fed. Cir. 1985); In re Van Ornum, 686 F.2d 937, 214 USPQ 761 (CCPA 1982); In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970); In re Thorington, 418 F.2d 528, 163 USPQ 644 (CCPA 1969).
A timely filed terminal disclaimer in compliance with 37 CFR 1.321(c) or 1.321(d) may be used to overcome an actual or provisional rejection based on nonstatutory double patenting provided the reference application or patent either is shown to be commonly owned with the examined application, or claims an invention made as a result of activities undertaken within the scope of a joint research agreement. See MPEP § 717.02 for applications subject to examination under the first inventor to file provisions of the AIA as explained in MPEP § 2159. See MPEP § 2146 et seq. for applications not subject to examination under the first inventor to file provisions of the AIA. A terminal disclaimer must be signed in compliance with 37 CFR 1.321(b).
The filing of a terminal disclaimer by itself is not a complete reply to a nonstatutory double patenting (NSDP) rejection. A complete reply requires that the terminal disclaimer be accompanied by a reply requesting reconsideration of the prior Office action. Even where the NSDP rejection is provisional, the reply must be complete. See MPEP § 804, subsection I.B.1. For a reply to a non-final Office action, see 37 CFR 1.111(a). For a reply to a final Office action, see 37 CFR 1.113(c). A request for reconsideration, while not provided for in 37 CFR 1.113(c), may be filed after final for consideration. See MPEP §§ 706.07(e) and 714.13.
The USPTO Internet website contains terminal disclaimer forms which may be used. Please visit www.uspto.gov/patent/patents-forms. The actual filing date of the application in which the form is filed determines what form (e.g., PTO/SB/25, PTO/SB/26, PTO/AIA/25, or PTO/AIA/26) should be used. A web-based eTerminal Disclaimer may be filled out completely online using web-screens. An eTerminal Disclaimer that meets all requirements is auto-processed and approved immediately upon submission. For more information about eTerminal Disclaimers, refer to www.uspto.gov/patents/apply/applying-online/eterminal-disclaimer.
3. Claims 1-2 of the present application are rejected on the ground of nonstatutory double patenting as being unpatentable over claims 1 and 3, respectively, of U.S. Patent No. 12,112,547 (patent 547). Although the claims at issue are not identical, they are not patentably distinct from each other because the claims of patent 547 recite every limitation of the corresponding present claims together with additional limitations; the present claims are therefore broader than, and anticipated by, the patented claims.
4. The following table shows the correspondence between the claims of the present application and the claims of patent 547.
Claims of present application    Claims of patent 547
1                                1
2                                3
5. The following table shows the correspondence between the limitations of claim 1 of the present application and claim 1 of patent 547.
Claim 1 of present application:
1. An apparatus comprising:
circuitry; and
memory connected to the circuitry,
wherein the circuitry, in operation:
obtains first sensing data from a first mobile apparatus including one or more first sensors, the first sensing data including data generated by sensing using the one or more first sensors;
obtains second sensing data from a second mobile apparatus including one or more second sensors, the second sensing data including data generated by sensing using the one or more second sensors; and
generates synthesized data from the first sensing data and the second sensing data based on position information indicating a position of the second mobile apparatus, and
wherein the position information is estimated from a result of the sensing using the one or more first sensors.

Claim 1 of patent 547:
1. An apparatus comprising:
circuitry; and
memory connected to the circuitry,
wherein the circuitry, in operation:
obtains first sensing data from a first mobile apparatus including one or more first sensors, the first sensing data including data generated by sensing using the one or more first sensors;
obtains second sensing data from a second mobile apparatus including one or more second sensors, the second sensing data including data generated by sensing using the one or more second sensors; and
generates synthesized data from the first sensing data and the second sensing data based on position information indicating a position of the second mobile apparatus,
wherein the position information is a relative position of the second mobile apparatus relative to the first mobile apparatus, and
wherein the position information is estimated from a result of the sensing using the one or more first sensors.
6. Claims 1-2 of the present application are rejected on the ground of nonstatutory double patenting as being unpatentable over claims 1 and 9, respectively, of U.S. Patent No. 11,741,717 (patent 717). Although the claims at issue are not identical, they are not patentably distinct from each other because the claims of patent 717 recite every limitation of the corresponding present claims together with additional limitations; the present claims are therefore broader than, and anticipated by, the patented claims.
7. The following table shows the correspondence between the claims of the present application and the claims of patent 717.
Claims of present application    Claims of patent 717
1                                1
2                                9
8. The following table shows the correspondence between the limitations of claim 1 of the present application and claim 1 of patent 717.
Claim 1 of present application:
1. An apparatus comprising:
circuitry; and
memory connected to the circuitry,
wherein the circuitry, in operation:
obtains first sensing data from a first mobile apparatus including one or more first sensors, the first sensing data including data generated by sensing using the one or more first sensors;
obtains second sensing data from a second mobile apparatus including one or more second sensors, the second sensing data including data generated by sensing using the one or more second sensors; and
generates synthesized data from the first sensing data and the second sensing data based on position information indicating a position of the second mobile apparatus, and
wherein the position information is estimated from a result of the sensing using the one or more first sensors.

Claim 1 of patent 717:
1. A data generator, comprising:
circuitry; and
memory connected to the circuitry,
wherein the circuitry, in operation:
obtains sensing data from each of a plurality of moving bodies that includes a plurality of sensors, the sensing data being configured based on results of sensing by the plurality of sensors; and
generates synthesized data by mapping the sensing data of the moving body into a virtual space,
wherein when generating the synthesized data, the circuitry determines a position of the sensing data to be mapped into the virtual space, based at least on a position of the moving body in a real space corresponding to the sensing data, and
wherein the circuitry further:
obtains, from each of the plurality of moving bodies, position information at a time when the sensing data of the moving body is generated, the position information indicating the position of the moving body in the real space; and
when generating the synthesized data, determines the position of the sensing data in the virtual space obtained from the moving body, based on the position indicated by the position information of the moving body.
Claim Rejections - 35 USC § 103
9. The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
10. The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
11. Claims 1-2 are rejected under 35 U.S.C. 103 as being unpatentable over Du et al. (US Patent Application Publication No. 2019/0052842 A1) in view of Caveney et al. (US Patent Application Publication No. 2017/0291540 A1).
12. Regarding Claim 1, Du discloses An apparatus comprising: circuitry; and memory connected to the circuitry, wherein the circuitry, in operation: (paragraph [0067] reciting “As will be well understood by those skilled in the art, the several and various steps and processes discussed herein to describe the invention may be referring to operations performed by a computer, a processor or other electronic calculating device that manipulate and/or transform data using electrical phenomenon. Those computers and electronic devices may employ various volatile and/or non-volatile memories including non-transitory computer-readable medium with an executable program stored thereon including various code or executable instructions able to be performed by the computer or processor, where the memory and/or computer-readable medium may include all forms and types of memory and other computer-readable media.”)
obtains first sensing data from a first mobile apparatus including one or more first sensors, the first sensing data including data generated by sensing using the one or more first sensors; (paragraph [0058] reciting “… The method is first operative to collect data 710 from onboard sensors such as cameras, LiDAR sensors and radar. The collected data may include a front view from a camera. …” Data 710 corresponds to first sensing data. Lidars/radars/cameras correspond to one or more sensors. The vehicle corresponds to first mobile apparatus.)
obtains second sensing data from a second mobile apparatus including one or more second sensors, the second sensing data including data generated by sensing using the one or more second sensors; and (paragraph [0058] reciting “… The method is then operative to receive data from other sources 720, such as proximate vehicles or infrastructure. The received data may be a camera view from a proximate vehicle. …” Proximate vehicle corresponds to second mobile apparatus. Its cameras/LiDAR sensors/radar correspond to one or more second sensors and the data those collect correspond to the second sensing data.)
generates synthesized data from the first sensing data and the second sensing data based on position information indicating a position of the second mobile apparatus, and (paragraph [0061] reciting “In the proposed system may utilize a vision enhanced V2V arrangement resident on a host vehicle wherein camera systems monitor and conditionally communicate detected vehicles. The longitudinal and lateral offset of target vehicle with respect to the host is converted to global coordinate frame. The system is operative to detect object overlap between V2V and camera and merge the information into a common data structure, such as a proximate object map. The system then conditionally communicate surrogate message, such as basic service message 1 (BSM part 1) to provide situation awareness detail for other V2V vehicles. In addition, the system may be operative to communicate surrogate BSM part 2 on hard braking or other detected events.”
The longitudinal and lateral offset of the target vehicle (second vehicle) with respect to the host (first vehicle) corresponds to position information of the second vehicle. And this information is used to generate the proximate object map which is synthesized data from information from both host and target vehicle.) While not explicitly disclosed by Du, Caveney discloses wherein the position information is estimated from a result of the sensing using the one or more first sensors. (paragraph [0021] reciting “In one or more arrangements, the sensor system 40 can include radar sensor(s) 46. The radar sensor(s) 46 can be any device, component and/or system that can detect something using at least in part radio signals. The radar sensor(s) 46 can be configured to detect the presence of one or more objects in the external environment 12 of the vehicle 10, the position of each detected object relative to the vehicle 10, the distance between each detected object and the vehicle
10 in one or more directions (e.g., in the longitudinal direction α, the lateral direction β and/or other direction(s)), the elevation of each detected object, the speed of each detected object, and/or the movement of each detected object. The radar sensor(s) 46, or data obtained thereby, can determine or be used to determine the speed, position, and/or orientation of objects in the external environment 12 of the vehicle 10. The radar sensor(s) 46 can have three dimensional coordinate data associated with the objects.”
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify Du with Caveney so that the relative position of the target vehicle with respect to the host vehicle is determined using a radar. This modification is beneficial because Du requires the relative position of the target vehicle (its longitudinal and lateral offsets) in order to determine the target vehicle's global position so that the respective camera images can be combined into a proximate object map. Du further discloses that the onboard sensors include radar. Therefore, the radar in Du can be modified by the teachings of Caveney to find the relative position of the target vehicle with respect to the host vehicle.
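The conversion Du describes, mapping the target vehicle's host-relative longitudinal and lateral offsets into a global coordinate frame, amounts to a standard 2D rigid transform. The following sketch is purely illustrative of that step; the function name, frame conventions, and example values are assumptions, not taken from either reference.

```python
import math

def to_global(host_x, host_y, host_heading, longitudinal, lateral):
    """Convert a target vehicle's longitudinal/lateral offsets (host frame,
    meters) into global coordinates, given the host's global pose.
    host_heading is in radians, measured from the global +x axis;
    positive lateral is taken as the host's left side."""
    gx = host_x + longitudinal * math.cos(host_heading) - lateral * math.sin(host_heading)
    gy = host_y + longitudinal * math.sin(host_heading) + lateral * math.cos(host_heading)
    return gx, gy

# Host at the origin heading along +x; target 10 m ahead, 2 m to the left.
print(to_global(0.0, 0.0, 0.0, 10.0, 2.0))  # (10.0, 2.0)
```

Once each detected vehicle's offsets are rotated and translated into the shared global frame this way, observations from the host camera and V2V messages refer to common coordinates and can be merged into a single structure such as Du's proximate object map.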
13. Regarding Claim 2, Du discloses A method (Abstract reciting “A system and method is taught for collaborative vehicle to all (V2X) communications to improve autonomous driving vehicle performance in a heterogeneous capability environment by sharing capabilities among different vehicles. …”) comprising:
obtaining first sensing data from a first mobile apparatus including one or more first sensors, the first sensing data including data generated by sensing using the one or more first sensors; (paragraph [0058] reciting “… The method is first operative to collect data 710 from onboard sensors such as cameras, LiDAR sensors and radar. The collected data may include a front view from a camera. …” Data 710 corresponds to first sensing data. Lidars/radars/cameras correspond to one or more sensors. The vehicle corresponds to first mobile apparatus.)
obtaining second sensing data from a second mobile apparatus including one or more second sensors, the second sensing data including data generated by sensing using the one or more second sensors; (paragraph [0058] reciting “… The method is then operative to receive data from other sources 720, such as proximate vehicles or infrastructure. The received data may be a camera view from a proximate vehicle. …” Proximate vehicle corresponds to second mobile apparatus. Its cameras/LiDAR sensors/radar correspond to one or more second sensors and the data those collect correspond to the second sensing data.)
and generating synthesized data from the first sensing data and the second sensing data based on position information indicating a position of the second mobile apparatus, (paragraph [0061] reciting “In the proposed system may utilize a vision enhanced V2V arrangement resident on a host vehicle wherein camera systems monitor and conditionally communicate detected vehicles. The longitudinal and lateral offset of target vehicle with respect to the host is converted to global coordinate frame. The system is operative to detect object overlap between V2V and camera and merge the information into a common data structure, such as a proximate object map. The system then conditionally communicate surrogate message, such as basic service message 1 (BSM part 1) to provide situation awareness detail for other V2V vehicles. In addition, the system may be operative to communicate surrogate BSM part 2 on hard braking or other detected events.” The longitudinal and lateral offset of the target vehicle (second vehicle) with respect to the host (first vehicle) corresponds to position information of the second vehicle. And this information is used to generate the proximate object map which is synthesized data from information from both host and target vehicle.) While not explicitly disclosed by Du, Caveney discloses wherein the position information is estimated from a result of the sensing using the one or more first sensors. (paragraph [0021] reciting “In one or more arrangements, the sensor system 40 can include radar sensor(s) 46. The radar sensor(s) 46 can be any device, component and/or system that can detect something using at least in part radio signals. The radar sensor(s) 46 can be configured to detect the presence of one or more objects in the external environment 12 of the vehicle 10, the position of each detected object relative to the vehicle 10, the distance between each detected object and the vehicle
10 in one or more directions (e.g., in the longitudinal direction α, the lateral direction β and/or other direction(s)), the elevation of each detected object, the speed of each detected object, and/or the movement of each detected object. The radar sensor(s) 46, or data obtained thereby, can determine or be used to determine the speed, position, and/or orientation of objects in the external environment 12 of the vehicle 10. The radar sensor(s) 46 can have three dimensional coordinate data associated with the objects.”
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify Du with Caveney so that the relative position of the target vehicle with respect to the host vehicle is determined using a radar. This modification is beneficial because Du requires the relative position of the target vehicle (its longitudinal and lateral offsets) in order to determine the target vehicle's global position so that the respective camera images can be combined into a proximate object map. Du further discloses that the onboard sensors include radar. Therefore, the radar in Du can be modified by the teachings of Caveney to find the relative position of the target vehicle with respect to the host vehicle.
CONTACT
Any inquiry concerning this communication or earlier communications from the examiner should be directed to FRANK S CHEN whose telephone number is (571)270-7993. The examiner can normally be reached Mon - Fri 8-11:30 and 1:30-6.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Kee Tung, can be reached at 571-272-7794. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/FRANK S CHEN/Primary Examiner, Art Unit 2611