DETAILED ACTION
This office action is in response to applicant’s remarks filed on July 17, 2025, in application 17/899,234.
Claims 1-6, 8-13, and 15-20, filed on September 6, 2024, are presented for examination. Claims 1, 8, and 15 are amended. Claims 7 and 14 are cancelled.
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Arguments
Reference Zhang et al. (US 2021/0009145), commonly owned, was published on January 14, 2021, which precedes applicant’s effective filing date of August 30, 2022.
Applicant's arguments filed July 17, 2025 have been fully considered but they are not persuasive. Applicant argues that Zhang et al. does not specifically teach loading a software test program to indicate a computation hardware failure and to determine a cause of the computation hardware failure.
Examiner respectfully disagrees. Zhang et al. teach an automatic failure test that may be initiated by transmitting an executable image of a sensor processing application from a host system to a sensor processing unit, and the executable image may include particular diagnostic instructions (para. 32, 37-39). Zhang et al. further teach diagnosing a sensor processing unit and determining whether the sensor processing application operates properly (fig. 4, para. 40). Zhang et al., fig. 3, also shows the DUT including the sensor processing unit that executes script 311 from the host software. Therefore, Zhang teaches loading of testing software, and the testing software contains a script that can indicate a failure and determine the cause of the failure. For these reasons, the rejections are maintained.
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claims 1-6, 8-13, and 15-20 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Zhang et al. (US 2021/0009145).
In regard to claim 1, Zhang et al. teach a computer-implemented method of verifying functionality of computation hardware on a device under test (DUT), the method comprising:
loading a software test program, comprising debugging software, onto the DUT, wherein the DUT comprises a plurality of computation hardware components (para. 32, the automatic factory test may be initiated by transmitting an executable image 312 of a sensor processing application from a host system to a sensor processing unit 200, fig. 4, label 402, para. 40, transmitting and storing executable image to the sensor processing unit (200), fig. 3, sensor processing unit 200, is located within the DUT (see fig. 3, labels DUT 320, and sensor processing unit 200));
executing the software test program on the DUT to test the plurality of computation hardware components, wherein, during the testing, the method further comprises: (para. 33, the sensor processing unit may execute and launch the executable image 312 in the DRAM 201 from the eMMC 202 storage device, fig. 4, label 404, para. 40, at block 404, the processing logic may cause the sensor processing unit 200 to execute and launch the executable image 312 of the sensor processing application in the DRAM 201 of the sensor processing unit 200, para. 29, fig. 2 shows an example sensor processing unit having interfaces for GPS and camera sensors. For additional sensors see [0021-0022]; see [0021] plural processing modules fig. 1B, labels 101A-C for each sensor unit 100)
instructing, by the software test program, one or more devices external to the DUT to provide one or more signals to one or more of the plurality of computation hardware components (para. 34, sensor processing unit 200 may transmit a sequence of predetermined commands to the executed sensor processing application to perform a plurality of sensor data processing operations on sensor data obtained from a plurality of sensors or sensor simulators associated with an autonomous driving vehicle, fig. 4, label 406, para. 40, at block 406, the processing logic may transmit a sequence of predetermined commands to the executed sensor processing application to perform multiple sensor data processing operations on sensor data obtained from multiple sensors or sensor simulators associated with an autonomous driving vehicle, fig. 3, label Test Board 330, para. 31, the DUT 320 may also be connected to a test board 330, which may include a GPS simulator 331 and one or more sensors 332, and sensors 332 may be actual sensors or may be sensor simulators which may simulate various visual, electrical, or other suitable sensors, which shows the sensor signals (e.g., GPS simulator 331 located on a test board 330, which is external to DUT 320)); and
receiving, at the DUT, the one or more signals from the one or more devices external to the DUT (para. 34, sensor processing unit 200 may transmit a sequence of predetermined commands to the executed sensor processing application to perform a plurality of sensor data processing operations on sensor data obtained from a plurality of sensors or sensor simulators associated with an autonomous driving vehicle, fig. 4, label 406, para. 40, at block 406, the processing logic may transmit a sequence of predetermined commands to the executed sensor processing application to perform multiple sensor data processing operations on sensor data obtained from multiple sensors or sensor simulators associated with an autonomous driving vehicle); and
generating, by the software test program executing on the DUT, a set of test results in response to testing the plurality of computation hardware components based on the one or more signals received from the one or more devices external to the DUT (para. 35, the sensor processing unit 200 may compare processing results of the sensor processing operations against expected processing results to determine whether the sensor processing application operates properly, fig. 4, label 408, para. 48, at block 408, the processing logic may compare processing results of the sensor processing operations against expected processing results to determine whether the sensor processing application operates properly), wherein, responsive to the set of test results indicating a computation hardware failure (the sensor processing unit may compare processing results against expected processing results to determine whether the sensor processing application operates properly, para. 35), determining (determine that a PCIe interface on the sensor processing unit is functional, para. 38, determine that a GPS interface or any component on the sensor processing unit is functional, para. 39), by the debugging software, a cause of the computation hardware failure (the automatic failure test may be initiated by transmitting an executable image of a sensor processing application from a host system to a sensor processing unit, the executable image may include the particular diagnostic instructions, para. 32, 37-39, diagnosing a sensor processing unit and determining whether the sensor processing application operates properly, fig. 4, para. 40).
In regard to claim 2, Zhang et al. teach the method of claim 1, wherein at least one of the one or more devices external to the DUT comprises a sensor selected from a group consisting of a camera, a GPS unit, a radar unit, and a LIDAR unit (para. 24, sensor processing modules configured to process sensor data obtained from sensors, para. 21, sensors can include a variety of sensors such as, a camera, a LIDAR device, a RADAR device, a GPS receiver, an IMU, an ultrasonic sensor, a GNSS receiver, an LTE or cellular SIM card, vehicle sensors, and system sensors, etc.).
In regard to claim 3, Zhang et al. teach the method of claim 2, wherein the software test program analyzes at least a portion of the plurality of computation hardware components while in a runtime environment (para. 50, functions of autonomous driving vehicle may be controlled or managed by perception and planning system when operating in an autonomous driving mode).
In regard to claim 4, Zhang et al. teach the method of claim 1, further comprising: setting one of a plurality of levels to test the DUT, wherein one of the plurality of levels is a full functional level coverage test (para. 15, a factory test that covers all or some sensor components and communication interfaces).
In regard to claim 5, Zhang et al. teach the method of claim 1, further comprising: displaying, on a display monitor, one or more vision-based condition patterns of the DUT during the testing of the plurality of computation hardware components; and capturing, by a camera, the one or more vision-based condition patterns displayed on the display monitor (para. 57, perception module may include a computer vision system to process and analyze images captured by one or more cameras).
In regard to claim 6, Zhang et al. teach the method of claim 1, wherein the software test program is loaded onto a memory that is located on a same printed circuit board as at least a portion of the plurality of computation hardware components (para. 15, the script may include instructions to diagnose a sensor processing unit located on an FPGA board, and the instructions may include steps to test various I/O interfaces as well as other components).
In regard to claim 8, Zhang et al. teach a non-transitory machine-readable medium having instructions stored therein, which when executed by a processor, cause the processor to perform operations, the operations comprising:
loading a software test program, comprising debugging software, onto a device under test (DUT), wherein the DUT comprises a plurality of computation hardware components (para. 32, the automatic factory test may be initiated by transmitting an executable image 312 of a sensor processing application from a host system to a sensor processing unit 200, fig. 4, label 402, para. 40, transmitting and storing executable image to the sensor processing unit (200), fig. 3, sensor processing unit 200, is located within the DUT (see fig. 3, labels DUT 320, and sensor processing unit 200));
executing the software test program on the DUT to test the plurality of computation hardware components, wherein, during the testing (para. 33, the sensor processing unit may execute and launch the executable image 312 in the DRAM 201 from the eMMC 202 storage device, fig. 4, label 404, para. 40, at block 404, the processing logic may cause the sensor processing unit 200 to execute and launch the executable image 312 of the sensor processing application in the DRAM 201 of the sensor processing unit 200, para. 29, fig. 2 shows an example sensor processing unit having interfaces for GPS and camera sensors. For additional sensors see [0021-0022]; see [0021] plural processing modules fig. 1B, labels 101A-C for each sensor unit 100):
the software test program instructs one or more devices external to the DUT to provide one or more signals to one or more of the plurality of computation hardware components (para. 34, sensor processing unit 200 may transmit a sequence of predetermined commands to the executed sensor processing application to perform a plurality of sensor data processing operations on sensor data obtained from a plurality of sensors or sensor simulators associated with an autonomous driving vehicle, fig. 4, label 406, para. 40, at block 406, the processing logic may transmit a sequence of predetermined commands to the executed sensor processing application to perform multiple sensor data processing operations on sensor data obtained from multiple sensors or sensor simulators associated with an autonomous driving vehicle, fig. 3, label Test Board 330, para. 31, the DUT 320 may also be connected to a test board 330, which may include a GPS simulator 331 and one or more sensors 332, and sensors 332 may be actual sensors or may be sensor simulators which may simulate various visual, electrical, or other suitable sensors, which shows the sensor signals (e.g., GPS simulator 331 located on a test board 330, which is external to DUT 320)); and
the DUT receives the one or more signals from the one or more devices external to the DUT (para. 34, sensor processing unit 200 may transmit a sequence of predetermined commands to the executed sensor processing application to perform a plurality of sensor data processing operations on sensor data obtained from a plurality of sensors or sensor simulators associated with an autonomous driving vehicle, fig. 4, label 406, para. 40, at block 406, the processing logic may transmit a sequence of predetermined commands to the executed sensor processing application to perform multiple sensor data processing operations on sensor data obtained from multiple sensors or sensor simulators associated with an autonomous driving vehicle); and
generating, by the software test program executing on the DUT, a set of test results in response to testing the plurality of computation hardware components based on the one or more signals received from the one or more devices external to the DUT (para. 35, the sensor processing unit 200 may compare processing results of the sensor processing operations against expected processing results to determine whether the sensor processing application operates properly, fig. 4, label 408, para. 48, at block 408, the processing logic may compare processing results of the sensor processing operations against expected processing results to determine whether the sensor processing application operates properly), wherein, responsive to the set of test results indicating a computation hardware failure (the sensor processing unit may compare processing results against expected processing results to determine whether the sensor processing application operates properly, para. 35), determining (determine that a PCIe interface on the sensor processing unit is functional, para. 38, determine that a GPS interface or any component on the sensor processing unit is functional, para. 39), by the debugging software, a cause of the computation hardware failure (the automatic failure test may be initiated by transmitting an executable image of a sensor processing application from a host system to a sensor processing unit, the executable image may include the particular diagnostic instructions, para. 32, 37-39, diagnosing a sensor processing unit and determining whether the sensor processing application operates properly, fig. 4, para. 40).
In regard to claim 9, Zhang et al. teach the non-transitory machine-readable medium of claim 8, wherein at least one of the one or more devices external to the DUT comprises a sensor selected from a group consisting of a camera, a GPS unit, a radar unit, and a LIDAR unit (para. 24, sensor processing modules configured to process sensor data obtained from sensors, para. 21, sensors can include a variety of sensors such as, a camera, a LIDAR device, a RADAR device, a GPS receiver, an IMU, an ultrasonic sensor, a GNSS receiver, an LTE or cellular SIM card, vehicle sensors, and system sensors, etc.).
In regard to claim 10, Zhang et al. teach the non-transitory machine-readable medium of claim 9, wherein the software test program analyzes at least a portion of the plurality of computation hardware components while in a runtime environment (para. 50, functions of autonomous driving vehicle may be controlled or managed by perception and planning system when operating in an autonomous driving mode).
In regard to claim 11, Zhang et al. teach the non-transitory machine-readable medium of claim 8, wherein the operations further comprise: setting one of a plurality of levels to test the DUT, wherein one of the plurality of levels is a full functional level coverage test (para. 15, a factory test that covers all or some sensor components and communication interfaces).
In regard to claim 12, Zhang et al. teach the non-transitory machine-readable medium of claim 8, wherein the operations further comprise: displaying, on a display monitor, one or more vision-based condition patterns of the DUT during the testing of the plurality of computation hardware components; and capturing, by a camera, the one or more vision-based condition patterns displayed on the display monitor (para. 57, perception module may include a computer vision system to process and analyze images captured by one or more cameras).
In regard to claim 13, Zhang et al. teach the non-transitory machine-readable medium of claim 8, wherein the software test program is loaded onto a memory that is located on a same printed circuit board as at least a portion of the plurality of computation hardware components (para. 15, script may include instructions to diagnose a sensor processing unit located on an FPGA board, the instructions may include steps to test various I/O interfaces as well as other components).
In regard to claim 15, Zhang et al. teach a system comprising:
a processing device (sensor processing unit, fig. 2, 200); and
a memory to store instructions (DRAM, fig. 2, 201) that, when executed by the processing device, cause the processing device to:
load a software test program, comprising debugging software, onto a device under test (DUT), wherein the DUT comprises a plurality of computation hardware components (para. 32, the automatic factory test may be initiated by transmitting an executable image 312 of a sensor processing application from a host system to a sensor processing unit 200, fig. 4, label 402, para. 40, transmitting and storing executable image to the sensor processing unit (200), fig. 3, sensor processing unit 200, is located within the DUT (see fig. 3, labels DUT 320, and sensor processing unit 200));
execute the software test program on the DUT to test the plurality of computation hardware components, wherein, during the testing (para. 33, the sensor processing unit may execute and launch the executable image 312 in the DRAM 201 from the eMMC 202 storage device, fig. 4, label 404, para. 40, at block 404, the processing logic may cause the sensor processing unit 200 to execute and launch the executable image 312 of the sensor processing application in the DRAM 201 of the sensor processing unit 200, para. 29, fig. 2 shows an example sensor processing unit having interfaces for GPS and camera sensors. For additional sensors see [0021-0022]; see [0021] plural processing modules fig. 1B, labels 101A-C for each sensor unit 100):
the software test program instructs one or more devices external to the DUT to provide one or more signals to one or more of the plurality of computation hardware components (para. 34, sensor processing unit 200 may transmit a sequence of predetermined commands to the executed sensor processing application to perform a plurality of sensor data processing operations on sensor data obtained from a plurality of sensors or sensor simulators associated with an autonomous driving vehicle, fig. 4, label 406, para. 40, at block 406, the processing logic may transmit a sequence of predetermined commands to the executed sensor processing application to perform multiple sensor data processing operations on sensor data obtained from multiple sensors or sensor simulators associated with an autonomous driving vehicle, fig. 3, label Test Board 330, para. 31, the DUT 320 may also be connected to a test board 330, which may include a GPS simulator 331 and one or more sensors 332, and sensors 332 may be actual sensors or may be sensor simulators which may simulate various visual, electrical, or other suitable sensors, which shows the sensor signals (e.g., GPS simulator 331 located on a test board 330, which is external to DUT 320)); and
the DUT receives the one or more signals from the one or more devices external to the DUT (para. 34, sensor processing unit 200 may transmit a sequence of predetermined commands to the executed sensor processing application to perform a plurality of sensor data processing operations on sensor data obtained from a plurality of sensors or sensor simulators associated with an autonomous driving vehicle, fig. 4, label 406, para. 40, at block 406, the processing logic may transmit a sequence of predetermined commands to the executed sensor processing application to perform multiple sensor data processing operations on sensor data obtained from multiple sensors or sensor simulators associated with an autonomous driving vehicle); and
generate, by the software test program executing on the DUT, a set of test results in response to testing the plurality of computation hardware components based on the one or more signals received from the one or more devices external to the DUT (para. 35, the sensor processing unit 200 may compare processing results of the sensor processing operations against expected processing results to determine whether the sensor processing application operates properly, fig. 4, label 408, para. 48, at block 408, the processing logic may compare processing results of the sensor processing operations against expected processing results to determine whether the sensor processing application operates properly), wherein, responsive to the set of test results indicating a computation hardware failure (the sensor processing unit may compare processing results against expected processing results to determine whether the sensor processing application operates properly, para. 35), determining (determine that a PCIe interface on the sensor processing unit is functional, para. 38, determine that a GPS interface or any component on the sensor processing unit is functional, para. 39), by the debugging software, a cause of the computation hardware failure (the automatic failure test may be initiated by transmitting an executable image of a sensor processing application from a host system to a sensor processing unit, the executable image may include the particular diagnostic instructions, para. 32, 37-39, diagnosing a sensor processing unit and determining whether the sensor processing application operates properly, fig. 4, para. 40).
In regard to claim 16, Zhang et al. teach the system of claim 15, wherein at least one of the one or more devices external to the DUT comprises a sensor selected from a group consisting of a camera, a GPS unit, a radar unit, and a LIDAR unit (para. 24, sensor processing modules configured to process sensor data obtained from sensors, para. 21, sensors can include a variety of sensors such as, a camera, a LIDAR device, a RADAR device, a GPS receiver, an IMU, an ultrasonic sensor, a GNSS receiver, an LTE or cellular SIM card, vehicle sensors, and system sensors, etc.).
In regard to claim 17, Zhang et al. teach the system of claim 16, wherein the software test program analyzes at least a portion of the plurality of computation hardware components while in a runtime environment (para. 50, functions of autonomous driving vehicle may be controlled or managed by perception and planning system when operating in an autonomous driving mode).
In regard to claim 18, Zhang et al. teach the system of claim 15, wherein the processing device is further to: set one of a plurality of levels to test the DUT, wherein one of the plurality of levels is a full functional level coverage test (para. 15, a factory test that covers all or some sensor components and communication interfaces).
In regard to claim 19, Zhang et al. teach the system of claim 15, further comprising: a display monitor that displays one or more vision-based condition patterns of the DUT during the testing of the plurality of computation hardware components; and a camera that captures the one or more vision-based condition patterns displayed on the display monitor (para. 57, perception module may include a computer vision system to process and analyze images captured by one or more cameras).
In regard to claim 20, Zhang et al. teach the system of claim 15, wherein the software test program is loaded onto a memory that is located on a same printed circuit board as at least a portion of the plurality of computation hardware components (para. 15, script may include instructions to diagnose a sensor processing unit located on an FPGA board, the instructions may include steps to test various I/O interfaces as well as other components).
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. See PTO-892.
Poppe et al. (US 11,385,285): test controller.
Chandhoke et al. (US 2023/0063629): DUT and external instruments.
Ellis et al. (US 2023/0359548): DUT for deployment.
Bhatnagar et al. (US 2013/0018624): test station loads data onto the DUT.
Gopalan et al. (US 11,636,244): simulation of a device under test.
Wo (US 11,774,323): camera system or sensor array tested as the DUT.
Hellweg et al. (US 2023/0168342): radar target simulators.
Lin et al. (US 2022/0207664): DUT image displays a test pattern captured by a camera.
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to LOAN TRUONG whose telephone number is 408-918-7552. The examiner can normally be reached Monday through Friday, 10 AM-6 PM PST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, Applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Thomas Ashish, can be reached at 571-272-0631. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/Loan L.T. Truong/Primary Examiner, Art Unit 2114 Loan.truong@uspto.gov