DETAILED ACTION
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
Specification
The title of the invention is not descriptive. A new title is required that is clearly indicative of the invention to which the claims are directed.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1, 12, 15-16, and 19 are rejected under 35 U.S.C. 103 as being unpatentable over Jeong et al. (US 2024/0149921) in view of Roose et al. (US 10635844), Anderson et al. (US 2024/0137651), and Tariq et al. (US 2021/0201464).
Regarding claim 1, Jeong teaches/suggests: An apparatus for controlling a vehicle (Jeong Fig. 18: vehicle 1800), the apparatus comprising:
a sensor (Jeong Fig. 18: sensor 1810);
a processor (Jeong Fig. 18: processor 1820); and
a memory configured to store (Jeong Fig. 18: memory 1830),
wherein the processor is configured to:
determine, based on a plurality of frames obtained by using the sensor, a plane formed by a first axis and a second axis, the second axis corresponding to a driving direction of the vehicle, and the first axis being perpendicular to the second axis (Jeong [0107] “an electronic device may collect sensing data during driving” [0059] “a position of the moving object 310 on a longitudinal axis (e.g., an x-axis)” [0061] “the ROI 320 may have the width of two lanes (e.g., the driving lane 381 and a left lane of the driving lane 381) in a lateral direction (e.g., the y-axis direction)” [The y- and x-axis meet the first and second axes, respectively.]);
detect objects, external to the vehicle, in regions of interest of the plane (Jeong [0108] “the electronic device may detect an object from the sensing data” [0068] “determine an available sensing region 360 of the sensor to be a region that is not occluded by an obstacle object 390 among an overlapping region of the ROI 320 and a sensing range 350 of the sensor”);
determine virtual boxes corresponding to the objects (Jeong Fig. 15: the illustrated virtual boxes);
generate a first field of view (FOV) based on the virtual boxes (Jeong [0049] “The sensing range (e.g., an FOV) of the sensor mounted on the autonomous vehicle may be limited by various surrounding environments” [0126] “The electronic device may secure an additional sensing region 1550 by driving with a target position offset in which a distance to the obstacle object 1590 increases” [The sensing region meets the first FOV.]);
determine a state of a boundary region of the sensor (Jeong [0110] “The electronic device in an example may monitor an FOV of the sensor during driving … when a ratio of the unavailable sensing region to the total sensing area of the sensor exceeds a threshold”);
Jeong does not teach/suggest:
determine ground points indicating a ground on the plane;
generate a second FOV based on the ground points;
Roose, however, teaches/suggests:
determine ground points indicating a ground on the plane (Roose col. 20 ll. 37-61 “calculate the region on the ground viewed by the camera's field of view out to the detector's maximum range”);
generate a second FOV based on the ground points (Roose col. 20 ll. 37-61 “calculate the region on the ground viewed by the camera's field of view out to the detector's maximum range” [The maximum range meets the second FOV.]);
Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to modify the FOV of Jeong to include the maximum range of Roose for the monitoring.
Jeong and Roose are silent regarding specification information of the sensor. Jeong and Roose are further silent regarding:
generate a third FOV based on at least one of: the first FOV, the second FOV, or angle information included in the specification information;
determine a state of a boundary region of the sensor based on the third FOV;
Anderson, however, teaches/suggests specification information of the sensor (Anderson [0067] “The memory 118 may store a third threshold range that includes angles greater than 135 degrees ... the amount of preset effective FOV values stored in the memory 118”). Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to modify the FOV of Jeong as modified by Roose to be stored as taught/suggested by Anderson for the monitoring. As such, Jeong as modified by Roose and Anderson teaches/suggests:
generate a third FOV based on at least one of: the first FOV, the second FOV, or angle information included in the specification information (Jeong [0049] “The sensing range (e.g., an FOV) of the sensor mounted on the autonomous vehicle may be limited by various surrounding environments” [0126] “The electronic device may secure an additional sensing region 1550 by driving with a target position offset in which a distance to the obstacle object 1590 increases” Roose col. 20 ll. 37-61 “calculate the region on the ground viewed by the camera's field of view out to the detector's maximum range” Anderson [0067] “The memory 118 may store a third threshold range that includes angles greater than 135 degrees ... the amount of preset effective FOV values stored in the memory 118”);
determine a state of a boundary region of the sensor based on the third FOV (Jeong [0110] “The electronic device in an example may monitor an FOV of the sensor during driving … when a ratio of the unavailable sensing region to the total sensing area of the sensor exceeds a threshold” Anderson [0067] “The memory 118 may store a third threshold range that includes angles greater than 135 degrees ... the amount of preset effective FOV values stored in the memory 118”);
Jeong as modified by Roose and Anderson does not teach/suggest:
output a signal indicating the determined state of the boundary region of the sensor.
Tariq, however, teaches/suggests:
output a signal indicating the determined state of the boundary region of the sensor (Tariq [0074] “one or more operations of the autonomous vehicle 102 may be controlled in response to the detection of the degradation” [0079] “an automated cleaning process may be initiated to clean a surface of the sensor 104”).
Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to modify the monitoring of Jeong as modified by Roose and Anderson to be reported as taught/suggested by Tariq for automated cleaning.
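Purely for illustration of the mapped limitations (a minimal sketch with hypothetical function names, angles, and a hypothetical threshold; it is not part of the record and is not drawn from any cited reference), the monitoring flow recited in claim 1 may be summarized as:

```python
def boundary_state(first_fov_deg, second_fov_deg, spec_angle_deg, threshold_deg=5.0):
    # Hypothetical combination rule: derive a third FOV from the first FOV
    # (from the virtual boxes), the second FOV (from the ground points),
    # and the angle information in the specification information.
    third_fov_deg = min(first_fov_deg, second_fov_deg, spec_angle_deg)
    # Determine the state of the boundary region of the sensor based on the
    # third FOV by comparing it against the specified angle with a threshold angle.
    if abs(spec_angle_deg - third_fov_deg) <= threshold_deg:
        return "normal"
    return "abnormal"

# Output a signal indicating the determined state (illustrative values only).
print(boundary_state(118.0, 119.0, 120.0))  # prints "normal"
print(boundary_state(100.0, 119.0, 120.0))  # prints "abnormal"
```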
Regarding claim 12, Jeong as modified by Roose, Anderson, and Tariq teaches/suggests: The apparatus of claim 1, wherein the processor is configured to:
determine a normal state of the boundary region of the sensor based on a difference between the third FOV and the angle information satisfying a threshold angle (Jeong [0110] “The electronic device in an example may monitor an FOV of the sensor during driving … when a ratio of the unavailable sensing region to the total sensing area of the sensor exceeds a threshold” Anderson [0067] “The memory 118 may store a third threshold range that includes angles greater than 135 degrees ... the amount of preset effective FOV values stored in the memory 118” [In view of Jeong and Anderson, the ratio not exceeding the threshold meets the normal state.]).
The same rationale to combine as set forth in the rejection of claim 1 above is incorporated herein.
Regarding claim 15, Jeong as modified by Roose, Anderson, and Tariq teaches/suggests the apparatus of claim 1, but does not teach/suggest: wherein the processor is configured to:
before determining a contamination state of the sensor and including a contamination level of the sensor in the specification information, determine a state of the boundary region based on at least one of: the first FOV, the second FOV, the third FOV, or the angle information, wherein the contamination state is determined based on the contamination level satisfying a threshold value.
However, the concept and advantages of including a contamination level of the sensor in the specification information are well known and expected in the art (Official Notice). It would have been obvious for the contamination level of the sensor of Jeong as modified by Roose, Anderson, and Tariq to be determined and stored during factory installation to account for known contamination.
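As a minimal illustrative sketch of the claim 15 limitation only (hypothetical field names and threshold value; not part of the record and not drawn from any cited reference), the contamination-state determination reduces to a threshold comparison on a stored value:

```python
# Hypothetical: specification information stored at factory installation,
# including a contamination level of the sensor.
spec_info = {"angle_deg": 120.0, "contamination_level": 0.7}

def contamination_state(spec, threshold=0.5):
    # The contamination state is determined based on the contamination
    # level satisfying (here, meeting or exceeding) a threshold value.
    if spec["contamination_level"] >= threshold:
        return "contaminated"
    return "clean"

print(contamination_state(spec_info))  # 0.7 >= 0.5 -> prints "contaminated"
```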
Claim 16 recites limitation(s) similar in scope to those of claim 1, and is rejected for the same reason(s).
Regarding claim 19, Jeong as modified by Roose, Anderson, and Tariq teaches/suggests: The method of claim 16, further comprising:
determining a normal state of the boundary region of the sensor based on a difference between the third FOV and the angle information satisfying a threshold angle (Jeong [0110] “The electronic device in an example may monitor an FOV of the sensor during driving … when a ratio of the unavailable sensing region to the total sensing area of the sensor exceeds a threshold” Anderson [0067] “The memory 118 may store a third threshold range that includes angles greater than 135 degrees ... the amount of preset effective FOV values stored in the memory 118” [In view of Jeong and Anderson, the ratio not exceeding the threshold meets the normal state.]); or
determining an abnormal state of the boundary region of the sensor based on the difference between the third FOV and the angle information not satisfying the threshold angle (Jeong [0110] “The electronic device in an example may monitor an FOV of the sensor during driving … when a ratio of the unavailable sensing region to the total sensing area of the sensor exceeds a threshold” Anderson [0067] “The memory 118 may store a third threshold range that includes angles greater than 135 degrees ... the amount of preset effective FOV values stored in the memory 118” [In view of Jeong and Anderson, the ratio exceeding the threshold meets the abnormal state.]).
The same rationale to combine as set forth in the rejection of claim 1 above is incorporated herein.
Claims 2, 5, and 13-14 are rejected under 35 U.S.C. 103 as being unpatentable over Jeong et al. (US 2024/0149921) in view of Roose et al. (US 10635844), Anderson et al. (US 2024/0137651), and Tariq et al. (US 2021/0201464) as applied to claim 1 above, and further in view of Russell (US 2017/0357270).
Regarding claim 2, Jeong as modified by Roose, Anderson, and Tariq teaches/suggests the apparatus of claim 1, but the combination is silent regarding: wherein the processor is configured to:
specify a first region of interest (ROI) of the regions of interest, wherein the first ROI includes a region, which is between: a line rotated by a first angle from a half-line facing a positive direction of the second axis, and the half-line; and
determine virtual boxes in the first ROI.
Russell, however, teaches/suggests:
specify a first region of interest (ROI) of the regions of interest, wherein the first ROI includes a region, which is between: a line rotated by a first angle from a half-line facing a positive direction of the second axis, and the half-line (Russell [0104] “An angle of sensor 505 with respect to pallet jack 500 in position 506b may comprise a minimum sensor angle that results in the extent of the field of view of the sensor 505 intersecting the position of object 510”).
Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to modify the FOV of Jeong as modified by Roose, Anderson, and Tariq to include the minimum sensor angle (the first angle) of Russell for the monitoring. As such, Jeong as modified by Roose, Anderson, Tariq, and Russell teaches/suggests:
determine virtual boxes in the first ROI (Jeong Fig. 15: the illustrated virtual boxes; Russell [0104] “An angle of sensor 505 with respect to pallet jack 500 in position 506b may comprise a minimum sensor angle that results in the extent of the field of view of the sensor 505 intersecting the position of object 510”).
Regarding claim 5, Jeong as modified by Roose, Anderson, Tariq, and Russell teaches/suggests: The apparatus of claim 2, wherein the processor is configured to:
specify a second ROI including a region, which is between: a line rotated by a second angle, greater than the first angle, from the half-line, and another line rotated by a third angle from the half-line (Russell [0104] “An angle of sensor 505 with respect to pallet jack 500 in position 506b may comprise a minimum sensor angle that results in the extent of the field of view of the sensor 505 intersecting the position of object 510” [The maximum sensor angle meets the second angle.]); and
determine virtual boxes in the second ROI (Jeong Fig. 15: the illustrated virtual boxes; Russell [0104] “An angle of sensor 505 with respect to pallet jack 500 in position 506b may comprise a minimum sensor angle that results in the extent of the field of view of the sensor 505 intersecting the position of object 510”).
The same rationale to combine as set forth in the rejection of claim 2 above is incorporated herein.
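Purely as an illustrative geometric sketch of the angular-sector ROIs recited in claims 2 and 5 (hypothetical angles and names; not part of the record), membership of a detected point in such an ROI can be tested by its bearing from the half-line facing the positive direction of the second axis:

```python
import math

def in_roi(x, y, lo_deg, hi_deg):
    # x lies along the second (driving) axis, y along the first axis;
    # the bearing is measured from the half-line facing positive x.
    bearing = math.degrees(math.atan2(y, x))
    return lo_deg <= bearing <= hi_deg

first_roi = (0.0, 30.0)    # half-line up to the line rotated by the first angle
second_roi = (45.0, 90.0)  # lines rotated by the second and third angles

print(in_roi(10.0, 2.0, *first_roi))   # bearing ~ 11.3 deg -> True
print(in_roi(1.0, 5.0, *second_roi))   # bearing ~ 78.7 deg -> True
```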
Regarding claim 13, Jeong as modified by Roose, Anderson, Tariq, and Russell teaches/suggests: The apparatus of claim 5, wherein the processor is configured to:
determine an abnormal state of the boundary region of the sensor based on a difference between the third FOV and the angle information satisfying a threshold angle (Jeong [0110] “The electronic device in an example may monitor an FOV of the sensor during driving … when a ratio of the unavailable sensing region to the total sensing area of the sensor exceeds a threshold” Anderson [0067] “The memory 118 may store a third threshold range that includes angles greater than 135 degrees ... the amount of preset effective FOV values stored in the memory 118” [In view of Jeong and Anderson, the ratio exceeding the threshold meets the abnormal state.]).
The same rationale to combine as set forth in the rejection of claim 1 above is incorporated herein.
Regarding claim 14, Jeong as modified by Roose, Anderson, Tariq, and Russell teaches/suggests: The apparatus of claim 13, wherein the processor is configured to:
after determining the abnormal state of the boundary region, determine an abnormal state of at least one of: the first ROI, or the second ROI, based on at least one of: the first FOV, the second FOV, the third FOV, or the angle information (Jeong [0110] “The electronic device in an example may monitor an FOV of the sensor during driving … when a ratio of the unavailable sensing region to the total sensing area of the sensor exceeds a threshold” Anderson [0067] “The memory 118 may store a third threshold range that includes angles greater than 135 degrees ... the amount of preset effective FOV values stored in the memory 118” [In view of Jeong and Anderson, the monitoring during the driving meets the determining.]).
The same rationale to combine as set forth in the rejection of claim 1 above is incorporated herein.
Allowable Subject Matter
Claims 3-4, 6-11, 17-18, and 20 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.
The following is a statement of reasons for the indication of allowable subject matter: The limitations “determine a plurality of line segments, each of the plurality of line segments connecting the vehicle and one of a plurality of points included in the virtual boxes in the first ROI” and “store, in the memory, at least one of: first minimum angles comprising the minimum angle, or first minimum angle points, of the plurality of points, for forming each of the first minimum angles, wherein the first minimum angle points are determined in each of the plurality of frames” in claims 3 and 4; “determine a plurality of line segments, each of plurality of line segments connecting the vehicle and one of a plurality of points included in the virtual boxes in the second ROI” and “store, in the memory, at least one of: first maximum angles comprising the maximum angle, or first maximum angle points, of the plurality of points, for forming each of the first maximum angles, wherein the first maximum angle points are determined in each of the plurality of frames” in claims 6 and 7; “determine a plurality of third line segments, each of the plurality of third line segments connecting the vehicle and one of the ground points” and “store, in the memory, at least one of: second minimum angles comprising the minimum angle, or second minimum angle points, of the ground points, for forming each of the second minimum angles, wherein the second minimum angle points are determined in each of the plurality of frames” in claims 8 and 9; “determine a plurality of fourth line segments, each of the plurality of fourth line segments connecting the vehicle and one of the ground points” and “store, in the memory, at least one of: second maximum angles comprising the maximum angle, or second maximum angle points, of the ground points, for forming each of the second maximum angles, wherein the second maximum angle points are determined in each of the plurality of frames” in claims 10 and 11; 
“determining a plurality of first line segments, each of the plurality of first line segments connecting the vehicle and one of a plurality of first points included in the virtual boxes in the first ROI,” “determining a plurality of second line segments, each of the plurality of second line segments connecting the vehicle and one of a plurality of second points included in the virtual boxes in the second ROI,” and “generating the first FOV based on at least one of: a first global minimum angle, which is the smallest of the first minimum angles, a first global minimum angle point, of the first minimum angle points, for forming the first global minimum angle, a first global maximum angle, which is the greatest of the first maximum angles, or a first global maximum angle point, of the first maximum angle points, for forming the first global maximum angle” in claims 17 and 20; and “determining a plurality of third line segments, each of the plurality of third line segments connecting the vehicle and one of the ground points,” “determining a plurality of fourth line segments, each of the plurality of fourth line segments connecting the vehicle and one of the ground points,” and “generating the second FOV based on at least one of: a second global minimum angle, which is the smallest of second minimum angles, a second global minimum angle point, of second minimum angle points, for forming the second global minimum angle, a second global maximum angle, which is the greatest of second maximum angles, or a second global maximum angle point, of second maximum angle points, for forming the second global maximum angle among the second maximum angle points” in claim 18, taken as a whole render the respective claims patentable over the prior art.
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure:
US 2019/0092287 – contamination level of sensor surface
US 2020/0098394 – detect errors in sensor data
US 2020/0244950 – sensor blemish detection
Any inquiry concerning this communication or earlier communications from the examiner should be directed to ANH-TUAN V NGUYEN whose telephone number is 571-270-7513. The examiner can normally be reached on M-F 9AM-5PM ET. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, JASON CHAN can be reached on 571-272-3022. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/ANH-TUAN V NGUYEN/
Primary Examiner, Art Unit 2619