Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claims 1-3, 5-10, and 14-20 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Klus (DE 102021121157 A1).
Regarding claim 1, Klus discloses a system comprising:
a set of ultrasonic sensors having respective fields of view that form a collective field of view having a detection range [[fig. 13] shows four sensors with overlapping fields of view], the detection range comprising a first range and a second range [[fig. 13] shows detection ranges USSE1, USSE2, USSE3, USSE4];
a processing system to perform operations including:
determining a first location of a first object within the first range based at least on a trilateration process that is performed based at least on detection of the first object using the set of ultrasonic sensors [[0050] The determination of an intersection point is referred to as trilateration in the following chapters; [0051] In contrast, general 2D space trilateration is based on the intersection points of three circles. The third circle determines which intersection point of the two circles leads to the desired position; [0052] The trilateration of two ultrasonic sensors enables the calculation of the position of an object; [0077] The trilaterations of each channel require three ultrasonic sensors; [fig. 16] shows transmissions and reflections from two objects from SNSB1 to SNSB2/SNSB3; [fig. 13] shows detection of an object O with sensors SNSB1, SNSB2, and SNSB3];
determining a second location of a second object that is ultrasonically detected in the second range using an individual ultrasonic sensor of the set of ultrasonic sensors, the determining of the second location being based at least on a detected distance between the individual ultrasonic sensor and the second object [[0080] If this trilateration does not lead to a solution, the procedure performed by the ultrasonic sensor system also includes a fallback to a single ultrasonic sensor. The method detects an obstacle in this outer area if only the transmitting ultrasonic sensor receives an ultrasonic echo. It first checks whether the ultrasonic echo does not belong to another object by comparing the ultrasonic echo with the distance to objects calculated by the other channel]; and
causing performance of one or more control operations associated with the system [[0003] ultrasonic based obstacle detection systems … an environment map in the form of a point cloud which serves as orientation for the application device, i.e., the robots and/or vehicles … control computers of these application devices then determine for example a target route through the terrain symbolized by the environment map].
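As an illustrative aside, the two-circle trilateration described in Klus ([0048]-[0052]) can be sketched in a few lines. This is a minimal sketch for clarity only; the sensor coordinates and the convention of keeping the positive-y intersection (in front of the sensor array, per the "semicircle in the positive direction" of [0051]) are illustrative assumptions, not details taken from the reference.

```python
import math

def trilaterate_2d(p0, p1, r0, r1):
    """Locate an object from two sensors at p0 and p1 with measured
    echo distances r0 and r1, i.e., the intersection of two circles.
    Returns the intersection on the positive-y side of the baseline,
    or None if the circles do not intersect (trilateration fails)."""
    (x0, y0), (x1, y1) = p0, p1
    d = math.hypot(x1 - x0, y1 - y0)            # baseline between sensors
    if d == 0 or d > r0 + r1 or d < abs(r0 - r1):
        return None                             # no solution -> fallback case
    a = (r0**2 - r1**2 + d**2) / (2 * d)        # distance from p0 along baseline
    h = math.sqrt(max(r0**2 - a**2, 0.0))       # perpendicular offset
    mx = x0 + a * (x1 - x0) / d                 # foot of the perpendicular
    my = y0 + a * (y1 - y0) / d
    # the two candidate intersections; keep the one with the larger y
    c1 = (mx + h * (y1 - y0) / d, my - h * (x1 - x0) / d)
    c2 = (mx - h * (y1 - y0) / d, my + h * (x1 - x0) / d)
    return c2 if c2[1] >= c1[1] else c1
```

The `None` return corresponds to the condition in Klus [0080] where "this trilateration does not lead to a solution," triggering the single-sensor fallback addressed by the second-range limitation.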
Regarding claim 2, Klus teaches the system of claim 1, wherein the determining of the second location of the second object based at least on the detected distance is based at least on an arc formed from the individual ultrasonic sensor and having a radius of the detected distance [[0048] Therefore, the position of the object can be visualized as the 2D intersection of two circles … the radii of the two circles are equal to the distance of a reflecting surface; [0051] In contrast, general 2D space trilateration is based on the intersection points of three circles … the ultrasonic sensor of the exemplary test setup generates a semicircle in the positive direction].
Regarding claim 3, Klus teaches the system of claim 2, wherein the determining of the second location of the second object based at least on the arc includes determining that the second object is located anywhere along the arc [[0051] the ultrasonic sensor of the exemplary test setup generates a semicircle in the positive direction].
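To illustrate the single-sensor arc model at issue in claims 2-3, the locus of possible object positions can be sketched as points on a circular arc of radius equal to the detected distance, swept across the sensor's field of view. The sensor pose and field-of-view angle below are assumed values for illustration, not parameters from either reference.

```python
import math

def detection_arc(sensor_xy, facing_rad, dist, fov_rad, n=5):
    """Sample n points on the arc where a single-sensor echo at
    distance `dist` could have originated: radius = detected
    distance, swept across the field of view centered on the
    sensor's facing direction."""
    x, y = sensor_xy
    half = fov_rad / 2.0
    return [(x + dist * math.cos(facing_rad + t),
             y + dist * math.sin(facing_rad + t))
            for t in (-half + i * fov_rad / (n - 1) for i in range(n))]
```

Every sampled point lies at exactly the detected distance from the sensor, which is the sense in which the object "is located anywhere along the arc."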
Regarding claim 5, Klus teaches the system of claim 1, wherein the first range includes parts of the detection range in which at least two or more individual fields of view overlap [[fig. 21] shows overlapping sensor views denoted 1, 2, 3].
Regarding claim 6, Klus teaches the system of claim 1, wherein the second range includes parts of the detection range in which the respective fields of view do not overlap [[fig. 21] shows a non-overlapping view denoted 0].
Regarding claim 7, Klus teaches the system of claim 1, wherein an individual sensor of the set of ultrasonic sensors is associated with one or more respective fields of view [[fig. 21] shows multiple overlapping sensor views labeled (0,1) (0,1,2) (0,1,2,3) and so on].
Regarding claim 8, Klus discloses a method comprising:
obtaining sensor data indicating an object that is ultrasonically detected using a single ultrasonic sensor [[0080] If this trilateration does not lead to a solution, the procedure performed by the ultrasonic sensor system also includes a fallback to a single ultrasonic sensor. The method detects an obstacle in this outer area if only the transmitting ultrasonic sensor receives an ultrasonic echo. It first checks whether the ultrasonic echo does not belong to another object by comparing the ultrasonic echo with the distance to objects calculated by the other channel];
determining an estimated location of the object based at least on a detected distance of the object from the single ultrasonic sensor, as indicated by the sensor data [[0003] The ultrasonic sensor system sends out an ultrasonic signal, preferably with several ultrasonic pulses, and calculates the distance to an obstacle based on the first echo received. State-of-the-art ultrasonic sensor systems for parking sensors use the same principle]; and
performing one or more operations associated with a machine based at least on the estimated location of the object [[0003] ultrasonic based obstacle detection systems … an environment map in the form of a point cloud which serves as orientation for the application device i.e., the robots and/or vehicles … control computers of these application devices then determine for example a target route through the terrain symbolized by the environment map].
Regarding claim 9, Klus teaches the method of claim 8, wherein the determining of the estimated location of the object based at least on the detected distance is based at least on an arc formed from the single ultrasonic sensor and having a radius of the detected distance [[0051]].
Regarding claim 10, Klus teaches the method of claim 9, wherein the determining of the estimated location of the object based at least on the arc includes determining that the object is located anywhere along the arc [[0048][0051]].
Regarding claim 14, Klus teaches the method of claim 8, further comprising determining a second location of a second object based at least on a trilateration process that is performed with respect to second sensor data corresponding to a set of ultrasonic sensors that includes the single ultrasonic sensor [[0050] The determination of an intersection point is referred to as trilateration in the following chapters; [0051] In contrast, general 2D space trilateration is based on the intersection points of three circles. The third circle determines which intersection point of the two circles leads to the desired position; [0052] The trilateration of two ultrasonic sensors enables the calculation of the position of an object; [0077] The trilaterations of each channel require three ultrasonic sensors].
Regarding claim 15, Klus teaches the method of claim 8, further comprising determining a second location of a second object based at least on a trilateration process that is performed with respect to second sensor data corresponding to a set of ultrasonic sensors that excludes the single ultrasonic sensor [[0004] calculates trilaterations based on multiple received ultrasonic echoes from multiple ultrasonic sensors when measuring across multiple channels; [0080]].
Regarding claim 16, Klus teaches a system comprising:
one or more processing units to perform operations comprising:
obtaining sensor data indicating an object that is detected using less than two ultrasonic sensors [[fig. 13][0080]];
determining an estimated location of the object based at least on a detected distance, as indicated by the sensor data, of the object from an individual ultrasonic sensor corresponding to detection of the object; and
causing performance of one or more operations associated with the system based at least on the estimated location of the object [[0003][0050][0051][0052][0077]].
Regarding claim 17, Klus teaches the system of claim 16, wherein the operations further comprise determining a second location of a second object based at least on a trilateration process that is performed with respect to second sensor data corresponding to a set of ultrasonic sensors that includes the individual ultrasonic sensor [[0048][0051]].
Regarding claim 18, Klus teaches the system of claim 16, wherein the operations further comprise determining a second location of a second object based at least on a trilateration process that is performed with respect to second sensor data corresponding to a set of ultrasonic sensors that excludes the individual ultrasonic sensor [[0004] calculates trilaterations based on multiple received ultrasonic echoes from multiple ultrasonic sensors when measuring across multiple channels; [0080]].
Regarding claim 19, Klus teaches the system of claim 16, wherein the determining of the estimated location of the object based at least on the detected distance is based at least on an arc formed from the individual ultrasonic sensor and having a radius of the detected distance [[0051] the ultrasonic sensor of the exemplary test setup generates a semicircle in the positive direction].
Regarding claim 20, Klus teaches the system of claim 16, wherein the system is comprised in at least one of: a control system for an autonomous or semi-autonomous machine [[0001] ultrasonic sensor system for use in autonomous vehicles]; a perception system for an autonomous or semi-autonomous machine; a system for performing simulation operations; a system for performing digital twin operations; a system for performing light transport simulation; a system for performing collaborative content creation for 3D assets; a system for performing deep learning operations; a system for presenting at least one of augmented reality content, virtual reality content, or mixed reality content; a system for hosting one or more real-time streaming applications; a system implemented using an edge device; a system implemented using a robot; a system for performing conversational AI operations; a system for performing one or more generative AI operations; a system implementing one or more large language models (LLMs); a system implementing one or more vision language models (VLMs); a system implementing one or more multi-modal language models; a system for generating synthetic data; a system incorporating one or more virtual machines (VMs); a system implemented at least partially in a data center; or a system implemented at least partially using cloud computing resources.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 4 and 11-13 are rejected under 35 U.S.C. 103 as being unpatentable over Klus (DE 102021121157 A1) as applied to claims 2, 8, and 9 above, and further in view of Smith (US 2015/0166060 A1).
Regarding claim 4, Klus teaches the system of claim 2 but does not explicitly teach wherein the determining of the second location of the second object based at least on the arc includes assuming that the second object is located at a center of the arc. Smith teaches this limitation [[fig. 10b] shows ray Dmin in the center of the arc #532 #534].
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to modify the ultrasonic distance ranging taught by Klus with the assumption of a center distance in the middle of the ultrasonic sensor arc as shown by Smith, so that the shortest distance to a wall is registered (Smith, [0073]).
Regarding claim 11, Klus teaches the method of claim 9 but does not explicitly teach wherein the determining of the estimated location of the object based at least on the arc includes assuming that the object is located at a center of the arc. Smith teaches this limitation [[fig. 10b] shows ray Dmin in the center of the arc #532 #534].
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to modify the ultrasonic distance ranging taught by Klus with the assumption of a center distance in the middle of the ultrasonic sensor arc as shown by Smith, so that the shortest distance to a wall is registered (Smith, [0073]).
Regarding claim 12, Klus teaches the method of claim 9 but does not explicitly teach wherein a size of the arc is limited to a field of view of the single ultrasonic sensor. Smith teaches this limitation [[fig. 9b] shows sensor fields of view r1, r2, r3, r4 which do not overlap, similar to instant fig. 2b, element #204c; [0090]].
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to modify the ultrasonic distance ranging taught by Klus with overlapping or non-overlapping sensor fields of view as taught by Smith, so that a wider or narrower field of view may be obtained (Smith, [0090]).
Regarding claim 13, Klus teaches the method of claim 8 but does not explicitly teach wherein the estimated location is assumed to be straight away from the single ultrasonic sensor at the detected distance. Smith teaches this limitation [[fig. 10b] ray Dmin is perpendicular to the sensor face].
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to modify the ultrasonic distance ranging taught by Klus with the assumption of a perpendicular ray from the sensor face as shown by Smith, so that the shortest distance to a wall is registered (Smith, [0073]).
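The combined Klus/Smith technique at issue in claims 4, 11, and 13, collapsing the single-sensor detection arc to a point estimate straight out from the sensor face at the detected distance, can be sketched as follows. The sensor pose values are illustrative assumptions only.

```python
import math

def arc_center_estimate(sensor_xy, facing_rad, dist):
    """Smith-style point estimate: assume the object sits at the
    center of the detection arc, i.e., straight out from the sensor
    face (along the facing direction) at the detected distance,
    corresponding to the shortest ray Dmin to a reflecting wall."""
    x, y = sensor_xy
    return (x + dist * math.cos(facing_rad),
            y + dist * math.sin(facing_rad))
```

This single point is the middle of the arc of possible positions that Klus's single-sensor fallback leaves undetermined, which is the sense in which Smith's Dmin assumption completes the claimed estimate.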
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to JONATHAN D ARMSTRONG whose telephone number is (571)270-7339. The examiner can normally be reached M - F 9am-5pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Isam Alsomiri can be reached at 571-272-6970. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/JONATHAN D ARMSTRONG/ Examiner, Art Unit 3645