Prosecution Insights
Last updated: April 19, 2026
Application No. 18/579,353

SYSTEMS AND METHODS FOR SENSING ENVIRONMENT AROUND VEHICLES

Non-Final OA: §103, §112

Filed: Jan 14, 2024
Examiner: ABRAHAM, JOHN BISHOY SAM
Art Unit: 3646
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: VAYYAR IMAGING LTD.
OA Round: 1 (Non-Final)

Grant Probability: 71% (Favorable)
OA Rounds: 1-2
To Grant: 2y 4m
With Interview: 99%
Examiner Intelligence

Career Allow Rate: 71% (5 granted / 7 resolved; +19.4% vs TC avg, above average)
Interview Lift: +40.0% (strong; based on resolved cases with interview)
Avg Prosecution: 2y 4m (typical timeline)
Currently Pending: 37
Total Applications: 44 (career history, across all art units)

Statute-Specific Performance

§101: 13.7% (-26.3% vs TC avg)
§103: 44.1% (+4.1% vs TC avg)
§102: 19.4% (-20.6% vs TC avg)
§112: 22.3% (-17.7% vs TC avg)

Comparisons are against an estimated Tech Center average. Based on career data from 7 resolved cases.

Office Action (§103, §112)
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Priority

Applicant’s claim for the benefit of a prior-filed application under 35 U.S.C. 119(e) or under 35 U.S.C. 120, 121, 365(c), or 386(c) is acknowledged. Applicant has not complied with one or more conditions for receiving the benefit of an earlier filing date under 35 U.S.C. 120 and 365(c) as follows: The later-filed application must be an application for a patent for an invention which is also disclosed in the prior application (the parent or original nonprovisional application or provisional application). The disclosure of the invention in the parent application and in the later-filed application must be sufficient to comply with the requirements of 35 U.S.C. 112(a) or the first paragraph of pre-AIA 35 U.S.C. 112, except for the best mode requirement. See Transco Products, Inc. v. Performance Contracting, Inc., 38 F.3d 551, 32 USPQ2d 1077 (Fed. Cir. 1994).

The disclosure of the prior-filed application, Application No. 18/135,784, fails to provide adequate support or enablement in the manner provided by 35 U.S.C. 112(a) or pre-AIA 35 U.S.C. 112, first paragraph, for one or more claims of this application. The disclosure of Application No. 18/135,784 fails to include claim elements of both independent claims 1 and 19. Accordingly, claims 1-5, 7-14, and 16-20 are not entitled to the benefit of the prior application.

Regarding claim 1, Application No. 18/135,784 fails to disclose a method for determining the angle of a moving object including selecting a series of candidate angles; for each candidate angle, constructing a virtual box parallel to an associated candidate horizon, and counting reflections within the virtual box; and selecting the candidate angle with the largest number of reflections within the virtual box.
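The claim 1 method as paraphrased by the examiner (sweep a series of candidate tilt angles, box the plotted reflections against each candidate horizon, and keep the angle whose box captures the most reflections) can be sketched roughly as follows. This is a minimal illustration of the technique as described in the Office Action, not the applicant's implementation; the function name, data layout, and the box dimensions (borrowed from the claim 8 discussion of 20 centimeters below to 2 meters above candidate ground level) are assumptions.

```python
import math

def estimate_tilt_angle(reflections, candidate_angles,
                        box_height=2.0, box_depth=0.2):
    """Pick the candidate tilt angle whose horizon-aligned virtual box
    captures the most radar reflections (illustrative sketch only).

    reflections: (y, z) points in the horizontal-vertical plane
    perpendicular to the direction of motion of the vehicle.
    """
    best_angle, best_count = None, -1
    for angle in candidate_angles:
        c, s = math.cos(angle), math.sin(angle)
        count = 0
        for y, z in reflections:
            # Rotate each point by -angle so the candidate horizon becomes
            # horizontal; h is then the height above candidate ground level.
            h = -y * s + z * c
            # Count the reflection if it lies inside a box extending from
            # box_depth below to box_height above the candidate horizon.
            if -box_depth <= h <= box_height:
                count += 1
        if count > best_count:
            best_angle, best_count = angle, count
    return best_angle

# Reflections from flat ground seen at a ~0.1 rad tilt should vote
# for the 0.1 rad candidate.
pts = [(y, y * math.tan(0.1)) for y in range(-10, 11)]
cands = [i * 0.05 for i in range(-4, 5)]
```

The "largest number of reflections" vote makes the estimate robust to outliers: reflections from walls or elevated objects fall outside every candidate box and simply do not vote.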
Accordingly, dependent claims 2-3 and 7-14 are also not entitled to the benefit of the prior application.

Regarding claim 19, Application No. 18/135,784 fails to disclose the system for sensing the surroundings of a vehicle comprising: a tilt detection module operable to calculate a tilt angle of the vehicle. While the specification does disclose “calculating energy-profile within a virtual box” (Fig. 2B, Step 253; Page 14, lines 21-27), it is not connected to measuring a tilt angle; rather, it is used in a wall detection module. Accordingly, dependent claims 16-18 and 20 are also not entitled to the benefit of the prior application.

Information Disclosure Statement

The information disclosure statement (IDS) submitted on 01/22/2024 is in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement is being considered by the examiner.

Claim Objections

Claims 1, 13 and 14 are objected to because of the following informalities:

In claim 1, line 1: “the angle of a moving object” should read “the tilt angle of a moving object”.
In claim 13, line 3: “-vego·cos(φobj) for each object” should read “-vego·cos(φobj) for each object, where φobj is the horizontal angle to the object from the vehicle’s direction of travel”.
In claim 14, line 3: “-vego·cos(φobj)·cos(θobj)” should read “-vego·cos(φobj)·cos(θobj) for each object, where φobj is the horizontal angle from the vehicle’s direction of travel and θobj is the vertical angle to the object from the vehicle’s direction of travel”.

Appropriate correction is required.

Claim Rejections - 35 USC § 112

The following is a quotation of the first paragraph of 35 U.S.C.
112(a):

(a) IN GENERAL.—The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor or joint inventor of carrying out the invention.

The following is a quotation of the first paragraph of pre-AIA 35 U.S.C. 112:

The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor of carrying out his invention.

Claim 19 is rejected under 35 U.S.C. 112(a) or 35 U.S.C. 112 (pre-AIA), first paragraph, as failing to comply with the written description requirement. The claim(s) contains subject matter which was not described in the specification in such a way as to reasonably convey to one skilled in the relevant art that the inventor or a joint inventor, or for applications subject to pre-AIA 35 U.S.C. 112, the inventor(s), at the time the application was filed, had possession of the claimed invention.

Regarding claim 19, the specification of the instant application does not contain the claim element “calculating energy-profile within a virtual box” or any phrase to that effect. There is nothing in the specification or drawings that describes “calculating energy-profile within a virtual box” nor how that would be relevant for a tilt detection module. Additionally, the incorporated U.S. Provisional Applications (Application Nos. 63/221,963, 63/230,755 and 63/284,057) contain no disclosure relevant to “calculating energy-profile within a virtual box”.
While the applicant has proposed the instant application as a continuation-in-part of U.S. Patent Application No. 18/135,784, and the disclosure of that application does contain “calculating energy-profile within a virtual box” (Fig. 2B, Step 253; Page 14, lines 21-27), it is not connected to measuring a tilt angle; rather, it is used in a wall detection module. For the purpose of examination, the claim element “the tilt detection module comprises a processing unit, and a memory unit storing executable code directed towards calculating energy-profile within a virtual box” will not be considered. Dependent claims 16-18 and 20 are also rejected based on their dependency on the defective parent claim.

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claims 1-5, 10 and 16 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.

Regarding claim 1, it is unclear and not readily understood what is meant by “selecting a series of candidate angles”. The method of selection is not defined by the claim language, the specification does not provide a standard for ascertaining the meaning, and one of ordinary skill in the art would not be reasonably apprised of the scope of the invention. The disclosure does not provide a criterion or algorithm for selecting candidate angles.
Clarification is required, with reference to the disclosure, to clarify the intended limitation to be imposed on the invention. Claims 2-5 and 7-14 are also rejected based on their dependency on the defective parent claim.

Claim 2 recites the limitation "noisy data" in line 1. There is insufficient antecedent basis for this limitation in the claim. Claim 2 is a dependent claim of claim 1, and there are multiple sets of information which can be construed as data to which a filter may be applied: the received radar signals, the electrical signals conveyed to the processing unit, or the multiple layers of digital data generated from the aforementioned electrical signals (e.g. coordinates, angles, etc.). The claim fails to define what data is being filtered in the chain. For the purpose of examination, the examiner has interpreted this phrasing to mean any filtering of any data.

Claim 3 recites the limitation “transmitting angle to concerned parties”; it is unclear and not readily understood what is meant by this limitation. The method of “transmitting the angle” is not defined by the claim language, the specification does not provide a standard for ascertaining the meaning, and one of ordinary skill in the art would not be reasonably apprised of the scope of the invention. Is the radar transmitter of claim 1 the intended means for “transmitting angle to concerned parties”?

Claim 4 recites the limitation “estimating the orientation of the vehicle mounted radar unit relative to the direction of movement of the vehicle”; it is unclear and not readily understood what is meant by this limitation. The method of measuring “the direction of movement of the vehicle” is not defined by the claim language, the specification does not provide a standard for ascertaining the meaning, and one of ordinary skill in the art would not be reasonably apprised of the scope of the invention.
Claim 5 recites the limitation “issuing a warning if…”; it is unclear and not readily understood what is meant by this limitation. The method of “issuing a warning” is not defined by the claim language, the specification does not provide a standard for ascertaining the meaning, and one of ordinary skill in the art would not be reasonably apprised of the scope of the invention.

Regarding claim 10, it is unclear and not readily understood what is meant by “aligning objects detected around the vehicle to the angle”. “Aligning objects detected around the vehicle to the angle” is not defined by the claim language, the specification does not provide a standard for ascertaining the meaning, and one of ordinary skill in the art would not be reasonably apprised of the scope of the invention. For the purpose of examination, the examiner has interpreted this phrasing to mean “mapping the kinematic data of the radar from the reference frame of the vehicle to a reference frame with zero tilt angle”.

Claim 16 recites the limitations “objects detected by the vehicle mounted radar unit” in lines 2-3 and “a direction corresponding to a direction in which the object was detected” in lines 3-4. There is insufficient antecedent basis for these limitations in the claim. Claim 16 is dependent upon claim 19, which recites “a processor unit in communication with the radar receiving unit and configured to receive raw data from the radar unit and operable to generate environmental information based upon the received data”. “Environmental information” is too generic and too broad to then serve as the basis for detecting objects and the direction of detected objects. Clarification is required, with reference to the disclosure, to clarify the intended limitation to be imposed on the invention.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C.
103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:

1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.

Claim(s) 1-3, 7, 9 and 11-14 are rejected under 35 U.S.C. 103 as being unpatentable over Dvorecki (US PG Pub. 20200132812) further in view of Orlowski (US PG Pub. 20190004166) and He (US PG Pub. 20210323572).
Regarding claim 1, Dvorecki teaches a method for determining the angle of a moving object: providing a vehicle mounted radar unit (Fig. 1, Radar system 104) comprising a radar transmission unit comprising an array of transmitter antennas connected to an oscillator (Fig. 3, transmission module 302, [0040]), and a radar receiving unit comprising at least one receiver antenna (Fig. 3, Receiver Module 306, [0041]); providing a processing unit in communication with the radar receiving unit (Fig. 6, in-vehicle processing system 610, [0050] FIGS. 5-10 illustrate example environments in which various aspects of the present disclosure may operate or various components that may be used to perform operations described herein.); transmitting electromagnetic radiation into the region surrounding the vehicle; receiving electromagnetic radiation reflected from objects in the region surrounding the vehicle ([0002] A radar system may comprise a transmission system that transmits electromagnetic waves (e.g., radio waves) via one or more antennas and a detection system comprising an array of antennas (that may or may not be the same antennas used to transmit the electromagnetic waves) that detect waves reflected off of various objects in the environment being sensed.); and transferring received electromagnetic signals to the processing unit ([0050] For example, any of the modules (e.g., 302, 304, 306, 308, 310, 312, 314, 316, or 318) may be implemented by a processor, such as roadside computing devices (e.g., 540), fog- or cloud-based computing systems (e.g., 550), processor 602, 900, 1070, or 1080, or system 610.).
Dvorecki fails to teach plotting the two-dimensional coordinates of each reflected object in the horizontal-vertical plane perpendicular to the direction of motion of the vehicle; selecting a series of candidate angles; for each candidate angle, constructing a virtual box parallel to an associated candidate horizon and counting reflections within the virtual box; selecting the candidate angle with largest number of reflections within the virtual box.

However, Orlowski teaches a radar based method of vehicle orientation detection ([0006] In one aspect of the invention is provided a method to determine the heading or orientation of a target vehicle by a host vehicle, said host vehicle equipped with a lidar or radar system, said system including a sensor unit adapted to receive signals emitted from said host vehicle and reflected by said target vehicle) with selecting a series of candidate angles ([0010] e) in respect of each orientation candidate angle γi/correction angle Δi, determining a cost function; and [0011] f) selecting that orientation candidate angle having the lowest cost function.); for each candidate angle, constructing a virtual box ([0006] formulating an initial rectangular boundary box from said point detections, where the bounding box is formulated such that two edges of the boundary box are drawn parallel to the reference angle and two sides are perpendicular to the reference angle and such that all the detections are either on the bounding box sides or lay within the bounding box); and counting reflections within the virtual box; selecting the candidate angle with largest number of reflections within the virtual box ([0098] The boundary box may be redrawn such that the edges encapsulate all the point detections with the point detections at the extremities coincident with the new bounding box perimeters (sides). In other words, all detection points lie on the sides of the reformulated boundary box or are contained within it.).
Dvorecki and Orlowski are both considered to be analogous to the claimed invention because they are in the same field of endeavor of vehicular radar technology. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the method of Dvorecki by including the candidate angle techniques of Orlowski with a reasonable expectation of success, as both inventions are directed to the same field of endeavor – vehicular radar technology. The combination would improve the accuracy of vehicular orientation measurements as noted by Orlowski ([0005] It is an object of the invention to correct or refine an orientation estimate of a target such as another vehicle using the spatial distribution of detections from an extended radar target. The technique offers an improvement of estimation accuracy in comparison to methods that do not use spatial information of clustered detections are also specified.).

Additionally, He teaches a method for analyzing radar point clouds for autonomous vehicle environmental perception ([0067] FIGS. 3A and 3B are block diagrams illustrating an example of a perception and planning system used with an autonomous vehicle according to one embodiment. System 300 may be implemented as a part of autonomous vehicle 101 of FIG. 1 including, but is not limited to, perception and planning system 110, control system 111, and sensor system 115.) with plotting the two-dimensional coordinates of each reflected object in the horizontal-vertical plane perpendicular to the direction of motion of the vehicle ([0071] Perception module 302 can also detect objects based on other sensors data provided by other sensors such as a radar and/or LIDAR.; [0080] Each data point is associated with location information of the data point (e.g., x, y, and z coordinates). Point clouds down-sampling module 403, which may be optional, can down-sample the point clouds spatially or temporally.)
and constructing a virtual box parallel to an associated candidate horizon ([0107] These segments or super point objects may be extracted using structural information of objects detected in the frame. In one embodiment, the segments are categorized into segment types. Example segments or objects types may be cylindrical objects, planar patch objects, or any geometrically identifiable objects that may have peculiar geometric and spatial attributes.)

Dvorecki, Orlowski and He are all considered to be analogous to the claimed invention because they are in the same field of endeavor of vehicular radar technology. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the method of Dvorecki as modified by Orlowski by plotting reflected objects and constructing the virtual boxes in the horizontal-vertical plane as done by He with a reasonable expectation of success, as the inventions are directed to the same field of endeavor - vehicular radar technology. The combination would improve the ability to perceive the driving environment in both horizontal, as is done by Orlowski, and vertical dimensions as is done by He.

Regarding claim 2, Dvorecki as modified by Orlowski and He teaches the method of claim 1, accordingly the rejection of claim 1 above is incorporated. Dvorecki further teaches filtering noisy data ([0079] During a sensing and perception stage 805 data is generated by various sensors and collected for use by the autonomous driving system. Data collection, in some instances, may include data filtering).

Regarding claim 3, Dvorecki as modified by Orlowski and He teaches the method of claim 1, accordingly the rejection of claim 1 above is incorporated.
Dvorecki further teaches transmitting angle to concerned parties ([0053] Vehicles may also communicate with other connected vehicles over wireless communication channels to share data and coordinate movement within an autonomous driving environment, among other example communications.).

Regarding claim 7, Dvorecki as modified by Orlowski and He teaches the method of claim 1, accordingly the rejection of claim 1 above is incorporated. Dvorecki fails to teach that the virtual box comprises a rectangular box. Orlowski teaches a vehicular radar method where the virtual box comprises a rectangular box ([0074] Here a rectangular boundary box is formulated, which is eventually orientated with respect to local co-ordinate system.). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Dvorecki by including the rectangular virtual box of Orlowski to yield a predictable result of generating a spatial structure from the radar data that can be oriented with respect to the horizon to determine a tilt angle with improved accuracy as noted by Orlowski ([0005] The technique offers an improvement of estimation accuracy in comparison to methods that do not use spatial information of clustered detections are also specified.).

Regarding claim 9, Dvorecki as modified by Orlowski and He teach the method of claim 1, accordingly the rejection of claim 1 above is incorporated. Dvorecki further teaches storing selected angle in a database ([0041] Memory 308 may comprise any suitable volatile or non-volatile memory to store data utilized by any of the modules of the radar system 104, such as intermediate or final results.).

Regarding claim 11, Dvorecki as modified by Orlowski and He teach the method of claim 1, accordingly the rejection of claim 1 above is incorporated.
Dvorecki further teaches calculating an orientation error ([0024] The doppler measured from a static object is proportional to the sine of an angle θ between an axis perpendicular to the direction of the vehicle 102 and the line of sight to the object, and the phase offset between the antennas is proportional to the cosine of the same angle θ, but is also corrupted by the unknown calibration error.) between direction of motion of the vehicle and boresight direction of the vehicle mounted radar ([0024] Thus, two different measurement types may be used to determine the angle θ: the direct angle measurement based on the phase offsets between antennas (e.g., the standard radar measurement) or an angle measurement based on the dopplers measured by the antennas. By comparing the doppler measurements with the phase measurements, the phase offset calibration errors for the antennas may be estimated.).

Regarding claim 12, Dvorecki as modified by Orlowski and He teach the method of claim 11, accordingly the rejection of claim 11 above is incorporated. Dvorecki further teaches the step of calculating orientation error comprises: detecting objects surrounding vehicle ([0020] In operation, the radar system 104 may transmit a radar signal (e.g., an electromagnetic wave in the radio or microwave spectrum) and antennas 106 of the radar system 104 may each detect the radar signal reflected by a target object in response to the transmitted radar signal.); obtaining or estimating the self velocity vego of the vehicle ([0031] The speed estimate S.sub.r of the radar along the y-axis may generally be determined by the vehicle 102 to a high degree of accuracy (e.g., within a few cm/s).
The vehicle 102 may utilize any suitable sensor or other information to determine the speed); calculating expected apparent speed of objects detected surrounding the vehicle ([0028] The value of the expected doppler is D.sub.exp=S.sub.r sin(θ.sub.obj), where D.sub.exp is the expected doppler, S.sub.r is the true speed of the radar system 104 along the y-axis, and θ.sub.obj is the angle between the x-axis and the line of sight to the target object.); plotting a first distribution over angle of expected apparent speeds of objects detected surrounding the vehicle given the known self-velocity ([0027] FIG. 2 illustrates a graph depicting expected doppler from stationary target objects as a function of the angle of the target in accordance with certain embodiments. The graph assumes that the vehicle 102 (also referred to as ego) is traveling at a speed of 15 m/sec along the y-axis.); plotting a second distribution over angle of measured apparent speeds of objects detected surrounding the vehicle ([0029] The measured doppler from target i is given by: D.sub.i=S.sub.r sin(θ.sub.i)+n.sub.i, where D.sub.i is the measured doppler, S.sub.r is the true speed of the radar system 104 along the y-axis, θ.sub.i is the angle between the line of sight to the stationary target i and the x-axis and n.sub.i is a residual noise term (having a magnitude based on the accuracy of the radar), whose distribution is usually considered to be known.); and calculating offset of first distribution from second distribution ([0024] Thus, two different measurement types may be used to determine the angle θ: the direct angle measurement based on the phase offsets between antennas (e.g., the standard radar measurement) or an angle measurement based on the dopplers measured by the antennas. By comparing the doppler measurements with the phase measurements, the phase offset calibration errors for the antennas may be estimated.).
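The claim 12 sequence described above (detect surrounding objects, compute expected apparent speeds from the known self-velocity, compare against measured apparent speeds, and take the offset between the two distributions) can be illustrated with a rough sketch. The helper below is hypothetical: it collapses the "offset of first distribution from second distribution" step into a per-object angle inversion followed by averaging, which is only one of several ways such an offset could be computed, and it uses the cosine-referenced apparent-speed expression from the claims rather than any party's actual code.

```python
import math

def orientation_error(objects, v_ego):
    """Estimate the angular offset between expected and measured
    apparent-speed distributions (illustrative sketch only).

    objects: (phi, v_measured) pairs for static objects, with phi the
    horizontal angle from the vehicle's direction of travel (radians)
    and v_measured the measured apparent (radial) speed.
    """
    offsets = []
    for phi, v_meas in objects:
        # Invert the claimed expression v_r = -v_ego * cos(phi) to find
        # the angle implied by the measurement (clamped for noise).
        ratio = max(-1.0, min(1.0, -v_meas / v_ego))
        phi_implied = math.acos(ratio)
        # Per-object offset between measured-implied and reported angle.
        offsets.append(phi_implied - phi)
    # A crude distribution offset: the mean per-object shift.
    return sum(offsets) / len(offsets)
```

With noise-free synthetic data generated at a known boresight offset, the mean shift recovers that offset; with real data one would compare the two distributions more robustly (e.g., by correlation over angle).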
Regarding claims 13 and 14, Dvorecki as modified by Orlowski and He teach the method of claim 12, accordingly the rejection of claim 12 above is incorporated. Dvorecki further teaches the step of calculating expected apparent speed of objects detected surrounding the vehicle given the known self-velocity ([0028] Referring again to FIG. 2, the magnitude of the expected doppler from a stationary target object is zero for objects that are located on the x-axis (i.e., have a θ of 0 or 180). The expected doppler increases as θ increases from 0 up to a maximum (equal to the velocity of the vehicle 102) at 90 degrees and then decreases as θ increases from 90 degrees to 180 degrees. The value of the expected doppler is D.sub.exp=S.sub.r sin(θ.sub.obj), where D.sub.exp is the expected doppler, S.sub.r is the true speed of the radar system 104 along the y-axis, and θ.sub.obj is the angle between the x-axis and the line of sight to the target object.).

Dvorecki uses the horizontal axis as the reference for the angle φobj rather than the direction of travel. Thus, the apparent speed expression of Dvorecki involves the sine of the angle rather than the cosine. Although Dvorecki does not explicitly state the expression vr = -vego·cos(φobj) for each object, it is implicit in the disclosure since the difference between the expressions is a difference in reference axis. Dvorecki fails to teach the expression vr = -vego·cos(φobj)·cos(θobj) since Dvorecki is limited to the horizontal plane. As noted in the rejection of claim 1, He extends the radar method to the vertical axis.
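The examiner's reference-axis point (Dvorecki's D_exp = S_r·sin(θ_obj), with θ measured from the axis perpendicular to travel, versus the claimed v_r = -v_ego·cos(φ_obj)·cos(θ_obj), with φ measured from the direction of travel) can be checked numerically. Only the sin-versus-cos axis substitution comes from the text; the sign convention relating doppler to radial speed is an assumption of this sketch.

```python
import math

def radial_speed_claim(v_ego, phi_obj, theta_obj=0.0):
    # Claimed expression: v_r = -v_ego * cos(phi) * cos(theta), with phi
    # the horizontal angle from the direction of travel and theta the
    # vertical angle (theta = 0 reduces to the horizontal-only case).
    return -v_ego * math.cos(phi_obj) * math.cos(theta_obj)

def doppler_dvorecki(s_r, theta_from_x):
    # Dvorecki's horizontal-only form: D_exp = S_r * sin(theta), with
    # theta measured from the axis perpendicular to travel.
    return s_r * math.sin(theta_from_x)

# An angle phi from the travel direction equals (pi/2 - phi) from the
# perpendicular axis, and sin(pi/2 - phi) = cos(phi), so the two
# horizontal forms agree up to the (assumed) sign convention.
v, phi = 15.0, math.radians(30.0)   # 15 m/s matches Dvorecki's Fig. 2 example
same_axis = math.pi / 2 - phi
```

This is the sense in which the cosine expression is "implicit" in Dvorecki: it is the same quantity written against a rotated reference axis, while the extra cos(θ_obj) factor projects out the vertical component that Dvorecki's horizontal-plane treatment omits.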
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Dvorecki in view of He to incorporate such horizontal and vertical calculation of the apparent speed expression as taught by He to gain the advantage of extending the angle measurement and apparent speed calculation in the vertical dimension; and also since it has been held that if a technique has been used to improve one device, and a person of ordinary skill in the art would recognize that it would improve similar devices in the same way, using the technique is obvious unless its actual application is beyond his or her skill (MPEP 2143).

Claim(s) 4-5 and 10 are rejected under 35 U.S.C. 103 as being unpatentable over Dvorecki (US PG Pub. 20200132812) as modified by Orlowski (US PG Pub. 20190004166) and He (US PG Pub. 20210323572) as applied to claim 1 above and further in view of Steinbunch (US PG Pub. 20140333473).

Regarding claims 4 and 5, Dvorecki as modified by Orlowski and He teach the method of claim 1, accordingly the rejection of claim 1 above is incorporated. Dvorecki fails to teach estimating the orientation of the vehicle mounted radar unit relative to the direction of movement of the vehicle and issuing a warning if estimated orientation deviates from a required orientation. However, Steinbunch teaches a method for measuring and correcting for a vehicle radar sensor ([0001] The present invention relates to a method and a device for ascertaining and compensating for a misalignment angle of a radar sensor of a vehicle.)
with estimating the orientation of the vehicle mounted radar unit relative to the direction of movement of the vehicle ([0008] The method according to an example embodiment of the present invention provides the following steps: generating a first set of data which contains information about a measured alignment of the radar sensor with respect to an instantaneous movement of the vehicle) and issuing a warning if estimated orientation deviates from a required orientation ([0017] According to another preferred example embodiment, the ascertained misalignment angle is compared to a predetermined limiting value. If the predefined limiting value is exceeded, an emergency action is triggered… According to another preferred example embodiment, the triggered emergency action includes the transmission of a visual, acoustic and/or haptic warning signal.).

Dvorecki, Orlowski, He and Steinbunch are all considered to be analogous to the claimed invention because they are in the same field of endeavor of vehicular radar technology. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the method of Dvorecki as modified by Orlowski and He by including the alignment measurement and warning technique of Steinbunch to improve the radar sensor method by providing a means to measure and issue a warning in the case of misalignment thereby improving vehicle safety through awareness of a sensor misalignment as noted by Steinbunch ([0018] The vehicle driver and/or the vehicle owner may, for example, be notified that he/she should drive to a repair shop in order to remedy the misalignment.).

Regarding claim 10, Dvorecki as modified by Orlowski and He teach the method of claim 1, accordingly the rejection of claim 1 above is incorporated. Dvorecki fails to teach aligning objects detected around the vehicle to the angle.
However, Steinbunch teaches a method for measuring and correcting for a vehicle radar sensor ([0001] The present invention relates to a method and a device for ascertaining and compensating for a misalignment angle of a radar sensor of a vehicle.) with aligning objects detected around the vehicle to the angle ([0010] Example embodiments of the present invention provide a method for ascertaining and compensating for a misalignment angle of a radar sensor of a vehicle with the aid of which an examination and a compensation for the alignment of the radar sensor is possible during the operation of the vehicle.).

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the method of Dvorecki as modified by Orlowski and He by including the alignment technique of Steinbunch with a reasonable expectation of success, as both inventions are directed to the same field of endeavor – vehicular radar technology. The combination would improve the safety of the vehicle by providing a means to measure and compensate for the orientation offset of the radar system to the vehicle as noted by Steinbunch ([0011] Furthermore, the present invention provides an option of compensating for a misalignment without a repair shop visit and even during the driving operation of a vehicle, thus making the situation more convenient for the vehicle driver and/or the vehicle owner.).

Claim 8 is rejected under 35 U.S.C. 103 as being unpatentable over Dvorecki (US PG Pub. 20200132812) as modified by Orlowski (US PG Pub. 20190004166) and He (US PG Pub. 20210323572) as applied to claim 1 above and further in view of Gao (US PG Pub. 20170220876).

Regarding claim 8, Dvorecki as modified by Orlowski and He teach the method of claim 1; accordingly, the rejection of claim 1 above is incorporated.
Dvorecki fails to teach constructing a virtual box parallel to an associated candidate horizon comprises constructing a box extending from 20 centimeters below candidate ground level to 2 meters above candidate ground level. However, Gao discloses constructing a virtual box parallel to an associated candidate horizon comprises constructing a box ([0077] Objects of interest can then be classified. In general, these objects will be anything in the image that is a certain height range above the ground (e.g., between 20 centimeters and four meters above the ground) and meets a size constraint (e.g., the objects must have a longest side of their bounding box region proposals 695 that is at least 30 pixels in the projected image in order to be classified).) except for extending from 20 centimeters below candidate ground level to 2 meters above candidate ground level.

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to limit the height of the virtual box from 20 centimeters below candidate ground level to 2 meters above candidate ground level, since it has been held that discovering an optimum value of a result effective variable involves only routine skill in the art. In re Boesch, 617 F.2d 272, 205 USPQ 215 (CCPA 1980).

Claim 19 is rejected under 35 U.S.C. 103 as being unpatentable over Dvorecki (US PG Pub. 20200132812) in view of Nagaishi (WO 2020021812 – machine translation).

Regarding claim 19, Dvorecki teaches a system for sensing the surroundings of a vehicle comprising: a vehicle mounted radar unit (Fig. 1, Radar system 104) comprising: a radar transmission unit comprising an array of transmitter antennas connected to an oscillator and configured to transmit electromagnetic waves into a region surrounding the vehicle (Fig.
3, Transmission Module 302 [0040]), and a radar receiving unit comprising at least one receiver antenna configured to receive electromagnetic waves reflected by objects within the region surrounding the vehicle and operable to generate raw data (Fig. 3, Receiver Module 306, [0041]); a processor unit (Fig. 6, in-vehicle processing system 610) in communication with the radar receiving unit and configured to receive raw data from the radar unit and operable to generate environmental information based upon the received data ([0050] For example, any of the modules (e.g., 302, 304, 306, 308, 310, 312, 314, 316, or 318) may be implemented by … system 610); wherein the processor comprises: a self-velocity calculation module operable to calculate velocity of the vehicle from raw data ([0043] Speed identification module 314 is operable to determine a speed of the vehicle 102. Speed identification module 314 may determine the speed from the received radar signals or may communicate with other sensors or modules of the vehicle to determine the speed of the vehicle.).

Dvorecki fails to teach wherein the processor comprises a tilt detection module operable to calculate tilt angle of the vehicle. However, Nagaishi teaches a radar sensor for vehicles ([0150] The radar sensor 10 shown in FIG. 9 differs from the radar sensor 10 shown in FIG. 8 of the fourth embodiment in that an inclination angle detection unit 70 and a radar cross section calculation unit 71 are newly provided.) wherein the processor comprises a tilt detection module operable to calculate tilt angle of the vehicle ([0151] The tilt angle detection unit 70 detects the tilt angle of the radar sensor 10, in other words, the tilt angle of the vehicle. The information on the tilt angle detected by the tilt angle detection unit 70 is output to the information processing unit 55. The information processing unit 55 detects the deviation of the radar radiation direction based on the input tilt angle information.).
Dvorecki and Nagaishi are both considered to be analogous to the claimed invention because they are in the same field of endeavor of vehicular radar technology. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the system for sensing the surroundings of a vehicle of Dvorecki by including the tilt detection module of Nagaishi to yield a predictable result of a means to detect the vehicle tilt so as to register the surroundings to the orientation of the vehicle, thereby improving the accuracy of the spatial relationship between the vehicle and the surroundings.

Claim(s) 16-18 are rejected under 35 U.S.C. 103 as being unpatentable over Dvorecki (US PG Pub. 20200132812) as modified by Nagaishi (WO 2020021812 – machine translation) as applied to claim 19 above and further in view of Sicconi (US PG Pub. 20200057487).

Regarding claims 16-18, Dvorecki as modified by Nagaishi teach the system of claim 19; accordingly, the rejection of claim 19 above is incorporated. Dvorecki further teaches an audio signal generator ([0066] In some cases, informational presentations may be generated and provided through user displays (e.g., audio, visual, and/or tactile presentations)). Dvorecki as modified by Nagaishi fail to teach the audio signal generator is configured and operable to produce sounds in response to objects detected by the vehicle mounted radar unit such that a listener perceives the sounds as originating at a direction corresponding to a direction in which the object was detected; wherein the audio signal generation unit is selected from a group consisting of an array of speakers, a pair of stereo speakers, a set of four quadrophonic speakers, earphones and combinations thereof; and wherein the audio signal generation unit is provided within a safety helmet worn by a motorist.
However, Sicconi teaches a system for providing directional alerts to vehicle operators (Abstract, In an aspect, a system for using artificial intelligence to evaluate, correct, and monitor user attentiveness includes … at least a user alert mechanism configured to output a directional alert to a user,) where an audio signal generator is configured and operable to produce sounds in response to objects detected by the vehicle mounted radar unit such that a listener perceives the sounds as originating at a direction corresponding to a direction in which the object was detected ([0094] Embodiments disclosed herein may include methods for spatially located audio feedback using multiple speakers installed in a vehicle, and phase modulation across the available channels. Sound or voice of the system alerting the driver may be projected to come from the direction where the driver is requested to pay attention to); wherein the audio signal generation unit is selected from a group consisting of an array of speakers (Fig. 12, speaker array 1215), a pair of stereo speakers ([0066] Audio output devices may include, as a further non-limiting example, one or more speakers for acoustic/voice feedback to driver.), earphones ([0130] Directional alert may simulate a direction using stereo sound manipulation in earphones or a headset of a user wearing such devices.) and combinations thereof; and wherein the audio signal generation unit is provided within a safety helmet worn by a motorist ([0085] Embodiments disclosed herein may include methods for extending driver attention monitoring to use in trucks, motorcycle helmets).

Dvorecki, Nagaishi and Sicconi are all considered to be analogous to the claimed invention because they are in the same field of endeavor of vehicular safety technology.
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the system of Dvorecki as modified by Nagaishi by including the audio alert elements of Sicconi to yield a predictable result of improving the awareness of a vehicle operator to their surroundings.

Claim 20 is rejected under 35 U.S.C. 103 as being unpatentable over Dvorecki (US PG Pub. 20200132812) as modified by Nagaishi (WO 2020021812 – machine translation) as applied to claim 19 above and further in view of Cattle (US PG Pub. 20200158861).

Regarding claim 20, Dvorecki as modified by Nagaishi teach the system of claim 19; accordingly, the rejection of claim 19 above is incorporated. Dvorecki as modified by Nagaishi fails to teach the self-velocity calculation module comprises an image generation unit, a memory unit, wherein the image generation unit configured and operable to construct a three dimensional image representing the region surrounding the vehicle comprising a matrix of voxels, each voxel characterized by a set of voxel parameters including: a horizontal spatial coordinate, x, of a reflecting object along an axis parallel to the path of the vehicle; a vertical spatial coordinate, y, of the reflecting object along a vertical axis orthogonal to the path of the vehicle; a radial spatial coordinate, R, of the reflecting object along an axis diverging radially from the vehicle; an intensity value; and a Doppler-shift value indicating an apparent radial velocity vR of the reflecting object.

However, Cattle teaches a radar self-velocity calculation module comprising: an image generation unit (Fig. 3, an imaging module 316 [0055]), a memory unit (Fig. 3, memory 322; [0055]), wherein the image generation unit configured and operable to construct a three dimensional image representing the region surrounding the vehicle comprising a matrix of voxels (Fig.
2D, voxel data structure 260), each voxel characterized by a set of voxel parameters including: a horizontal spatial coordinate, x, of a reflecting object along an axis parallel to the path of the vehicle; a vertical spatial coordinate, y, of the reflecting object along a vertical axis orthogonal to the path of the vehicle ([0108] The voxel data structure 260 includes range values 262 arranged along a first axis, angle values 264 arranged along a second orthogonal axis, and relative velocity values 266 arranged along a third orthogonal axis.); a radial spatial coordinate, R, of the reflecting object along an axis diverging radially from the vehicle ([0103] Voxels may be characterized by, e.g., range values, angle values, and/or velocity values.); an intensity value ([0091] Further, the imaging module 316 can be configured to identify a reflectiveness of one or more voxels in the one or more areas of interest… For example, the imaging module 316 can identify the locations of a certain number of high-intensity targets, e.g. which might be spectral reflections from metal on cars.); and a Doppler-shift value indicating an apparent radial velocity vR of the reflecting object ([0108] and the relative velocity values 266 can represent velocit(ies) of the radar imaging system 102 relative to the each voxel.).

Dvorecki, Nagaishi and Cattle are all considered to be analogous to the claimed invention because they are in the same field of endeavor of vehicular radar technology. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the system of Dvorecki as modified by Nagaishi by including the self-velocity calculation module of Cattle with a reasonable expectation of success, as both inventions are directed to the same field of endeavor – vehicular radar technology.
The combination would improve the safety of vehicular operation by performing object detection in a cost-effective and efficient manner as noted by Cattle ([0035] The present technology solves these and other deficiencies by performing object detection in a cost-effective and efficient way that is meaningful for a variety of industries, including various vehicular industries.).

For applicant’s benefit portions of the cited reference(s) have been cited to aid in the review of the rejection(s). While every attempt has been made to be thorough and consistent within the rejection it is noted that the PRIOR ART MUST BE CONSIDERED IN ITS ENTIRETY, INCLUDING DISCLOSURES THAT TEACH AWAY FROM THE CLAIMS. See MPEP 2141.02 VI.

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure:

US 20100104199 discloses a method for detecting a clear path of travel for a host vehicle including fusion of clear path detection by image analysis and detection of an object within an operating environment of the host vehicle including monitoring an image from a camera device, analyzing the image through clear path detection analysis to determine a clear path of travel within the image, monitoring sensor data describing the object, analyzing the sensor data to determine an impact of the object to the clear path, utilizing the determined impact of the object to describe an enhanced clear path of travel, and utilizing the enhanced clear path of travel to navigate the host vehicle.

US 20170192433 discloses a method for assisting a driver of a two-wheeled vehicle.
The method includes sensing and evaluating a driving environment of the motorcycle as a function of a driving state of the motorcycle, especially an inclination of the motorcycle, in order to detect objects in the driving environment; determining a hazard potential as a function of the detected objects and the driving state; and warning the driver and/or triggering a driver assistance system and/or a vehicle safety system of the motorcycle as a function of the determined hazard potential.

US 20180232947 discloses a system and method for generating a high-density three-dimensional (3D) map. The system comprises acquiring at least one high density image of a scene using at least one passive sensor; acquiring at least one new set of distance measurements of the scene using at least one active sensor; acquiring a previously generated 3D map of the scene comprising a previous set of distance measurements; merging the at least one new set of distance measurements with the previous set of upsampled distance measurements, wherein merging the at least one new set of distance measurements further includes accounting for a motion transformation between a previous high-density image frame and the acquired high density image and the acquired distance measurements; and overlaying the new set of distance measurements on the high-density image via an upsampling interpolation, creating an output 3D map.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to JOHN BS ABRAHAM whose telephone number is (571)272-4145. The examiner can normally be reached Monday - Friday 9:00 am - 5:00 pm EST.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Jack Keith, can be reached at (571)272-6878. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/JBSA/
Examiner, Art Unit 3646

/JACK W KEITH/
Supervisory Patent Examiner, Art Unit 3646

Prosecution Timeline

Jan 14, 2024
Application Filed
Jan 15, 2026
Non-Final Rejection — §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12584991
UWB-BASED IN-VEHICLE 3D LOCALIZATION OF MOBILE DEVICES
2y 5m to grant Granted Mar 24, 2026
Study what changed to get past this examiner. Based on the 1 most recent grant.


Prosecution Projections

1-2
Expected OA Rounds
71%
Grant Probability
99%
With Interview (+40.0%)
2y 4m
Median Time to Grant
Low
PTA Risk
Based on 7 resolved cases by this examiner. Grant probability derived from career allow rate.
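The projection above follows directly from the examiner statistics shown earlier on this page: 5 grants out of 7 resolved cases is roughly 71%. A minimal sketch of that arithmetic, assuming (as stated above) that the grant probability is simply the career allow rate:

```python
# Assumed derivation: grant probability = examiner's career allow rate.
# Figures taken from this page: 5 granted of 7 resolved cases.
granted = 5
resolved = 7
allow_rate = granted / resolved
print(f"{allow_rate:.1%}")  # → 71.4%
```

Note that the "99% with interview" figure is not reproduced here; how the +40% interview lift is combined with the base rate is not specified on the page.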
