Prosecution Insights
Last updated: April 19, 2026
Application No. 18/492,891

Detecting Unfamiliar Signs

Non-Final OA · §103 §DP
Filed: Oct 24, 2023
Examiner: CASS, JEAN PAUL
Art Unit: 3666
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Waymo LLC
OA Round: 5 (Non-Final)
Grant Probability: 73% (Favorable)
OA Rounds: 5-6
To Grant: 3y 1m
With Interview: 99%

Examiner Intelligence

Career Allow Rate: 73% (grants above average; 719 granted / 984 resolved; +21.1% vs TC avg)
Interview Lift: +25.9% (strong lift, measured across resolved cases with interview)
Avg Prosecution: 3y 1m (typical timeline; 83 currently pending)
Total Applications: 1,067 across all art units (career history)
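The headline figures above follow from simple arithmetic on the resolved-case counts. A minimal sketch, using only the numbers shown on this page (the with/without-interview case split is not published here, so the interview lift is treated as the reported delta rather than recomputed):

```python
# Illustrative arithmetic behind the examiner statistics above.
# Counts (719 granted / 984 resolved) are taken from the dashboard.

granted = 719
resolved = 984

allow_rate = granted / resolved  # career allowance rate; displayed as 73%
print(f"Career allow rate: {allow_rate:.1%}")

# The panel reports +21.1% vs the Tech Center average, which implies:
implied_tc_avg = allow_rate - 0.211
print(f"Implied TC average: {implied_tc_avg:.1%}")

# Reported allowance-rate lift for resolved cases that included an interview:
interview_lift = 0.259
print(f"Interview lift: +{interview_lift:.1%}")
```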

Statute-Specific Performance

§101: 10.5% (-29.5% vs TC avg)
§103: 56.8% (+16.8% vs TC avg)
§102: 12.6% (-27.4% vs TC avg)
§112: 12.8% (-27.2% vs TC avg)
Black line = Tech Center average estimate • Based on career data from 984 resolved cases
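The Tech Center baseline behind each delta can be back-solved from the rows shown above (examiner rate minus reported delta). A small sketch using only the figures from this panel:

```python
# Recovering the implied Tech Center baseline from the statute panel above.
# Each row is (examiner rate %, delta vs TC average %) as displayed.

panel = {
    "§101": (10.5, -29.5),
    "§103": (56.8, +16.8),
    "§102": (12.6, -27.4),
    "§112": (12.8, -27.2),
}

for statute, (rate, delta) in panel.items():
    tc_avg = rate - delta  # delta = rate - TC average, so TC average = rate - delta
    print(f"{statute}: examiner {rate}% vs TC average ~{tc_avg:.1f}%")
```

Every row back-solves to the same ~40.0% baseline, which suggests the "black line" estimate is a single TC-wide figure rather than a per-statute one.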

Office Action

§103 §DP
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to the Applicant’s Arguments

The previous rejection is withdrawn. Applicant’s amendments are entered. Applicant’s remarks are also entered into the record. A new search was made, necessitated by the applicant’s amendments. A new rejection is made herein. Applicant’s arguments are now moot in view of the new rejection of the claims.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows: 1. Determining the scope and contents of the prior art. 2. Ascertaining the differences between the prior art and the claims at issue. 3. Resolving the level of ordinary skill in the pertinent art. 4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

This application currently names joint inventors. In considering patentability of the claims, the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.

Claims 1 and 21-27 are rejected under 35 U.S.C. sec. 103 as being unpatentable as obvious in view of United States Patent Application Pub. No. US20080137908A1 to Stein et al. (hereinafter “Stein”), filed on 12-6-2007, further in view of United States Patent Application Pub. No. US20190114493A1 to Ewert, filed in 2017, and in view of United States Patent Application Pub. No. US20190258251A1 to Ditty (hereinafter “Ditty”), filed on 11-9-2018.

In regard to claims 1 and 27, Stein discloses “1. A method comprising:” (see FIG. 1, where the camera 100 and the vision system can detect a type of a sign shape shown as 20e, 20d, 20c, and see FIG. 3, where the vehicle has a camera 110, a processor 130, and a traffic sign recognition software block 300). Ewert teaches “of an autonomous vehicle” (see FIG. 3, element 300, to detect signs on a processor; see paragraph 4, where by plausibility checking of traffic signs with the aid of a server, an autonomously driving vehicle may independently recognize traffic signs, as the result of which a driving pattern of a vehicle may be positively influenced.
It is thus also possible to control the autonomously driving vehicle with regard to a maximum speed with the aid of such a system. In addition, the autonomous vehicle may be controlled via further traffic signs such as yield signs or stop signs.)

Stein discloses “....identifying, by one or more processors, an unfamiliar traffic sign in an image from a perception system...” (see FIG. 2, where the camera 100 and the vision system can detect a type of a sign shape shown as 20e, 20d, 20c, 20b, and 20a, the type of a shape of the sign being a hexagonal, rectangular, circular, or triangular shape or an upside-down triangle shape 20c); (see FIG. 7, where the sign can be determined to be 1. a no passing zone sign, 2. an end of zone sign, or a two-digit or three-digit speed limit zone in blocks 422-444; see FIG. 3, where the vehicle has a camera 110, a processor 130, and a traffic sign recognition software block 300); (see FIG. 2, where a shape of the sign is determined and where the sign includes text and a two- or three-digit text on the sign, and FIG. 7, where the sign is not a no passing sign or an end sign and includes digits and the device can determine a speed limit zone in blocks 442 and 444; see paragraphs 14-15, 38-41, and 44, where 1. a shape of the sign is determined, 2. a color of the sign is determined, 3. digits or text on the sign are determined, and 4. a location of the sign can be determined (for example, there is no speed limit sign in Germany on certain roads, and this can be excluded), and then a scoring is provided; see paragraph 62, where the sign can be compared to the memory to determine 1. an electronic sign, 2. an end of zone sign, 3. a speed limit or two-digit or three-digit sign).

Claim 1 is amended to recite, and Ditty teaches, “....identifying, by the one or more processors, content of a-the unfamiliar traffic sign....
(see paragraphs 269-271, 43, 255-256, 131, 138, and 166-169, where the neural network can provide an inference value when the value for the sign is not found based on the training and where the vehicle 1. does not know about the sign and has never been trained about the sign and 2. consults another neural network to provide a semantic meaning of the sign, then attempts to read the sign, provide that reading to the semantic model, and provide a semantic understanding to the path module for a sign that is not found, or obtain some “inference” about the sign’s semantic meaning that is not understood)

determining, by the one or more processors, whether the unfamiliar traffic sign is associated with travel to a destination of the autonomous vehicle based on the content; (see paragraph 649)

determining, by the one or more processors, whether to ignore the unfamiliar traffic sign based on whether the unfamiliar traffic sign is determined to be associated with travel to the destination; and (see FIG. 12, where the content of the sign is detected as “flashing lights indicate icy conditions” for the text semantic meaning for the road on the destination, and this requires the vehicle to maintain the speed or slow down due to the icy conditions expected from the current location to the destination) (see paragraph 255-230, where the CNN can determine that the sign reads no parking from 4 pm to 7 pm Monday to Friday and the AV can understand that this is instructive and not park in that area, or that the vehicle is driving by and does not need to park at all) (see paragraphs 269-271, 43, 255-256, 131, 138, and 166-169, where the neural network can provide an inference value when the value for the sign is not found based on the training and where the vehicle 1. does not know about the sign and has never been trained about the sign and 2. consults another neural network to provide a semantic meaning of the sign, then attempts to read the sign, provide that reading to the semantic model, and provide a semantic understanding to the path module for a sign that is not found, or obtain some “inference” about the sign’s semantic meaning that is not understood)

Ditty teaches “...controlling, by the one or more processors, the autonomous vehicle to travel to the destination based on determining whether to ignore the unfamiliar traffic sign” (see paragraph 48, where the AV can perform sign reading, and FIG. 42, where in block 3104a a route map of the autonomous mode with the map perception and the trajectory estimation can be provided in blocks 3030a and 3105-31022) (see paragraphs 255-257, where the autonomous vehicle can 1. obtain from a first neural network that this is or is not a traffic sign, then the second neural network can detect that flashing lights indicate icy conditions, and then a third neural network can detect that there are flashing lights and that icy conditions exist, and the path planning software may be informed that an icy condition exists; see paragraph 713, where a traffic sign can be identified, and in FIG.
42, a wait condition can be provided, or an advanced trajectory estimation can also be provided, in blocks 3120 to 3030).

It would have been obvious for one of ordinary skill in the art before the effective filing date of the present disclosure to combine Ditty with the disclosure of Stein, with a reasonable expectation of success, since Ditty teaches that the autonomous vehicle can interface with three different neural networks to detect a sign and then read the text semantics of the sign that the flashing lights indicate icy conditions, and then a second neural network can recognize the flashing lights and infer that the conditions are in fact icy, and then, based on the keywords, slow the vehicle, plan the trajectory, or wait. This can provide an improved and trained semantic detection. See paragraphs 255-270 of Ditty.

Ewert teaches “of the autonomous vehicle” (see FIG. 3, element 300, to detect signs on a processor, and paragraphs 4-9). It would have been obvious for one of ordinary skill in the art at the time of the effective filing date to combine the disclosure of Stein and the teachings of Ewert, since Ewert teaches that a vehicle and image processor can determine a traffic sign and determine with high or low confidence the type of sign. Then the autonomous vehicle, based on the confidence, can control a speed and a position of the vehicle based on the high-confidence detection of the sign. Signs with low confidence, such as a traffic sign associated with a construction site that is not readily known, can be provided to a server, and the server can provide an identification of the sign with high confidence so as to control the vehicle safely in the autonomous mode of operation. See paragraphs 33-53, claims 1-4, and the abstract of Ewert.

The phrase “unfamiliar” is defined as something unknown, strange, or not well-acquainted, often due to a lack of previous experience, knowledge, or exposure.

Claim 2 is cancelled. Claim 2 is rejected under 35 U.S.C. sec.
103 as being unpatentable as obvious in view of United States Patent Application Pub. No. US20080137908A1 to Stein et al. (hereinafter “Stein”), filed on 12-6-2007, further in view of United States Patent Application Pub. No. US20190114493A1 to Ewert, filed in 2017, in view of Chinese Patent Pub. No. CN 205302634 U, filed on 6-8-2016, and in view of Ditty.

The primary reference to Stein is silent, but the ’634 teaches “2. The method of claim 1, wherein the traffic sign is any one of a construction sign, a regulatory sign, a warning sign, a guide sign, a detour sign, a speed limit sign, a stop sign, services sign, recreation sign, construction zone sign, a school zone sign, or a railroad crossing sign” (see page 1). It would have been obvious for one of ordinary skill in the art at the time of the effective filing date to combine the disclosure of Stein and the teachings of the ’634, since the ’634 teaches that a vehicle and image/camera device can detect a construction sign and then provide a message to avoid this area for increased safety. See page 1.

Claims 3-4 and 14 are rejected under 35 U.S.C. sec. 103 as being unpatentable as obvious in view of United States Patent Application Pub. No. US20080137908A1 to Stein et al. (hereinafter “Stein”), filed on 12-6-2007, further in view of United States Patent Application Pub. No. US20190114493A1 to Ewert, filed in 2017, and in view of Ditty.

Stein discloses “...3. The method of claim 1, further comprising determining by one or more processors at least one attribute of the unfamiliar traffic sign, and comparing by the processor the attribute of the unfamiliar traffic sign to attributes of other known traffic signs wherein identifying the unfamiliar traffic sign is based on comparing the at least one attribute of the unfamiliar traffic sign to the at least one attribute of the set of known traffic signs.” (see FIG. 7, where the sign can be determined to be 1. a no passing zone sign,
2. an end of zone sign, or a two-digit or three-digit speed limit zone in blocks 422-444; see FIG. 3, where the vehicle has a camera 110, a processor 130, and a traffic sign recognition software block 300); (see FIG. 2, where a shape of the sign is determined and where the sign includes text and a two- or three-digit text on the sign, and FIG. 7, where the sign is not a no passing sign or an end sign and includes digits and the device can determine a speed limit zone in blocks 442 and 444; see paragraphs 14-15, 38-41, and 44, where 1. a shape of the sign is determined, 2. a color of the sign is determined, 3. digits or text on the sign are determined, and 4. a location of the sign can be determined (for example, there is no speed limit sign in Germany on certain roads, and this can be excluded), and then a scoring is provided; see paragraph 62, where the sign can be compared to the memory to determine 1. an electronic sign, 2. an end of zone sign, 3. a speed limit or two-digit or three-digit sign); (see FIG. 2, where the sign is a warning zone of no trucks in block 22; see FIG. 10, where the sign can be no passing or no entrance in blocks 468, 478, and a three-digit or two-digit speed limit zone, and in paragraph 30, the adaptive cruise control can or cannot be adjusted based on the speed limit or passing or no passing based on a no passing zone).

In regard to claims 4 and 14, Stein discloses “...4. The method of claim 1, wherein the at least one attribute includes at least one of color, shape, reflection coefficient, placement, text, figures, or accessories.” (see FIG. 7, where the sign can be determined to be 1. a no passing zone sign, 2. an end of zone sign, or a two-digit or three-digit speed limit zone in blocks 422-444; see FIG. 3, where the vehicle has a camera 110, a processor 130, and a traffic sign recognition software block 300); (see FIG. 2, where a shape of the sign is determined and where the sign includes text and a two- or three-digit text on the sign, and in FIG.
7, where the sign is not a no passing sign or an end sign and includes digits and the device can determine a speed limit zone in blocks 442 and 444; see paragraphs 14-15, 38-41, and 44, where 1. a shape of the sign is determined, 2. a color of the sign is determined, 3. digits or text on the sign are determined, and 4. a location of the sign can be determined (for example, there is no speed limit sign in Germany on certain roads, and this can be excluded), and then a scoring is provided; see paragraph 62, where the sign can be compared to the memory to determine 1. an electronic sign, 2. an end of zone sign, 3. a speed limit or two-digit or three-digit sign); (see FIG. 2, where the sign is a warning zone of no trucks in block 22; see FIG. 10, where the sign can be no passing or no entrance in blocks 468, 478, and a three-digit or two-digit speed limit zone, and in paragraph 30, the adaptive cruise control can or cannot be adjusted based on the speed limit or passing or no passing based on a no passing zone).

Claims 5 and 15 are cancelled. Claims 6 and 16 are cancelled. Claims 5 and 15 are rejected under 35 U.S.C. sec. 103 as being unpatentable as obvious in view of United States Patent Application Pub. No. US20080137908A1 to Stein et al. (hereinafter “Stein”), filed on 12-6-2007, further in view of United States Patent Application Pub. No. US20190114493A1 to Ewert, filed in 2017, further in view of International Patent Pub. No. WO 2019222358 A1 to Shapira et al., effectively filed on 5-15-2018, which is prior to the effective filing date of 12-14-18, and in view of Ditty.

In regard to claims 5 and 15, Stein is silent, but Shapira teaches “...5. The method of claim 1, wherein the content includes an arrow pointing in a particular direction.” (see paragraph 388).
It would have been obvious for one of ordinary skill in the art at the time of the effective filing date to combine the disclosure of Stein and the teachings of Shapira, since Shapira teaches that a vehicle and image processor can determine a traffic sign and determine with high or low confidence that this is a directional arrow. Then the autonomous vehicle navigation can be updated based on the confidence of the directional arrow and can control a speed and a direction of the vehicle based on the high-confidence detection of the arrow. See paragraphs 1-10, claims 28-35, and the abstract.

Claims 6, 8, 10, 16, 18, and 20 are rejected under 35 U.S.C. sec. 103 as being unpatentable as obvious in view of United States Patent Application Pub. No. US20080137908A1 to Stein et al. (hereinafter “Stein”), filed on 12-6-2007, further in view of United States Patent Application Pub. No. US20190114493A1 to Ewert, filed in 2017, and in view of Ditty.

In regard to claims 6 and 16, Stein discloses “...6. The method of claim 1, further comprising controlling by the processor the autonomous vehicle to take the action when the content is informative and when at least one additional condition has been met” (see FIG. 2, where the sign is a warning zone of no trucks in block 22; see FIG. 10, where the sign can be no passing or no entrance in blocks 468, 478, and a three-digit or two-digit speed limit zone, and in paragraph 30, the adaptive cruise control speed can or cannot be adjusted based on the speed limit or passing or no passing based on a no passing zone).

Claims 7 and 17 are rejected under 35 U.S.C. sec. 103 as being unpatentable as obvious in view of United States Patent Application Pub. No. US20080137908A1 to Stein et al. (hereinafter “Stein”), filed on 12-6-2007, further in view of United States Patent Application Pub. No. US20190114493A1 to Ewert, filed in 2017, in view of the non-patent literature Fatin Zaklouta et al., Real-time traffic sign recognition in three stages, Robotics and Autonomous Systems, Volume 62, Jan. 2014 (https://www.sciencedirect.com/science/article/pii/S0921889012001236), and in view of Ditty.

In regard to claims 7 and 17, Fatin teaches “...7. The method of claim 1, further comprising wherein identifying the unfamiliar traffic sign includes: inputting, image data corresponding to the unfamiliar traffic sign into at least one software module running on the one or more processors; and determining whether the at least one software module is able to identify the content of the unfamiliar traffic sign with at least a threshold confidence level.” (see section 3.3, where the sign can be classified to provide an attribute for the traffic sign classification using a k-d tree and when the classification is 75 percent and is poor, but in section 2.1 the sign can be determined to be a speed limit sign or a warning sign, and in section 3.2 a minimum of a category detection can be made). It would have been obvious for one of ordinary skill in the art at the time of the effective filing date to combine the disclosure of Stein and the teachings of Fatin, since Fatin teaches that a vehicle and image processor can determine 1. an input image and then, using the HOG/linear SVM detector, can compare attributes of the image to classify the image as either 1. a speed limit sign or 2. a warning sign at a minimum. Then the vehicle can determine a category detection with an accuracy level. See table 6. This can provide an improved detection of a hazard or warning sign or a speed limit sign.

In regard to claims 8 and 18, Stein discloses “...8. (currently amended) The method of claim 1, wherein controlling the autonomous vehicle includes at least one of stopping the autonomous vehicle, starting the autonomous vehicle, changing a course of the autonomous vehicle, or changing a
speed of the autonomous vehicle” (see FIG. 2, where the sign is a warning zone of no trucks in block 22; see FIG. 10, where the sign can be no passing or no entrance in blocks 468, 478, and a three-digit or two-digit speed limit zone, and in paragraph 30, the adaptive cruise control speed can or cannot be adjusted based on the speed limit or passing or no passing based on a no passing zone).

Claims 9 and 19 are rejected under 35 U.S.C. sec. 103 as being unpatentable as obvious in view of United States Patent Application Pub. No. US20080137908A1 to Stein et al. (hereinafter “Stein”), filed on 12-6-2007, further in view of United States Patent Application Pub. No. US20190114493A1 to Ewert, filed in 2017, and in view of International Patent Pub. No. WO 2016130719 A2 to Shashua et al., filed on 2-10-2015 and assigned to Mobileye, and in view of Ditty.

In regard to claims 9 and 19, Stein is silent, but Shashua teaches “...9. The method of claim 1, wherein identifying content further includes performing optical character recognition on the image” (see paragraph 967). It would have been obvious for one of ordinary skill in the art at the time of the effective filing date to combine the disclosure of Stein and the teachings of Shashua, since Shashua teaches that a vehicle and image processor can determine a text on a traffic sign. This can be provided to a server for further evaluation in response to a so-called sparse mapping. The image analysis techniques employed by the server may also include a text recognition component to determine a meaning associated with text present in an image. For example, where text appears in one or more uploaded images from an environment of a host vehicle, the server may determine whether text exists in the images.
If text exists, the server may use techniques such as optical character recognition to assist in determining whether the text may relate to a reason that a system or user of a host vehicle caused a navigational maneuver differing from that expected based on sparse model 800. This can provide improved safe operation. See paragraphs 960-968.

In regard to claims 10 and 20, Stein discloses “...10. The method of claim 1, wherein identifying the content includes one or more keywords” (see FIG. 2, where the sign is a warning zone of no trucks in block 22; see FIG. 10, where the sign can be no passing or no entrance in blocks 468, 478, and a three-digit or two-digit speed limit zone, and in paragraph 30, the adaptive cruise control speed can or cannot be adjusted based on the speed limit or passing or no passing based on a no passing zone).

Claim 11 is rejected under 35 U.S.C. sec. 103 as being unpatentable as obvious in view of United States Patent Application Pub. No. US20080137908A1 to Stein et al. (hereinafter “Stein”), filed on 12-6-2007, further in view of United States Patent Application Pub. No. US20190114493A1 to Ewert, filed in 2017, and in view of Ditty.

Stein discloses “11. A system comprising: ....in an image from a perception system” (see FIG. 1, where the camera 100 and the vision system can detect a type of a sign shape shown as 20e, 20d, 20c, and see FIG. 3, where the vehicle has a camera 110, a processor 130, and a traffic sign recognition software block 300). Ewert teaches “of an autonomous vehicle” (see FIG. 3, element 300, to detect signs on a processor; see paragraph 4, where by plausibility checking of traffic signs with the aid of a server, an autonomously driving vehicle may independently recognize traffic signs, as the result of which a driving pattern of a vehicle may be positively influenced.
It is thus also possible to control the autonomously driving vehicle with regard to a maximum speed with the aid of such a system. In addition, the autonomous vehicle may be controlled via further traffic signs such as yield signs or stop signs.)

Stein discloses “..identify an unfamiliar traffic sign in an image from a perception system” (see FIG. 2, where the camera 100 and the vision system can detect a type of a sign shape shown as 20e, 20d, 20c, 20b, and 20a, the type of a shape of the sign being a hexagonal, rectangular, circular, or triangular shape or an upside-down triangle shape 20c). Ewert teaches “of the autonomous vehicle” (see FIG. 3, element 300, to detect signs on a processor, and paragraphs 4-9). Stein discloses “.identify content of a-the unfamiliar traffic sign” (see FIG. 7, where the sign can be determined to be 1. a no passing zone sign, 2. an end of zone sign, or a two-digit or three-digit speed limit zone in blocks 422-444; see FIG. 3, where the vehicle has a camera 110, a processor 130, and a traffic sign recognition software block 300); (see FIG. 2, where a shape of the sign is determined and where the sign includes text and a two- or three-digit text on the sign, and FIG. 7, where the sign is not a no passing sign or an end sign and includes digits and the device can determine a speed limit zone in blocks 442 and 444; see paragraphs 14-15, 38-41, and 44, where 1. a shape of the sign is determined, 2. a color of the sign is determined, 3. digits or text on the sign are determined, and 4. a location of the sign can be determined (for example, there is no speed limit sign in Germany on certain roads, and this can be excluded), and then a scoring is provided; see paragraph 62, where the sign can be compared to the memory to determine 1. an electronic sign, 2. an end of zone sign, 3.
a speed limit or two-digit or three-digit sign).

Ditty teaches “... determine whether the unfamiliar traffic sign is associated with travel to a destination of the autonomous vehicle based on the content; determine whether to ignore the unfamiliar traffic sign based on whether the unfamiliar traffic sign is determined to be associated with travel to the destination; and control the autonomous vehicle to travel to the destination based on a determination whether to ignore the unfamiliar traffic sign” (see FIG. 12, where the content of the sign is detected as “flashing lights indicate icy conditions” for the text semantic meaning) (see paragraph 255-230, where the CNN can determine that the sign reads no parking from 4 pm to 7 pm Monday to Friday and the AV can understand that this is instructive and not park in that area) (see paragraphs 255-257, where the autonomous vehicle can 1. obtain from a first neural network that this is or is not a traffic sign, then the second neural network can detect that flashing lights indicate icy conditions, and then a third neural network can detect that there are flashing lights and that icy conditions exist, and the path planning software may be informed that an icy condition exists; see paragraph 713, where a traffic sign can be identified, and in FIG.
42, a wait condition can be provided, or an advanced trajectory estimation can also be provided, in blocks 3120 to 3030).

It would have been obvious for one of ordinary skill in the art before the effective filing date of the present disclosure to combine Ditty with the disclosure of Stein, with a reasonable expectation of success, since Ditty teaches that the autonomous vehicle can interface with three different neural networks to detect a sign and then read the text semantics of the sign that the flashing lights indicate icy conditions, and then a second neural network can recognize the flashing lights and infer that the conditions are in fact icy, and then, based on the keywords, slow the vehicle, plan the trajectory, or wait. This can provide an improved and trained semantic detection. See paragraphs 255-270 of Ditty.

Stein is silent, but Ewert teaches “… autonomous vehicle to take an action” (see the abstract, paragraph 9, paragraphs 33-45, claims 1-5, and paragraph 50). It would have been obvious for one of ordinary skill in the art at the time of the effective filing date to combine the disclosure of Stein and the teachings of Ewert, since Ewert teaches that a vehicle and image processor can determine a traffic sign and determine with high or low confidence the type of sign. Then the autonomous vehicle, based on the confidence, can control a speed and a position of the vehicle based on the high-confidence detection of the sign. Signs with low confidence, such as a traffic sign associated with a construction site that is not readily known, can be provided to a server, and the server can provide an identification of the sign with high confidence so as to control the vehicle safely in the autonomous mode of operation. See paragraphs 33-53, claims 1-4, and the abstract of Ewert.

Claim 12 is cancelled. Claim 12 is rejected under 35 U.S.C. sec.
103 as being unpatentable as obvious in view of United States Patent Application Pub. No. US20080137908A1 to Stein et al. (hereinafter “Stein”), filed on 12-6-2007, further in view of United States Patent Application Pub. No. US20190114493A1 to Ewert, filed in 2017, in view of Chinese Patent Pub. No. CN 205302634 U, filed on 6-8-2016, and in view of Ditty.

The primary reference to Stein is silent, but the ’634 teaches “The system of claim 11, wherein the traffic sign is any of a construction sign, a regulatory sign, warning sign, a guide sign, a detour sign, speed limit sign, stop sign, services sign, recreation sign, construction zone sign, school zone sign, or railroad crossing sign.” (see page 1). See motivation statement above.

Claims 13-14 are rejected under 35 U.S.C. sec. 103 as being unpatentable as obvious in view of United States Patent Application Pub. No. US20080137908A1 to Stein et al. (hereinafter “Stein”), filed on 12-6-2007, further in view of United States Patent Application Pub. No. US20190114493A1 to Ewert, filed in 2017, and in view of Ditty.

Stein discloses “... The system of claim 11, wherein the one or more processors are configured to: determine at least one attribute of the unfamiliar traffic sign and identify the unfamiliar traffic sign based on a comparison of the at least one attribute of the unfamiliar traffic sign to at least one attribute of a set of known traffic signs.” (see FIG. 2, where the sign is a warning zone of no trucks in block 22; see FIG. 10, where the sign can be no passing or no entrance in blocks 468, 478, and a three-digit or two-digit speed limit zone, and in paragraph 30, the adaptive cruise control can or cannot be adjusted based on the speed limit or passing or no passing based on a no passing zone).

The primary reference is silent, but Ditty teaches “...21. (new) The method of claim 1, wherein the unfamiliar traffic sign is not in map information of the autonomous vehicle”
(See paragraphs 168-169, where the autonomous vehicle may consult the supercomputer, not know what the sign is, and instead rely on a deep learning inference.) See motivation statement above. The primary reference is silent, but Ditty teaches “... 22. (new) The method of claim 1, wherein identifying the unfamiliar traffic sign includes determining that a sign type of the unfamiliar traffic sign cannot be identified with a confidence level that meets a particular confidence threshold level.” (See paragraphs 225-230, where the neural network can provide a detection that has a low weight and thus a poor detection value, and thus a trigger to not use this detection due to a poor measurement.) See motivation statement above.
The primary reference is silent, but Ditty teaches “... 23. (new) The method of claim 1, further comprising, responsive to determining to not ignore the unfamiliar traffic sign, controlling, by the one or more processors, the autonomous vehicle based on the content.” (See paragraph 43, where the visual data can capture the sign content, compare it to the neural network training information, and react to the sign.) See motivation statement above. The primary reference is silent, but Ditty teaches “... 24. (new) The system of claim 11, wherein the unfamiliar traffic sign is not in map information of the autonomous vehicle.” (See paragraphs 43, 255-256, 131, 138, and 166-169, where the neural network can provide an inference value when the value for the sign is not found based on the training.) The primary reference is silent, but Ditty teaches “... 25. (new) The system of claim 11, wherein the one or more processors are further configured to determine that a sign type of the unfamiliar traffic sign cannot be identified with a confidence level that meets a particular confidence threshold level to identify the unfamiliar traffic sign.”
(See paragraphs 43, 255-256, 131, 138, and 166-169, where the neural network can provide an inference value when the value for the sign is not found based on the training, and where the vehicle (1) does not know about the sign and has never been trained on the sign, and (2) consults other neural networks to provide a semantic meaning of the sign and then attempts to provide a semantic understanding to the path module for a sign that is not found.) The primary reference is silent, but Ditty teaches “... 26. (new) The system of claim 11, wherein the one or more processors are further configured to, responsive to a determination to not ignore the unfamiliar traffic sign, control the autonomous vehicle based on the content.” (See paragraphs 269-271 and 43, 255-256, 131, 138, and 166-169, where the neural network can provide an inference value when the value for the sign is not found based on the training, and where the vehicle (1) does not know about the sign and has never been trained on the sign, and (2) consults other neural networks to provide a semantic meaning of the sign, attempts to read the sign, provides that reading to the semantic model, and then provides a semantic understanding to the path module for a sign that is not found, or obtains some “inference” about the sign’s semantic meaning that is not understood.) See motivation statement above. Double Patenting The nonstatutory double patenting rejection is based on a judicially created doctrine grounded in public policy (a policy reflected in the statute) so as to prevent the unjustified or improper timewise extension of the “right to exclude” granted by a patent and to prevent possible harassment by multiple assignees.
A nonstatutory double patenting rejection is appropriate where the conflicting claims are not identical, but at least one examined application claim is not patentably distinct from the reference claim(s) because the examined application claim is either anticipated by, or would have been obvious over, the reference claim(s). See, e.g., In re Berg, 140 F.3d 1428, 46 USPQ2d 1226 (Fed. Cir. 1998); In re Goodman, 11 F.3d 1046, 29 USPQ2d 2010 (Fed. Cir. 1993); In re Longi, 759 F.2d 887, 225 USPQ 645 (Fed. Cir. 1985); In re Van Ornum, 686 F.2d 937, 214 USPQ 761 (CCPA 1982); In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970); In re Thorington, 418 F.2d 528, 163 USPQ 644 (CCPA 1969). A timely filed terminal disclaimer in compliance with 37 CFR 1.321(c) or 1.321(d) may be used to overcome an actual or provisional rejection based on nonstatutory double patenting provided the reference application or patent either is shown to be commonly owned with the examined application, or claims an invention made as a result of activities undertaken within the scope of a joint research agreement. See MPEP § 717.02 for applications subject to examination under the first inventor to file provisions of the AIA as explained in MPEP § 2159. See MPEP §§ 706.02(l)(1) - 706.02(l)(3) for applications not subject to examination under the first inventor to file provisions of the AIA. A terminal disclaimer must be signed in compliance with 37 CFR 1.321(b). The USPTO Internet website contains terminal disclaimer forms which may be used. Please visit www.uspto.gov/patent/patents-forms. The filing date of the application in which the form is filed determines what form (e.g., PTO/SB/25, PTO/SB/26, PTO/AIA/25, or PTO/AIA/26) should be used. A web-based eTerminal Disclaimer may be filled out completely online using web-screens. An eTerminal Disclaimer that meets all requirements is auto-processed and approved immediately upon submission.
For more information about eTerminal Disclaimers, refer to www.uspto.gov/patents/process/file/efs/guidance/eTD-info-I.jsp. Claims 1-20 are rejected on the ground of nonstatutory obviousness-type double patenting over claims 1-17 of U.S. Patent No. 10,928,828, which recites “a method of controlling an autonomous vehicle in response to detecting and analyzing an unfamiliar traffic sign, the method comprising: receiving, by one or more processors, an image generated by a perception system of the autonomous vehicle, the perception system including one or more sensors; identifying, by the one or more processors, image data corresponding to a traffic sign in the image generated by the perception system of the autonomous vehicle; inputting, by the one or more processors, the identified image data corresponding to the traffic sign into one or more system software modules running on the one or more processors; determining, by the one or more processors, that the traffic sign in the image is an unfamiliar traffic sign that cannot be identified by the one or more system software modules; identifying, by the one or more processors, one or more attributes of the unfamiliar traffic sign; comparing, by the one or more processors, the identified one or more attributes of the unfamiliar traffic sign to known attributes of other traffic signs; determining, by the one or more processors, a category of the unfamiliar traffic sign based on the comparing; and controlling, by the one or more processors, the vehicle in an autonomous driving mode based on the determined category of the unfamiliar traffic sign”. The only difference is that the present claims recite that an unfamiliar traffic sign is detected; the claims are otherwise identical. Any inquiry concerning this communication or earlier communications from the examiner should be directed to JEAN PAUL CASS whose telephone number is (571)270-1934. The examiner can normally be reached Monday to Friday 7 am to 7 pm; Saturday 10 am to 12 noon.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Scott A. Browne can be reached on 571-270-0151. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000. /JEAN PAUL CASS/Primary Examiner, Art Unit 3668

Prosecution Timeline

Oct 24, 2023
Application Filed
May 13, 2024
Non-Final Rejection — §103, §DP
Aug 05, 2024
Response Filed
Oct 26, 2024
Final Rejection — §103, §DP
Jan 28, 2025
Response after Non-Final Action
Jan 30, 2025
Request for Continued Examination
Jan 31, 2025
Response after Non-Final Action
Mar 21, 2025
Non-Final Rejection — §103, §DP
Jun 23, 2025
Applicant Interview (Telephonic)
Jun 23, 2025
Examiner Interview Summary
Jun 26, 2025
Response Filed
Sep 12, 2025
Final Rejection — §103, §DP
Nov 18, 2025
Examiner Interview Summary
Nov 18, 2025
Applicant Interview (Telephonic)
Nov 24, 2025
Request for Continued Examination
Dec 05, 2025
Response after Non-Final Action
Jan 05, 2026
Non-Final Rejection — §103, §DP (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12593752
SYSTEM AND METHOD FOR CONTROLLING HARVESTING IMPLEMENT OPERATION OF AN AGRICULTURAL HARVESTER BASED ON TILT ACTUATOR FORCE
2y 5m to grant Granted Apr 07, 2026
Patent 12596986
GLOBAL ADDRESS SYSTEM AND METHOD
2y 5m to grant Granted Apr 07, 2026
Patent 12590801
REAL TIME DETERMINATION OF PEDESTRIAN DIRECTION OF TRAVEL
2y 5m to grant Granted Mar 31, 2026
Patent 12583572
MARINE VESSEL AND MARINE VESSEL PROPULSION CONTROL SYSTEM
2y 5m to grant Granted Mar 24, 2026
Patent 12571183
EXCAVATOR
2y 5m to grant Granted Mar 10, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

5-6
Expected OA Rounds
73%
Grant Probability
99%
With Interview (+25.9%)
3y 1m
Median Time to Grant
High
PTA Risk
Based on 984 resolved cases by this examiner. Grant probability derived from career allow rate.
