Prosecution Insights
Last updated: April 19, 2026
Application No. 18/485,552

AUGMENTED REALITY SAFETY GLASSES

Final Rejection §103
Filed: Oct 12, 2023
Examiner: JUNG, JAEWOOK
Art Unit: 3656
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Textron Inc.
OA Round: 2 (Final)
Grant Probability: 33% (At Risk)
OA Rounds: 3-4
To Grant: 2y 8m
With Interview: 99%

Examiner Intelligence

Career Allow Rate: 33% (1 granted / 3 resolved), -18.7% vs TC avg
Interview Lift: +100.0% (resolved cases with interview)
Avg Prosecution: 2y 8m (27 applications currently pending)
Career History: 30 total applications across all art units
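The headline ratios above are simple arithmetic on resolved-case counts. A minimal sketch of how the career allow rate can be reproduced (the function name is illustrative, not from any real tool):

```python
def career_allow_rate(granted: int, resolved: int) -> float:
    """Share of resolved applications that ended in a grant."""
    if resolved == 0:
        raise ValueError("no resolved cases to measure")
    return granted / resolved

# 1 grant out of 3 resolved cases -> the 33% shown above
rate = career_allow_rate(1, 3)
print(f"{rate:.0%}")
```

With only three resolved cases, one additional disposal would move this figure by tens of percentage points, which is worth keeping in mind when reading the projections below.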

Statute-Specific Performance

§101: 7.9% (-32.1% vs TC avg)
§103: 53.7% (+13.7% vs TC avg)
§102: 14.1% (-25.9% vs TC avg)
§112: 23.2% (-16.8% vs TC avg)
Black line = Tech Center average estimate • Based on career data from 3 resolved cases
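Each per-statute figure is paired with a delta "vs TC avg", so the implied Tech Center baseline can be recovered by subtraction. A quick consistency check, assuming the delta is a plain percentage-point difference (all four rows imply the same ~40.0% baseline, which supports that reading):

```python
# Examiner allowance rate (%) and reported delta vs TC average (pp),
# taken from the panel above; the subtraction rule is an assumption.
stats = {
    "§101": (7.9, -32.1),
    "§103": (53.7, 13.7),
    "§102": (14.1, -25.9),
    "§112": (23.2, -16.8),
}

for statute, (examiner_rate, delta) in stats.items():
    tc_avg = examiner_rate - delta  # implied Tech Center baseline
    print(f"{statute}: implied TC avg = {tc_avg:.1f}%")
```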

Office Action

§103
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Amendment

This office action is in response to the amendments filed September 9, 2025. Claims 1, 2, 9, 11, 16, and 19 are amended. Claims 12 and 18 are canceled. Claims 21-22 are introduced. Claims 1-11, 13-17, and 19-22 are pending and addressed below.

Response to Arguments

Applicant’s arguments with respect to the rejection of claims 1-20 under 35 USC 101 have been fully considered and are persuasive. The rejection under 35 USC 101 is withdrawn.

Applicant’s arguments with respect to the rejection of claims 1-20 under 35 USC 103 have been fully considered but are not persuasive, as the arguments are directed towards the claims as amended. Applicant’s arguments with respect to claims 1, 11, and 16 have been considered but are moot because the new ground of rejection does not rely on any combination of references applied in the prior rejection of record for any teaching or matter specifically challenged in the argument. Applicant’s arguments are only directed to the claim amendments, which add new limitations to the claims and are addressed below by the addition of US20180196425 (Kobayashi) and US20200057488A1 (Johnson et al.) in the prior art rejection below.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claims 1-4, 9, 11, 13-14, 16-17, and 19-22 are rejected under 35 U.S.C. 103 as being unpatentable over US10188029B1 (Brown et al.) in view of US20180196425 (Kobayashi) and US20200057488A1 (Johnson et al.).

Regarding claims 1, 11, and 16, Brown et al. discloses a system, method, and apparatus comprising:

a movable apparatus configured to be operated by a user, the movable apparatus having processing circuitry including at least a processor and a memory; and

[Brown et al., Column 2, Lines 23-26, “One embodiment disclosed herein includes a communications system that is operable between a vehicle and its operator and between the vehicle operator and a remote location.”] It would have been obvious to one of ordinary skill in the art, prior to the applicant’s effective filing date, that this operable vehicle also contains processing circuitry such as a processor and memory, as it is required for remote operation.
a head-mounted augmented reality display device configured to be mounted on a head of the user, wherein

While Brown discloses the use of a display device configured to be worn by a user (Column 6, Lines 46-51), Brown does not explicitly disclose a head-mounted augmented reality display device, where said head-mounted augmented reality display device is configured to be mounted on a head of the user. From a similar field of endeavor, Kobayashi discloses a system comprising a head-mounted display device with a controller operably configured with an external apparatus (Kobayashi, Abstract), where the display device is worn by the user and receives data from the external apparatus. One of ordinary skill in the art would have found it obvious to try, prior to the applicant’s effective filing date, substituting the head-mounted display device of Kobayashi into the system of Brown et al., as they would be simple substitutes.

the processing circuitry of the movable apparatus is configured to: obtain area data associated with a geographical area of a real-world location, wherein the area data includes a plurality of sub-areas separated by geofenced boundaries;

[Brown et al., Column 7, Line 62 - Column 8, Line 1, “FIG. 4 shows representative types of data inputs that can be used to generate a three-dimensional map of a commercial or residential lawn. In one embodiment, a lawn mower is equipped with a sensor (such as, but not limited to, GPS equipment) to at least measure geographic location. In another embodiment, the sensor can also measure pitch and roll.”] While geofenced boundaries are not explicitly mentioned, it would have been obvious to one of ordinary skill in the art, prior to the applicant’s effective filing date, that the system is capable of recognizing geofences given that it is equipped with location-based sensing (GPS). In addition, Brown et al. discloses in the Background at column 2, lines 30-42, “Today, many farmers use GPS-derived products to enhance their farming operations. Location information is collected by GPS receivers for mapping field boundaries, roads, irrigation systems, and problem areas in crops such as weeds or disease.” As shown above, Brown et al. discloses using a GPS system for measuring location data, and it is well known in the art that GPS systems use location information to define sub-areas, as evidenced by the Background of Brown et al. One of ordinary skill in the art would find it obvious, prior to the applicant’s effective filing date, that dividing areas into sub-areas by defining boundaries is a well-known functionality of a GPS system.

determine a location of the movable apparatus within the geographical area of the real-world location;

[Brown et al., Abstract, “A method is provided for generating a three-dimensional geographical map of a lawn using a mower equipped with sensors for monitoring geographic location, pitch and roll, comprising the steps of: periodically acquiring position, pitch and roll data as the lawn is mowed; transmitting the geographic location, pitch and roll data to a computer processor; and processing the geographic location, pitch and roll data to generate the map. The method is useful for enhancing mowing efficiency and managing the activities of a fleet of mower units.”]

determine a facing direction of the movable apparatus, wherein the facing direction is associated with a direction a front of the movable apparatus is currently facing;

Referring back to the citation to the abstract of Brown et al., it would have been obvious to one of ordinary skill in the art, prior to the applicant’s effective filing date, that a system periodically acquiring position, pitch, and roll data would be able to determine the facing direction, as “pitch” is defined to be a measurement of “front/back tilt”, indicating that the “front” is known.
determine an operating state of the movable apparatus;

[Brown et al., Column 8, Lines 31-43, “The monitoring system can include a GPS unit 123 so that the location of vehicle 100 can be monitored. As discussed above in connection with the wheel speed data provided by an RPM sensor, the GPS and wheel speed data can be correlated to provide information about the speed at which the vehicle can operate in the various areas of a plot in which it is operating. With this information, a supervisor or a software program has the ability to determine ways to improve performance, such as by finding alternate routes at which the vehicle can operate more quickly, or monitoring the lawnmower operator to determine whether the operator is either moving too quickly to be operating safely, or too slowly to be efficient.”]

generate media data using the location, the facing direction, and the operating state of the movable apparatus, wherein the media data includes:

See the following rationales pertaining to the media data and how the location, facing direction, and operating state of the movable apparatus are used.

boundary objects associated with the geofenced boundaries separating the plurality of sub-areas,

[Brown et al., Column 10, Lines 48-60, “The sensor data set is transmitted to a processor and analyzed using appropriate software to generate a number of outputs to improve operating efficiency, some of which are shown in FIG. 5. For example, software can be applied to the data set to convert the geographic position, pitch and roll data into a three-dimensional terrain map of the lawn (e.g., a contour map). The contour map can include the perimeter and contours of the mowing surface along with the size, shape and position of specific structures and features within and around the mowing surface. In a further embodiment, an operator can provide through direct input supplemental information by identifying specific structures and features with the GPS equipment, mapping software and the like.”]

line data indicating movable areas for the movable apparatus to move within a sub-area of the plurality of sub-areas,

[Brown et al., Column 8, Lines 7-19, “While geographic location and pitch/roll data are adequate for generating a three-dimensional terrain map of a lawn, dynamic data collection and processing can also be useful to accurately determine time and cost of mowing a lawn as a whole or on an area-by-area basis. For example, time and instantaneous power usage can be acquired as a function of position while the lawn is being mowed according to a particular path of travel. The velocity (or speed) of a mower and its acceleration (or deceleration) at each of a set of geographic-position points is valuable as well, since features in the lawn which cause changes in lawn mower velocity and acceleration also add additional time and energy costs to a job.”]

current line data indicating a current area for the movable apparatus to move within the movable areas, and

In light of the rationale regarding the line data motion within a sub-area of the plurality of sub-areas, one of ordinary skill in the art would have found it further obvious that the system of Brown et al. is configured to also navigate current line data of a current area for the movable apparatus to move within the movable areas.

direction indication information indicating a current moving direction of the movable apparatus; and

[Brown et al., Abstract, “pitch and roll data as the lawn is mowed; transmitting the geographic location, pitch and roll data to a computer processor; and processing the geographic location, pitch and roll data to generate the map.
The method is useful for enhancing mowing efficiency and managing the activities of a fleet of mower units.”]

communicate the media data to the head-mounted augmented reality display device, and the head-mounted augmented reality display device is configured to:

In light of the rationale for the limitation “a head-mounted augmented reality display device configured to be …”, Brown discloses transmitting data to the display device (Column 3, Lines 12-24), where the head-mounted display device of Kobayashi is also capable of receiving data from an external device. One of ordinary skill in the art would find it obvious to try, prior to the applicant’s effective filing date, communicating the media data to the head-mounted display of Kobayashi in place of the worn display device of Brown, as there is a reasonable expectation of success from the substitution.

obtain the media data from the movable apparatus; and

[Brown et al., Column 3, Lines 34-41, “Still further features of this aspect can include a map which indicates a recommended mower type, a recommended mowing path, a recommended mowing path with increased efficiency or safety, a project time, and a project power or fuel usage. Features can also include a map which can be transmitted to a handheld computing device, a portable communication device, an operator's cell phone, a laptop, and an onboard processor for storage or display.”]

using the media data, generate a user interface displayable by the head-mounted augmented reality display and overlaying a real-world terrain, wherein the user interface includes:

While Brown et al. discloses using the media data, Brown et al. does not disclose overlaying a real-world terrain. From a similar field of endeavor, Johnson et al. disclose a fused image superimposing the viewed terrain with information (see Fig. 3) displayed by a display device (see Fig. 4, portable display device 420). One of ordinary skill in the art would find it obvious, prior to the applicant’s effective filing date, to combine the system of Johnson et al. with the system of Brown et al., as displaying the information in real-time ([0221]) would provide a direct upgrade to Brown et al., which does not combine a user interface with the sensor information.

boundaries displayed using the boundary objects associated with the geofenced boundaries separating the plurality of sub-areas,

See Fig. 3 of Johnson et al. Johnson et al. discloses boundaries (detected waterline 205 b) displayed using the boundary objects associated with boundaries (waterline 205) but does not disclose that the boundaries are geofenced ([0121]). One of ordinary skill in the art would find it obvious, prior to the applicant’s effective filing date, to modify the system of Brown et al. to be compatible with the generated media data regarding boundary objects, as the data gathered being displayed by the device would allow gauging distances between boundaries.

line objects displayed using the line data indicating movable areas for the movable apparatus to move within the sub-area,

Johnson et al. discloses the use of contour lines that may distinguish features such as elevation, relative distances, and various other characteristics of terrestrial features ([0121]). One of ordinary skill in the art would find it obvious, prior to the applicant’s effective filing date, to modify the system of Brown et al., as a user would be able to better identify the terrain being viewed through the fused image.

current line objects displayed using the current line data indicating a current area for the movable apparatus to move within the movable areas, and

In light of the rationale of “line objects displayed using the line data indicating movable areas for the movable apparatus …” and the suggested route 338 of Johnson et al. ([0122]), one of ordinary skill in the art would find it obvious, prior to the applicant’s effective filing date, that modifying the system of Brown et al. with the system of Johnson et al. to generate user interfaces comprising line objects displayed using the line data indicating movable areas for the movable apparatus is configured for current line objects displayed using current line data indicating a current area for the movable apparatus to move within the movable areas.

direction indicators displayed using the direction indication information indicating the current moving direction of the movable apparatus; and

As Brown et al. discloses generating the direction indicators (see citation in the limitation “direction indication information indicating a current moving direction of the movable apparatus” above) and Johnson et al. disclose that icon 350 may be a planned destination ([0122]) with route displaying functionality (suggested route 338 of Johnson et al.), one of ordinary skill in the art would find it obvious to try, prior to the applicant’s effective filing date, displaying direction indicators indicating the current moving direction of the movable apparatus, as it would be advantageous to allow a user to gauge the distance of the destination from the current position.

update the user interface based on changes associated with the location, the facing direction, and the operating state of the movable apparatus.

One of ordinary skill in the art would find it obvious, prior to the applicant’s effective filing date, to combine the real-time fused image generation of Johnson et al. ([0221]) with the system of Brown et al., as a real-time update of the environment surrounding a user would provide information relevant to the user locally as processed to improve decision making based on the parameters displayed.
Regarding claims 2, 13, and 19, with all the limitations of claims 1, 11, and 16, the processing circuitry of the movable apparatus is further configured to:

determine a moving amount of the movable apparatus;

[Brown et al., Column 10, Lines 61-64, “Other useful outputs can include the total operation time, cost, and price to cut an entire lawn, or portion thereof; instructions for an operator regarding the selection of equipment; and instructions regarding an efficient mowing path.”] Brown et al. discloses that the system outputs total operation time of cutting an entire lawn. However, calculating the total operation time would require other variables as well, such as the moving amount of the apparatus, as the time a job takes is conditional on the speed of the apparatus and the ground covered by the job. Furthermore, Brown et al. also discloses an efficient mowing path, which would obviously require knowledge of minimizing redundant movement.

update the media data based on the moving amount, the location, the facing direction, and the operating state of the movable apparatus; and

See citation to Brown et al. column 8, lines 7-19 in claim 1.

communicate the updated media data to the display device.

See citation to Brown et al. column 3, lines 12-24 in claim 1.

Regarding claims 3, 14, and 20, with all the limitations of claims 1, 11, and 16, the system, method, and apparatus further comprise: wherein the area data includes a plurality of markers indicating specific objects located in each of the plurality of sub-areas.

See citation to Brown et al. column 10, lines 48-60 in claim 1.
Regarding claim 4, with all the limitations of claim 1, the system further comprises: the location of the movable apparatus is determined using global position system (GPS) coordinate data, and

[Brown et al., Column 6, Lines 10-12, “The GPS unit 123 is provided for serving as a geo-location device for vehicle 100, so that the location of vehicle 100 can be determined at any particular time.”]

Regarding claim 9, with all the limitations of claim 1, the system further comprises: the display device includes glasses configured to generate audio and visual output to the user, and the movable apparatus includes a mowing device configured to mow a terrain.

[Brown et al., Abstract, “A method is provided for generating a three-dimensional geographical map of a lawn using a mower equipped with sensors for monitoring geographic location, pitch and roll, comprising the steps of: periodically acquiring position, pitch and roll data as the lawn is mowed; transmitting the geographic location, pitch and roll data to a computer processor; and processing the geographic location, pitch and roll data to generate the map.”] However, Brown et al. does not disclose a display device with the inclusion of glasses capable of generating audio and visuals. While Brown et al. does not disclose the display device described by the limitation above, Kobayashi discloses an augmented reality display device in the form of wearable glasses capable of both audio and visual output. See Fig. 1. The figure clearly shows the AR device as wearable glasses. Regarding the audio output, see Fig. 5. According to paragraph 30, Fig. 5 is a block diagram functionally showing the configuration of an HMD (head-mounted display). It would then have been obvious to one of ordinary skill in the art to combine the glasses of Kobayashi with the system of Brown et al., as the glasses would function as a substitute for the smartwatch disclosed by Brown et al. with more features that provide more functionality than said smartwatch, such as audio.

Regarding claim 17, with all the limitations of claim 16, the apparatus further comprises the processing circuitry further configured to: determine a facing direction of the movable apparatus; and generate the media data using the location, the facing direction, and the operating state of the movable apparatus.

See each of the respective limitation’s rationales in claim 1.

Regarding claim 21, with all the limitations of claim 1, the system further comprises: wherein the user interface displays portions in a first form associated with locations the moving apparatus has yet to navigate, and displays other portions in a second form associated with locations the moving apparatus has already navigated.

While Johnson et al. discloses displaying portions in a first form associated with locations the moving apparatus has yet to navigate ([0149], projected route 724) and displays other portions in a second form associated with locations the moving apparatus has already navigated ([0149], traveled route 722), Brown et al. in view of Johnson et al. do not disclose the use of a user interface to display the information regarding navigated locations. One of ordinary skill in the art would find it obvious to try, prior to the applicant’s effective filing date, to integrate the navigation database containing projected and traveled routes disclosed by Johnson et al. ([0149]) into the system of Brown et al., as this would reduce overlap in the area traveled, minimizing backtracking.

Regarding claim 22, with all the limitations of claim 1, the system further comprises: wherein the media data further includes at least one highlight object highlighting an area where the user should avoid operating the movable apparatus.

While Johnson et al. discloses the highlighting of an area where the user should avoid operating the movable apparatus ([0154], underwater hazard 712), Brown et al.
in view of Johnson et al. do not disclose the use of a user interface to display the information regarding the highlighted area. One of ordinary skill in the art would find it obvious to try, prior to the applicant’s effective filing date, to display the highlighted hazardous area, as this would allow the user to avoid hazards in real-time.

Claims 5, 6, and 15 are rejected under 35 U.S.C. 103 as being unpatentable over US10188029B1 (Brown et al.) in view of US20180196425 (Kobayashi) and US20200057488A1 (Johnson et al.) and further in view of US11197414B2 (Zeiler et al.).

Regarding claim 5, with all the limitations of claim 1, the system further comprises: the facing direction is determined using compass data.

While Brown et al. does not teach the use of compass data to determine the facing direction of the moving apparatus, Zeiler et al., in a similar field of endeavor, teaches an automatic lawn care device and service remotely operable (Zeiler et al., Column 5, Lines 40-43) and capable of supporting augmented reality function (Zeiler et al., Column 27, Lines 32-39). In column 5, lines 6-10, Zeiler et al. specifically disclose the use of further sensors such as “moisture sensors, rain sensors, air quality sensors, magnetic field sensors (e.g. compass), temperature sensors, rotation sensors, gyroscopes, chemical detection sensors, and the like.” Therefore, it would have been obvious for one of ordinary skill in the art, prior to the applicant’s effective filing date, to implement the compass disclosed by Zeiler et al. in the system of Brown et al., as the compass is an obvious substitute for the current roll and pitch sensors from Brown et al. that determine the orientation of the mower.

Regarding claim 6, with all the limitations of claim 1, the system further comprises: wherein the facing direction of the movable apparatus is determined using image processing data obtained by the movable apparatus.

While Brown et al. do not disclose the use of image data in their system, Zeiler et al. discloses a navigation module capable of utilizing image data. In column 12, lines 46-49 of their disclosure, Zeiler et al. disclose “The image data may have been captured by various methods, including visual recordings by the autonomous lawn mower 100, a user, a drone, satellite imagery, and the like.” From the same paragraph, the navigation module 426 is “configured to control the navigation of the autonomous lawn mower 100.” (Lines 23-24). Furthermore, Zeiler et al. proceed to disclose that the module “communicates navigation instructions to the operating/control module 420.” (Lines 25-26).

Regarding claim 15, with all the limitations of claim 11, the method further comprises: the location of the movable apparatus is determined using global position system (GPS) coordinate data, and

[Brown et al., Column 6, Lines 10-12, “The GPS unit 123 is provided for serving as a geo-location device for vehicle 100, so that the location of vehicle 100 can be determined at any particular time.”]

the facing direction is determined using compass data.

See rationale to claim 5. It would have been obvious for one of ordinary skill in the art, prior to the applicant’s effective filing date, to combine the image processing system of Zeiler et al. with the system of Brown et al. Given that Brown et al. discloses the use of navigation by GPS, including the feature disclosed by Zeiler et al. would both create a more robust, reliable system from the sensor fusion and act as a more featured visual data sample, as surrounding environment details would be available to be closely examined versus a GPS-only system.

Claims 7-8 are rejected under 35 U.S.C. 103 as being unpatentable over US10188029B1 (Brown et al.) in view of US20180196425 (Kobayashi) and US20200057488A1 (Johnson et al.) and further in view of US20210100166A1 (Becke et al.).
Regarding claim 7, with all the limitations of claim 1, the system further comprises: wherein the operating state of the movable apparatus includes a current height of mower blades of the movable apparatus, and whether the mower blades of the movable apparatus are active.

While Brown et al. does not disclose a system that explicitly monitors the state of the mower blades of a movable apparatus, Becke et al., in a similar field of endeavor, disclose a remotely controllable lawn mower with a grass height sensor and blade motor actuator. From Fig. 13A of Becke et al., an actuator system 72 is disclosed to be comprised of motors and members for blade and deck. From paragraph 128 of Becke et al.’s disclosure, “In one embodiment, the blade adjustment motor 78 is configured to adjust the height and/or angle of the blade control ring 82.”, where the blade control ring 82 is configured to house blade 28 (Becke et al., Paragraph 130). Regarding the state of the blade’s activity, Fig. 7 of Becke et al.’s disclosure shows a blade motor current sensor. Therefore, it would have been obvious for one of ordinary skill in the art, prior to the applicant’s effective filing date, to implement these features into the system of Brown et al. As disclosed in Becke et al., grass height is a consideration made when the height of the blade is decided on (Becke et al., Paragraph 108). One of ordinary skill would include this system in Brown et al., as adjustable grass height could be seen as a direct upgrade to a fixed height. Regarding the state of the blades of the movable apparatus, one of ordinary skill would understand that unmonitored blade ring height and activity of high-speed, sharp objects is an obvious safety concern and would look to implement a sensor in this way.
Regarding claim 8, with all the limitations of claim 7, the system further comprises: wherein the operating state of the movable apparatus further includes indication of whether certain reels of the movable apparatus are active.

[Becke et al., Paragraph 90, “In one or more embodiment, the one or more blades 28 are selected from the group consisting of cylinder/reel blades, deck blades, mulching blades, and lifting blades.”]

Claim 10 is rejected under 35 U.S.C. 103 as being unpatentable over US10188029B1 (Brown et al.) in view of US20180196425 (Kobayashi) and US20200057488A1 (Johnson et al.) and further in view of US20180225875A1 (Yasrebi).

Regarding claim 10, with all the limitations of claim 1, the system further comprises:

While Brown et al. does not disclose a triangulation system comprised of base stations, Yasrebi discloses a system utilizing a vehicle with a real-world camera to formulate an AR display image to a user comprising location, orientation, and field of view data.

a plurality of base stations, wherein the location of the movable apparatus is determined using triangulation techniques based on a position of the movable apparatus with respect to some of the plurality of base stations.

Similarly to the previous limitation, Yasrebi further discloses from the same paragraph, “These location units can include, but are not limited to at least, one of a Global positioning systems (GPS) unit 188-1 for obtaining the location data using satellites, a cellular tower reception positioning unit 188-2, a WiFi networks positioning unit 188-3, and a surrounding video positioning unit 18-4 via processing a video stream from the vehicle’s surrounding. The cellular tower reception positioning unit 188-2 and the WiFi networks positioning unit 188-3 can use triangulation techniques that are known by those skilled in the art to obtain the location data.” It would have been obvious to one of ordinary skill in the art, prior to the applicant’s effective filing date, to implement this feature disclosed by Yasrebi in the system of Brown et al. As Brown et al. primarily utilizes GPS information and pose information (such as pitch and yaw) as information sent to a display device of a user by WiFi, one of ordinary skill in the art would find that there already exists support to integrate this feature to create a more robust and accurate tracking system for the moving apparatus by this multi-sensor approach.

Conclusion

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to JAEWOOK JUNG whose telephone number is (571) 272-5470. The examiner can normally be reached Monday - Friday, 9:00 AM - 5:00 PM. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Wade Miles, can be reached at (571) 270-7777. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/J.J./Examiner, Art Unit 3656
/WADE MILES/Supervisory Patent Examiner, Art Unit 3656

Prosecution Timeline

Oct 12, 2023
Application Filed
Jun 04, 2025
Non-Final Rejection — §103
Sep 09, 2025
Response Filed
Nov 25, 2025
Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12514149: SYSTEMS AND METHODS FOR SPRAYING SEEDS DISPENSED FROM A HIGH-SPEED PLANTER. Granted Jan 06, 2026 (2y 5m to grant).
Patent 12480561: VEHICLE AND CONTROL METHOD THEREOF. Granted Nov 25, 2025 (2y 5m to grant).
Study what changed to get past this examiner. Based on 2 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 33%
With Interview: 99% (+100.0%)
Median Time to Grant: 2y 8m
PTA Risk: Moderate
Based on 3 resolved cases by this examiner. Grant probability derived from career allow rate.
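The "+100.0%" interview figure reads as a relative lift in allow rate between resolved cases with and without an interview. The underlying with/without rates are not shown in the panel, so the sample values in this sketch are hypothetical, chosen only to produce a +100.0% lift:

```python
def interview_lift(rate_without: float, rate_with: float) -> float:
    """Relative change in allow rate attributable to an examiner interview."""
    return (rate_with - rate_without) / rate_without

# Hypothetical rates: 33% allowed without an interview vs 66% with one
lift = interview_lift(0.33, 0.66)
print(f"{lift:+.1%}")
```

Note that a +100.0% lift only says the with-interview rate is double the without-interview rate; it does not by itself pin down the absolute levels behind the 99% figure above.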
