Prosecution Insights
Last updated: April 19, 2026
Application No. 18/548,410

CONTROL SYSTEM, CONTROL METHOD AND PROGRAM

Status: Final Rejection — §103
Filed: Aug 30, 2023
Examiner: LIU, JUNG-JEN
Art Unit: 2473
Tech Center: 2400 — Computer Networks
Assignee: Nippon Telegraph and Telephone Corporation
OA Round: 2 (Final)
Grant Probability: 89% (Favorable)
Expected OA Rounds: 3-4
Time to Grant: 2y 7m
With Interview: 94%

Examiner Intelligence

Career Allow Rate: 89% — above average (1070 granted / 1198 resolved; +31.3% vs TC avg)
Interview Lift: +4.7% — minimal (~5%) lift on resolved cases with interview
Typical Timeline: 2y 7m average prosecution; 36 applications currently pending
Career History: 1234 total applications across all art units
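The headline figures on this card reduce to simple ratios. A minimal sketch of that arithmetic, assuming the dashboard computes the allow rate as grants over resolved cases and the interview lift as the with-interview rate minus the career rate (variable names are illustrative, not the tool's actual API):

```python
# Arithmetic behind the examiner card above. Counts are transcribed from
# the dashboard; names are illustrative, not the tool's actual API.
granted = 1070               # applications allowed by this examiner
resolved = 1198              # total resolved cases (allowed + abandoned)
interview_rate = 94.0        # allow rate when an interview was held (%)

allow_rate = 100 * granted / resolved          # career allow rate
interview_lift = interview_rate - allow_rate   # marginal effect of interview

print(f"Career allow rate: {allow_rate:.1f}%")      # 89.3%, displayed as 89%
print(f"Interview lift:    {interview_lift:+.1f}%") # +4.7%
```

Under this reading, the 89% card and the +4.7% lift are mutually consistent with the 1070/1198 counts shown.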

Statute-Specific Performance

§101: 6.2% (-33.8% vs TC avg)
§103: 71.4% (+31.4% vs TC avg)
§102: 5.6% (-34.4% vs TC avg)
§112: 2.9% (-37.1% vs TC avg)

Tech Center averages are estimates • Based on career data from 1198 resolved cases
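The per-statute deltas imply a Tech Center baseline for each statute. A short sketch, assuming "vs TC avg" means the examiner's rate minus the Tech Center average (the dict simply transcribes the figures shown above):

```python
# Recovering the implied Tech Center baseline from each statute's rate and
# its stated delta. Assumes "vs TC avg" = examiner rate - TC average;
# the numbers below are transcribed from the dashboard card above.
statute_stats = {
    "101": (6.2, -33.8),
    "103": (71.4, +31.4),
    "102": (5.6, -34.4),
    "112": (2.9, -37.1),
}

for statute, (rate, delta) in statute_stats.items():
    tc_avg = rate - delta  # baseline implied by the displayed delta
    print(f"\u00a7{statute}: examiner {rate}% vs implied TC avg {tc_avg:.1f}%")
```

Under this reading, every implied baseline works out to 40.0%, i.e., the card appears to compare each statute against a single TC-wide reference rate.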

Office Action

§103
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

DETAILED ACTION

Allowable Subject Matter

0. Claims 10-12 are objected to as dependent upon rejected claims, but would be allowable if rewritten in independent form including all the limitations of the base claim and any intervening claims.

Response to Applicant's Remarks

0a. Applicant's arguments and remarks, filed on 1/20/2026 (hereinafter Remarks), are acknowledged and have been fully considered. Regarding Applicant's claim amendment "a prediction unit configured to periodically predict a wireless parameter indicative of future wireless communication quality based on information related to a wireless communication device and environment information that affects the wireless communication quality, in a predetermined cycle," the Examiner finds Veijalainen (US 20230345271 A1) discloses periodically predicting future RSSP at time i over a time period T in a predetermined cycle (Fig 7, NxM and NxL durations). [Figure omitted: Veijalainen Fig 7, NxM and NxL durations.] The Examiner updates the rejection based on Applicant's amendments, and this office action is made final.

Claim Interpretation and Prior Art Matching

1. The claimed limitation "quality" is interpreted as "network performance parameters" based on Applicant's disclosure in [0025]: "The prediction/estimation function unit 120 predicts or estimates wireless parameters such as wireless quality (the quality of wireless communication) based on information acquired by the grasping/visualization function unit 110."

1a. For ease of discussion, the Examiner creates a table to match Applicant's claimed limitations with prior art disclosures:

Claim | Instant Application 18/548,410 | Prior Art: Ditty US 20240045426 A1
1 | control system | Fig 4, Controller 100(1)-100(3)
1 | prediction unit | Fig 41, World Model 3002, Planning 3004
1 | wireless communication device | V2V or I2V; see [0022] "… Direct links may be provided by vehicle-to-vehicle ("V2V") communication link, while indirect links are often referred to as infrastructure-to-vehicle ("I2V") links."
1 | target device or target system | any devices or sensors controlled by the control system (Fig 4, LIDAR 70(1), Stereo Camera (74), Steering Sensor 78, ….)
1 | quality | network performance parameters, for example: [0021] "….. Based on the vehicle's position, distance and the speed of the vehicle ahead, …"

Claim Rejections - 35 USC § 103

2. The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

"A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made."

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

2a. Claims 1-9 are rejected under 35 U.S.C. 103 as being unpatentable over Ditty (US 20240045426 A1) in view of Veijalainen (US 20230345271 A1).

2b. Summary of the Cited Prior Art

Ditty discloses a method for autonomous vehicle control. Veijalainen discloses a method for evaluating and predicting network properties.

2c. Claim Analysis

Regarding Claim 1, Ditty discloses:

"A control system comprising": Ditty discloses an autonomous driving control system: [0121] "Each controller is essentially one or more onboard supercomputers that can operate in real-time to process sensor signals, and output autonomous operation commands to self-drive vehicle (50) and/or assist the human vehicle driver in driving. Each vehicle may have any number of distinct controllers for functional safety and additional features. For example, Controller (100(1)) may serve as the primary computer for autonomous driving functions, Controller (100(2)) may serve as a secondary computer for functional safety functions, Controller (100(3)) may provide artificial intelligence functionality for in-camera sensors, and Controller (100(4)) (not shown) may provide infotainment functionality and provide additional redundancy for emergency situations." Fig 4, Controller 100(1)-100(3); Figs 65-68; Fig 13, Advanced SoC 100; see also Figs 3 and 5-6.

"a prediction unit configured to periodically predict a wireless parameter indicative of future wireless communication quality based on information related to a wireless communication device and environment information that affects the wireless communication quality, in a predetermined cycle": Ditty discloses a controller with a deep-learning neural network that predicts network performance parameters (or qualities): [0021] "Model-based systems are typically based on proportional—integral—derivative controller ("PID controller") or model predictive control ("MPC") techniques. Based on the vehicle's position, distance and the speed of the vehicle ahead, the controller optimally calculates the wheel torque taking into consideration driving safety and energy cost." [0516] "The example non-limiting platform of FIG. 42 includes the processing functional blocks shown in FIG. 41, and also shows additional detail concerning interconnections between those functional blocks. For example, trajectory estimation 3030 has been expanded to include basic trajectory estimation 3030 and advanced trajectory estimation 3030a. Similarly, planning 3004 has been expanded to include a basic behavior planner 3004a, an advanced behavior planner 3004b, a lane planner 3004c, a route planner 3004e, and a behavior selector 3004d. Low level processing 3018 is shown as including for example basic lane detection 3160, vertical landmark detection 3158, LIDAR ground plane detection 3154, LIDAR ICP 3152, feature tracking 3150, freespace tracking 3162, object tracking 3164, and a dynamic occupancy grid 3102." Fig 41, World Model 3002, Planning 3004; Fig 42, In-Path Determination 3120; Fig 3, neural network functioning as a prediction unit; Figs 6 and 10; see also Fig 4, Controller 100(1)-100(3); Fig 13, Advanced SoC 100; Figs 63-68.

"wherein at least one of a target device or a target system is controlled based on the future wireless communication quality, and": Ditty discloses a plurality of target devices and systems controlled by controllers based on feedback information: [0124] "Controller (100) provides autonomous driving outputs in response to an array of sensor inputs including, for example: one or more ultrasonic sensors (66), one or more RADAR sensors (68), one or more Light Detection and Ranging ("LIDAR") sensors (70), one or more surround cameras (72) (typically such cameras are located at various places on vehicle body (52) to image areas all around the vehicle body), one or more stereo cameras (74) (in preferred embodiments, at least one such stereo camera faces forward to provide depth-perception for object detection and object recognition in the vehicle path), one or more infrared cameras (75), GPS unit (76) that provides location coordinates, a steering sensor (78) that detects the steering angle, speed sensors (80) (one for each of the wheels (54)), an inertial sensor or inertial measurement unit ("IMU") (82) that monitors movement of vehicle body (52) (this sensor can be for example an accelerometer(s) and/or a gyro-sensor(s) and/or a magnetic compass(es)), tire vibration sensors (85), and microphones (102) placed around and inside the vehicle. Other sensors may be used, as is known to persons of ordinary skill in the art." Fig 4, LIDAR 70(1), Stereo Camera (74), Steering Sensor 78, …, which function as target systems; Fig 13, Advanced SoC 100; see also Figs 3 and 5-6.

"wherein information obtained from at least one of the target device or the target system that is controlled is used for prediction of a next cycle by the prediction unit": Ditty discloses controllers that receive feedback information from target devices for further predictions: [0123] "…… The hardware provides a bridge between the vehicle's CAN bus and the controller (100), forwarding vehicle data to controller (100) including the turn signal, wheel speed, acceleration, pitch, roll, yaw, Global Positioning System ("GPS") data, tire pressure, fuel level, sonar, brake torque, and others." [0125] "Controller (100) also receives inputs from an instrument cluster (84) and can provide human-perceptible outputs to a human operator via human-machine interface ("HMI") display(s) (86), an audible annunciator, a loudspeaker and/or other means." Fig 4, Controller 100(1)-100(3) function as a control system, and LIDAR 70(1), Stereo Camera (74), Steering Sensor 78, …, function as target systems; Fig 13, Advanced SoC 100; see also Figs 3 and 5-6.

Ditty does not elaborate on periodically predicting a wireless parameter indicative of future wireless communication quality. However, Veijalainen discloses: "a prediction unit configured to periodically predict a wireless parameter indicative of future wireless communication quality based on information related to a wireless communication device and environment information that affects the wireless communication quality, in a predetermined cycle": Veijalainen discloses periodically predicting future RSSP at time i over a time period T in a predetermined cycle (Fig 7, NxM and NxL durations). [Figure omitted: Veijalainen Fig 7, NxM and NxL durations.]

Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to integrate Ditty's method for autonomous vehicle control with Veijalainen's method for evaluating and predicting network properties, the motivation being to enable evaluation and control of predictive machine learning models in mobile networks (Veijalainen, [0036]).

Regarding Claim 2, Ditty discloses: "wherein the target device includes at least one of a base station or a terminal, and wherein a wireless parameter for at least one of the base station or the terminal is controlled": Ditty discloses V2V and I2V communication technology comprising an on-board base station and terminals to control a target device: [0022] "Cooperative Adaptive Cruise Control ("CACC") uses information from other vehicles. This information may be received through an antenna and a modem directly from other vehicles (in proximity), via wireless link, or indirectly, from a network connection. Direct links may be provided by vehicle-to-vehicle ("V2V") communication link, while indirect links are often referred to as infrastructure-to-vehicle ("I2V") links. In general, the V2V communication concept provides information about the immediately preceding vehicles (i.e., vehicles immediately ahead of and in the same lane as the ego vehicle), while the I2V communication concept provides information about traffic further ahead." Figs 42, 48, 64 and 65-68.

Regarding Claim 3, Ditty discloses: "wherein the target device includes a reflector, and wherein at least one of a radio wave reflection direction or radio wave reflection power of the reflector is controlled based on the future wireless communication quality": Ditty discloses LIDAR that collects reflected radio wave information: [0145] "In another embodiment, newer LIDAR technologies, such as 3D Flash LIDAR, may also be used. 3D Flash LIDAR uses a flash of a laser as a transmission source, to illuminate vehicle surroundings approximately 200 m. A Flash LIDAR unit includes a receptor, which records the laser pulse transit time and the reflected light on each pixel, which in turn corresponds to the range from the vehicle to the objects. Flash LIDAR allows highly accurate and distortion-free images of the surroundings to be generated with every laser flash. In a preferred embodiment, four Flash LIDARs are deployed, one at each side of the autonomous vehicle." Figs 42, 48-49, 64 and 65-68.

Regarding Claim 4, Ditty discloses: "wherein the target device includes a movable base station, and wherein a position of the movable base station is controlled based on the future wireless communication quality": Ditty discloses V2V and I2V communication technology comprising an on-board base station: [0022], quoted above in the rejection of Claim 2. Figs 42, 48, 64 and 65-68.

Regarding Claim 5, Ditty discloses: "wherein the target system includes a video system for control and a self-driving vehicle traveling system, and wherein a video for controlling a self-driving vehicle and traveling of the self-driving vehicle are controlled based on the future wireless communication quality": Ditty discloses a video system for path planning and control: [0139] "In a preferred embodiment, all cameras record and provide video information simultaneously." [0131] "Front-facing cameras help identify forward facing paths and obstacles, and provide information critical to making an occupancy grid and determining the preferred vehicle paths. Front-facing cameras may be used to perform many of the same ADAS functions as LIDAR, including emergency braking, pedestrian detection, and collision avoidance. Front-facing cameras may also be used for ADAS functions and systems including Lane Departure Warnings ("LDW"), and Autonomous Cruise Control ("ACC"), and other functions such as traffic sign recognition." Figs 42, 48, 64 and 65-68.

Regarding Claim 6, Ditty discloses: "wherein the environment information includes at least one of video information captured by a camera, sensor information sensed by a sensor, or map information acquired from a map information database": see [0012] "The success of Level 1 and Level 2 ADAS products, coupled with the promise of dramatic increases in traffic safety and convenience, have driven investments in self-driving vehicle technology. Yet despite that immense investment, no vehicle is available today that provides Level 4 or Level 5 functionality and meets industry safety standards, and autonomous driving remains one of the world's most challenging computational problems. Very large amounts of data from cameras, RADAR, LIDAR, and HD-Maps must be processed to generate commands to control the car safely and comfortably in real-time. Ensuring that cars can react correctly in a fraction of a second to constant- and rapidly-changing circumstances requires interpreting the torrent of data rushing at it from a vast range of sensors, such as cameras, RADAR, LIDAR and ultrasonic sensors." Figs 42, 48 and 65-68.

Regarding Claim 7, Ditty discloses: "wherein the information related to the wireless communication device includes received power information for the wireless communication device and object information obtained by detecting an object around the wireless communication device by wireless sensing": Ditty discloses V2V and I2V communication technology comprising an on-board wireless communication device: [0022], quoted above in the rejection of Claim 2. Figs 42, 48, 64 and 65-68.

Regarding Claim 8, the claim recites similar features as Claim 1 and is rejected accordingly.

Regarding Claim 9, the claim recites similar features as Claim 1 and is rejected accordingly.

Conclusion

THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any extension fee pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to Jung-Jen Liu, whose telephone number is 571-270-7643. The examiner can normally be reached Monday to Friday, 9:00 AM to 5:00 PM. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Kwang B. Yao, can be reached at 571-272-3182. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/JUNG LIU/
Primary Examiner, Art Unit 2473

Prosecution Timeline

Aug 30, 2023: Application Filed
Oct 24, 2025: Non-Final Rejection — §103
Jan 20, 2026: Response Filed
Mar 05, 2026: Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12604207
APPARATUSES AND METHODS FOR FACILITATING AN INTEGRATED NETWORK AND SYSTEM PLANNING TOOL
2y 5m to grant • Granted Apr 14, 2026
Patent 12604332
TECHNIQUES FOR SENDING ASSISTANCE INFORMATION FOR CANCELLING INTERFERENCE IN WIRELESS COMMUNICATIONS
2y 5m to grant • Granted Apr 14, 2026
Patent 12603695
Reconfigurable Wireless Radio System for Providing Highly Sensitive Nationwide Radar Functionality Using a Limited Number of Frequencies and Adaptable Hardware
2y 5m to grant • Granted Apr 14, 2026
Patent 12593241
COMPUTERIZED SYSTEMS AND METHODS FOR NON-DISRUPTIVE CAC ON A NETWORK VIA MLO FUNCTIONALITY
2y 5m to grant • Granted Mar 31, 2026
Patent 12593375
METHOD OF DETECTING STATUS OF USER OUTSIDE VEHICLE, SYSTEM, STORAGE MEDIUM AND VEHICLE THEREOF
2y 5m to grant • Granted Mar 31, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 89%
With Interview: 94% (+4.7%)
Median Time to Grant: 2y 7m
PTA Risk: Moderate
Based on 1198 resolved cases by this examiner. Grant probability derived from career allow rate.
