Prosecution Insights
Last updated: April 19, 2026
Application No. 18/306,802

CONTROL SYSTEM AND CONTROL METHOD FOR CONTROLLING ELECTRIC WALKING AID DEVICE

Status: Final Rejection (§103)
Filed: Apr 25, 2023
Examiner: KNUTSON, JACOB D
Art Unit: 3611
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Pegatron Corporation
OA Round: 2 (Final)

Grant Probability: 79% (Favorable)
Expected OA Rounds: 3-4
Time to Grant: 2y 9m
Grant Probability With Interview: 99%

Examiner Intelligence

Career Allow Rate: 79% — above average (824 granted / 1043 resolved; +27.0% vs TC avg)
Interview Lift: +21.0% — strong (resolved cases with interview)
Avg Prosecution: 2y 9m (typical timeline); 36 applications currently pending
Total Applications: 1079 (career history, across all art units)
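The headline percentages in this card reduce to simple arithmetic on the resolved-case counts shown above. A minimal sketch (variable names are ours for illustration; nothing here comes from a real dashboard API):

```python
# Examiner's career counts as reported above.
granted = 824
resolved = 1043

# Career allow rate: granted as a share of resolved cases.
allow_rate = granted / resolved          # ~0.790, displayed as 79%

# The "+27.0% vs TC avg" delta implies a Tech Center average of roughly:
tc_avg = allow_rate - 0.270              # ~0.520, about 52%

print(f"allow rate: {allow_rate:.1%}")   # allow rate: 79.0%
print(f"TC average: {tc_avg:.1%}")       # TC average: 52.0%
```

The same arithmetic applies to the statute-specific deltas below: each is the examiner's per-statute rate minus the estimated Tech Center average.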

Statute-Specific Performance

§101: 0.3% (-39.7% vs TC avg)
§103: 45.9% (+5.9% vs TC avg)
§102: 22.3% (-17.7% vs TC avg)
§112: 25.9% (-14.1% vs TC avg)

Deltas are vs the Tech Center average estimate • Based on career data from 1043 resolved cases

Office Action

§103
DETAILED ACTION

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1, 2, 4 – 6, 8 – 10, and 12 – 15 are rejected under 35 U.S.C. 103 as being unpatentable over Li et al. (CN 112869969 A) in view of Yang et al. (WO 2020073168 A1).

For claim 1, Li et al.
discloses a control system for controlling an electric walking aid device (electric wheelchair), comprising: [a panoramic camera disposed on the electric walking aid device, configured to capture a panoramic image around the electric walking aid device] (page 37, paragraph [n0073]); [a navigation information device (point cloud data acquisition equipment) disposed on the electric walking aid device, configured to generate navigation information] (page 31, paragraph [n0064] and page 39, paragraph [n0076]); and [a controller coupled to the panoramic camera and the navigation information device, configured to perform at least one of human detection and joint point detection of a user according to the panoramic image] (page 11, paragraph [n0024], page 25, paragraph [n0059], and page 35, paragraph [n0070]), [obtain a distance between the electric walking aid device and the user through point cloud information] (page 37, paragraph [n0073]), and [control the electric walking aid device to approach the user according to the navigation information] (page 35, paragraph [n0070]); but does not explicitly disclose wherein when the distance is less than a preset distance, the controller searches for a position behind the user’s back according to a detection result of the at least one of the human detection and the joint point detection, and controls the electric walking aid device to move toward the user’s back.

Yang et al.
discloses a wheelchair 130 [may include other sensors to obtain the spatial position information of the wheelchair, the posture information of the wheelchair, and determine whether the wheelchair is in contact with objects in the spatial environment, wherein other sensors may include an inertial sensor, force sensors, contact sensor, optical fiber sensor, Hall effect sensor, displacement sensor, or any combination thereof; wherein the inertial sensor may include inclination sensors, acceleration sensors, angular velocity sensors, attitude and heading reference system, inertia measuring units, etc., or any combination thereof] (pages 42 – 43, paragraph [0061]); [analysis module 530 may obtain known contour data of the target data from a memory 140; specifically, the known contour data may take form of pseudo-grayscale, point cloud, grid, or any combination thereof] (pages 76 and 77, paragraph [0093]); [the distance and/or distance of the target object relative to the wheelchair can also be determined directly through at least one image acquisition device (e.g., a depth camera)] (page 84, paragraph [0101]); and [based on the orientation of the target object relative to the wheelchair, the wheelchair can be controlled to rotate a certain angle according to the orientation such that the front of the wheelchair faces the target object] (page 87, paragraph [0104]).

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to additionally use the inertia measurement unit and depth camera of Yang et al. with the control system of Li et al. with a reasonable expectation of success because it would allow for lightweight and relatively inexpensive sensors, while providing continuous measurements, thus improving overall sensing of the environment.

For claim 2, Li et al.
modified as above discloses the control system [wherein the controller controls the electric walking aid device to move in a field such that the navigation information device generates the navigation information of the field] (page 35, paragraph [n0070]).

For claim 4, Li et al. modified as above discloses the control system [wherein the controller knows the user’s position in the panoramic image through the human detection, and the controller knows the user’s posture through the joint point detection] (page 25, paragraph [n0059]).

For claim 5, Li et al. modified as above discloses the control system wherein the navigation information comprises [acceleration information] (page 43, paragraph [0061] of Yang et al.) and [point cloud information] (page 37, paragraph [n0073]), wherein the navigation information device comprises: [an inertial measurement unit configured to obtain the acceleration information of the electric walking aid device as the electric walking aid device moves in a field] (pages 42 and 43, paragraph [0061] of Yang et al.); and [a depth camera configured to generate the point cloud information of the field] (page 84, paragraph [0101] of Yang et al.).

For claim 6, Li et al. modified as above discloses the control system [wherein the controller generates a real-time map according to the point cloud information and the acceleration information, wherein the real-time map corresponds to an environment of the field] (page 39, paragraph [n0076] of Li et al. and page 43, paragraph [0061] and page 84, paragraph [0101] of Yang et al.).

For claim 8, Li et al.
modified as above discloses the control system wherein: [the controller determines whether the user is sitting on the electric walking aid device according to a detection result of the at least one of the human detection and the joint point detection] (page 26, paragraph [n0060]), [when the controller determines that the user is sitting on the electric walking aid device, the controller controls the electric walking aid device to stop moving, and when the controller determines that the user is not sitting on the electric walking aid device, the controller controls the electric walking aid device to move toward the user’s back] (pages 25 – 27, paragraphs [n0060] and [n0061], wherein whether the electric walking aid device moves toward the user’s back depends upon the user and the ability of the user to reposition themselves).

For claim 9, Li et al. discloses a control method for controlling an electric walking aid device (electric wheelchair), comprising: [capturing a panoramic image around the electric walking aid device by a panoramic camera] (page 37, paragraph [n0073]), and [performing at least one of human detection and joint detection of the user according to the panoramic image] (page 11, paragraph [n0024] and page 25, paragraph [n0059]); [generating navigation information by a navigation information device; detecting a user according to the panoramic image] (page 31, paragraph [n0064] and page 39, paragraph [n0076]), and [controlling the electric walking aid device to approach the user according to the navigation information] (page 35, paragraph [n0070]); [obtaining a distance between the electric walking aid device and the user through point cloud information] (page 37, paragraph [n0073]); but does not explicitly disclose wherein when the distance is less than a preset distance, searching for a position behind the user’s back according to a detection result of the at least one of the human detection and the joint point detection, and controlling the electric walking aid
device to move toward the user’s back.

Yang et al. discloses a wheelchair 130 [may include other sensors to obtain the spatial position information of the wheelchair, the posture information of the wheelchair, and determine whether the wheelchair is in contact with objects in the spatial environment, wherein other sensors may include an inertial sensor, force sensors, contact sensor, optical fiber sensor, Hall effect sensor, displacement sensor, or any combination thereof; wherein the inertial sensor may include inclination sensors, acceleration sensors, angular velocity sensors, attitude and heading reference system, inertia measuring units, etc., or any combination thereof] (pages 42 – 43, paragraph [0061]); [analysis module 530 may obtain known contour data of the target data from a memory 140; specifically, the known contour data may take form of pseudo-grayscale, point cloud, grid, or any combination thereof] (pages 76 and 77, paragraph [0093]); [the distance and/or distance of the target object relative to the wheelchair can also be determined directly through at least one image acquisition device (e.g., a depth camera)] (page 84, paragraph [0101]); and [based on the orientation of the target object relative to the wheelchair, the wheelchair can be controlled to rotate a certain angle according to the orientation such that the front of the wheelchair faces the target object] (page 87, paragraph [0104]).

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to additionally use the inertia measurement unit and depth camera of Yang et al. with the control system of Li et al. with a reasonable expectation of success because it would allow for lightweight and relatively inexpensive sensors, while providing continuous measurements, thus improving overall sensing of the environment.

For claim 10, Li et al.
modified as above discloses the control method [wherein generating the navigation information by the navigation information device comprises: controlling the electric walking aid device to move in a field such that the navigation information device generates the navigation information of the field] (page 35, paragraph [n0070]).

For claim 12, Li et al. modified as above discloses the control method wherein performing the at least one of the human detection and the joint point detection according to the panoramic image comprises: [knowing the user’s position in the panoramic image through the human detection; and knowing the user’s posture through the joint point detection] (page 25, paragraph [n0059]).

For claim 13, Li et al. modified as above discloses the control method wherein the navigation information comprises acceleration information and point cloud information, wherein generating the navigation information by the navigation information device comprises: [obtaining the acceleration information as the electric walking aid device moves in a field] (pages 42 and 43, paragraph [0101] of Yang et al.); and [generating the point cloud information of the field] (page 37, paragraph [n0073]).

For claim 14, Li et al. modified as above discloses the control method further comprising: [generating a real-time map according to the point cloud information and the acceleration information, wherein the real-time map corresponds to an environment of the field] (page 39, paragraph [n0076] of Li et al. and page 43, paragraph [0061] and page 84, paragraph [0101] of Yang et al.).

For claim 15, Li et al.
modified as above discloses the control method further comprising: [determining whether the user is sitting on the electric walking aid device according to a detection result of the at least one of the human detection and the joint point detection] (page 26, paragraph [n0060]); [controlling the electric walking aid device to stop moving when that the user is sitting on the electric walking aid device is determined; and controlling the electric walking aid device to move toward the user’s back when that the user is not sitting on the electric walking aid device is determined] (pages 25 – 27, paragraphs [n0060] and [n0061], wherein whether the electric walking aid device moves toward the user’s back depends upon the user and the ability of the user to reposition themselves).

Response to Arguments

Applicant's arguments filed 11/28/25 have been fully considered but they are not persuasive. Applicant argues the prior art fails to disclose “when the distance is less than a preset distance, the controller searches for a position behind the user’s back according to a detection result of the at least one of the human detection and the joint point detection, and controls the electric walking aid device to move toward the user’s back”. Specifically, applicant argues “Yang teaches away by giving embodiments of the wheelchair moving towards the object in a face-to-face direction; likewise, Yang neither teaches nor implies at least one of the human detection and the joint point detection.” Applicant states Li and Yang do not read on “searches for a position behind the user’s back according to a detection result of the at least one of the human detection and the joint point detection”. However, Li et al. discloses [wherein a reference object may be a user themselves] (page 11, paragraph [n0024]). In addition, Yang et al. discloses a process of bringing a wheelchair closer to a target object.
Thus, when the wheelchair is brought closer to the user, the wheelchair is brought closer to the user’s back. The examples given by Li et al. as modified above are not limited to face-to-face examples, but may include examples of the wheelchair facing the user’s back based on the decision of the user. Therefore, based upon the decision of the user on their positioning relative to the wheelchair, the prior art reads upon the required claim limitations.

Conclusion

THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to Jacob D. Knutson, whose telephone number is (571) 270-5576. The examiner can normally be reached 8:00 am - 4:00 pm. Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Valentin Neacsu, can be reached at (571) 272-6265.
The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/JACOB D KNUTSON/
Primary Examiner, Art Unit 3611

Prosecution Timeline

Apr 25, 2023 — Application Filed
Sep 19, 2025 — Non-Final Rejection (§103)
Nov 28, 2025 — Response Filed
Feb 21, 2026 — Final Rejection (§103, current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12583504 — STEERING MECHANISM, STEERING SYSTEM, VEHICLE, AND CONTROL METHOD — granted Mar 24, 2026 (2y 5m to grant)
Patent 12565261 — MOTOR DRIVEN POWER STEERING SYSTEM OF REDUNDANCY STRUCTURE — granted Mar 03, 2026 (2y 5m to grant)
Patent 12565258 — RUDDER SYSTEM — granted Mar 03, 2026 (2y 5m to grant)
Patent 12559162 — STEERING APPARATUS — granted Feb 24, 2026 (2y 5m to grant)
Patent 12559163 — CONTROLLER FOR ROTARY ELECTRIC MACHINE, AND ELECTRIC POWER STEERING APPARATUS — granted Feb 24, 2026 (2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 79%
With Interview: 99% (+21.0%)
Median Time to Grant: 2y 9m
PTA Risk: Moderate
Based on 1043 resolved cases by this examiner. Grant probability derived from career allow rate.
