Prosecution Insights
Last updated: April 19, 2026
Application No. 18/988,466

APPARATUS, SYSTEM AND METHOD OF INFORMATION PROCESSING, AND STORAGE MEDIUM

Non-Final OA: §102, §103, §112
Filed: Dec 19, 2024
Examiner: HAGHANI, SHADAN E
Art Unit: 2485
Tech Center: 2400 (Computer Networks)
Assignee: Canon Kabushiki Kaisha
OA Round: 1 (Non-Final)
Grant Probability: 60% (Moderate)
OA Rounds: 1-2
To Grant: 2y 11m
With Interview: 79%

Examiner Intelligence

Grants 60% of resolved cases.

Career Allow Rate: 60% (221 granted / 366 resolved; +2.4% vs TC avg)
Interview Lift: +18.6% (strong), for resolved cases with an interview versus without
Avg Prosecution: 2y 11m typical timeline; 33 applications currently pending
Career History: 399 total applications across all art units
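The headline figures above are simple ratios over the examiner's career data. A minimal sketch of the arithmetic, assuming the interview lift is applied additively to the career allow rate (an assumption; the tool's exact methodology is not documented here):

```python
# Sketch of the examiner-intelligence arithmetic from the figures above.
# The additive interview-lift model is an assumption, not the tool's
# documented methodology.

granted = 221          # cases granted
resolved = 366         # cases resolved (granted + abandoned)

allow_rate = granted / resolved      # career allow rate
interview_lift = 0.186               # reported lift with an interview

with_interview = allow_rate + interview_lift

print(f"career allow rate: {allow_rate:.1%}")     # 60.4%
print(f"with interview:    {with_interview:.1%}")  # 79.0%
```

The page's 60% and 79% headline figures are these values rounded.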

Statute-Specific Performance

§101: 2.1% (-37.9% vs TC avg)
§103: 60.3% (+20.3% vs TC avg)
§102: 13.8% (-26.2% vs TC avg)
§112: 16.1% (-23.9% vs TC avg)
Tech Center averages are estimates. Based on career data from 366 resolved cases.
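The four deltas above are all consistent with a single Tech Center baseline. A quick check, assuming each delta was computed as the examiner's rate minus the TC average (an assumption about the tool's arithmetic):

```python
# Back out the implied Tech Center average from each (rate, delta vs TC)
# pair shown above. tc_avg = rate - delta is an assumed relationship.
stats = {
    "101": (0.021, -0.379),
    "103": (0.603, +0.203),
    "102": (0.138, -0.262),
    "112": (0.161, -0.239),
}
for statute, (rate, delta) in stats.items():
    tc_avg = rate - delta
    print(f"section {statute}: examiner {rate:.1%}, implied TC avg {tc_avg:.1%}")
# every statute implies a TC average of roughly 40%
```

That all four statutes back out to about 40% suggests the chart's Tech Center average estimate is a single baseline line rather than a per-statute figure.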

Office Action

Rejections: §102, §103, §112
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Objections

Claim 11 is objected to because of the following informalities: “tracking targe” is a typo. Appropriate correction is required.

Claim 18 is objected to because of the following informalities: it depends on itself, likely due to a typo. Appropriate correction is required.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claim 13 is rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention. “Content of control” is unclear and has no definition in the specification.

Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless –

(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention. 
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.

Claim(s) 1, 3-8, 12, 16-18 are rejected under 35 U.S.C. 102(a)(2) as being anticipated by Parikh (US Patent 12,488,595).

Regarding Claim 1, Parikh (US Patent 12,488,595) discloses an information processing apparatus (process managing object tracking; Column 7 lines 58-end; computing device 2000, Column 8 lines 10-15; Figs. 1, 20) comprising: a memory (memory 2004, Fig. 20) and at least one processor (CPU 2006/GPU 2008, Fig. 20) which function as: a detection unit (tracking component 102 using sensor data such as image data, Column 8 lines 14-36) configured to detect a plurality of subjects (object data 104 may represent feature points associated with image data, Column 8 lines 14-36; detection data 106 representing identifiers of objects detected by tracking component 102, Column 8 lines 36-50) from an image (image data, Column 8 lines 14-36); a selection unit (selection component 118 processing the object sorting data 116, Column 11 lines 15-30) configured to select a subject as a tracking target (selection component 118 may select a threshold number of the detected objects that includes the highest scores for tracking, Column 11 lines 15-30); a tracking unit (tracking component 102, Column 14 lines 8-23) configured to track the subject (create a new track, terminate a track, update a state of a detected object, Column 14 lines 8-23) as the tracking target selected by the selection unit (using the selection data 120 from selection component 118, Column 13 line 64 – Column 14 line 23) using information about the plurality of subjects (scoring component 112 of the track-management component 108 determining scores for the detected objects 
based on class, distance, direction, speed, TTC, confidence, acceleration…, Column 9 line 4 – Column 10 line 27) detected by the detection unit (detection data 106 representing identifiers of objects detected by tracking component 102, Column 8 lines 36-50); a counting unit configured to count the number of subjects currently being tracked by the tracking unit (determining that the number of objects satisfies the threshold number of objects, Column 9 lines 58-62; determining that a number of the objects is greater than a threshold number of objects, Column 33 lines 10-15); and a notification unit configured to notify (outputs selection data, Column 13 lines 64-end) a tracking state by the tracking unit (to the tracking component 102, Column 14 lines 1-7), wherein the notification unit notifies the tracking state (selection component 118 may then send the selection data 120 to the tracking component 102, Column 14 lines 1-7) according to the number of subjects selected by the selection unit (identifiers of objects selected for tracking, Column 13 line 64 – Column 14 line 7) and the number of subjects counted by the counting unit (identifiers of objects selected for tracking, and identifiers of objects not selected for tracking, Column 13 line 64 – Column 14 line 7).

Regarding Claim 3, Parikh (US Patent 12,488,595) discloses the information processing apparatus according to claim 1, wherein the notification unit is configured to notify, as the tracking state (selection component 118 may then send the selection data 120 to the tracking component 102, Column 14 lines 1-7), the number of subjects selected by the selection unit (identifiers of objects selected for tracking, Column 13 line 64 – Column 14 line 7) and the number of subjects counted by the counting unit (identifiers of objects selected for tracking, and identifiers of objects not selected for tracking, Column 13 line 64 – Column 14 line 7). 
Regarding Claim 4, Parikh (US Patent 12,488,595) discloses the information processing apparatus according to claim 1, wherein the selection unit is configured to select the plurality of subjects detected by the detection unit based on a plurality of subjects being continuously detected for a predetermined time period or a predetermined number of times (object existence confidence – i.e., determining object does exist, based on threshold number of images, e.g., one image, two images, five images, ten images, Column 28 lines 1-24), and exclude from the plurality of subjects any subjects not being detected for the predetermined time period or the predetermined number of times (object existence confidence – i.e., determining object does not exist, based on threshold number of images, e.g., one image, two images, five images, ten images, Column 28 lines 1-24).

Regarding Claim 5, Parikh (US Patent 12,488,595) discloses the information processing apparatus according to claim 1, further comprising a subject management unit configured to identify and manage a subject (detected object parameters are scored by the scoring component 112 of the track management component 108 based on parameters, Column 9 line 4 – Column 10 line 27), wherein a plurality of subjects selected by the selection unit from subject information stored by the subject management unit is compared with a subject detected by the detection unit (detected object parameters are scored by the scoring component 112 of the track management component 108 based on parameters, Column 9 line 4 – Column 10 line 27), and when each of the plurality of subjects selected by the selection unit from the subject information stored by the subject management unit does not match the subject detected by the detection unit (object is selected for tracking, not selected for tracking, Column 13 line 64 – Column 14 line 7), a notification is issued by the notification unit (outputting selection data, send the selection data, Column 
13 line 64 – Column 14 line 7).

Regarding Claim 6, Parikh (US Patent 12,488,595) discloses the information processing apparatus according to claim 5, wherein the selection unit is configured to compare the subject information stored by the subject management unit with the subject detected by the detection unit (detected object parameters are scored by the scoring component 112 of the track management component 108 based on parameters, Column 9 line 4 – Column 10 line 27), and when the subject information corresponds to the subject (subject score is above score threshold, or subject priority/rank is above count threshold, Column 11 lines 18 – end), the subject is selected as the tracking target (selection component 118 selects objects to be tracked based on scores, priorities, and score thresholds, and count thresholds, Column 11 lines 18 – end).

Regarding Claim 7, Parikh (US Patent 12,488,595) discloses the information processing apparatus according to claim 1, further comprising a subject management unit (track management component, Column 5 lines 38-40) configured to preregister and manage subject information including a characteristic of the subject (parameters associated with a detected object may include a classification of the detected object, e.g., pedestrian, animal, building, vehicle, sign, curb, driving surface, etc., a distance to the detected object from the vehicle, a direction to the detected object from the vehicle, whether the detected object is located along the path of the vehicle, a time-to-collision (TTC) associated with the detected object, a confidence indicating whether the detected object is an actual object, a velocity of the detected object, an acceleration of the detected object, a direction of travel of the detected object, a lane that the detected object is navigating, Column 5 lines 35-60), wherein the selection unit is configured to select the tracking target (selection component 118 selects objects to be tracked based on scores, 
priorities, and score thresholds, and count thresholds, Column 11 lines 18 – end) based on the subject information registered with the subject management unit (track management component calculates scores for the detected objects based on parameters, where the scores are associated with the priorities, Column 5 lines 58-61).

Regarding Claim 8, Parikh (US Patent 12,488,595) discloses the information processing apparatus according to claim 1, further comprising a subject management unit (track management component, Column 5 lines 38-40) configured to preregister and manage subject information including a characteristic of the subject (parameters associated with a detected object may include a classification of the detected object, e.g., pedestrian, animal, building, vehicle, sign, curb, driving surface, etc., a distance to the detected object from the vehicle, a direction to the detected object from the vehicle, whether the detected object is located along the path of the vehicle, a time-to-collision (TTC) associated with the detected object, a confidence indicating whether the detected object is an actual object, a velocity of the detected object, an acceleration of the detected object, a direction of travel of the detected object, a lane that the detected object is navigating, Column 5 lines 35-60), wherein priority is set to a plurality of the subjects (prioritize objects based on parameters, Column 5 lines 38-40) registered with the subject management unit (track management component parameters, Column 5 lines 38-40).

Regarding Claim 12, Parikh (US Patent 12,488,595) discloses the information processing apparatus according to claim 1, wherein the tracking unit is configured to delete the subject from the tracking targets (tracking component 102 may terminate the current track associated with the detected object, Column 14 lines 8-22).

Regarding Claim 16, the claim is rejected on the grounds provided in Claim 1. 
Regarding Claim 17, the claim is rejected on the grounds provided in Claim 1.

Regarding Claim 18, Parikh (US Patent 12,488,595) discloses a non-transitory computer-readable storage medium storing a program (functions being performed in software, Column 15 lines 40-43) according to claim 18 (grounds in Claim 1).

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claim(s) 2, 13, 15 are rejected under 35 U.S.C. 103 as being unpatentable over Parikh (US Patent 12,488,595) in view of Kerofsky (US PG Publication 2025/0392778).

Regarding Claim 2, Parikh (US Patent 12,488,595) discloses the information processing apparatus according to claim 1. Parikh does not disclose but Kerofsky (US PG Publication 2025/0392778) teaches wherein the notification unit is configured to change content of a notification between a case where the number of subjects selected by the selection unit matches the number of subjects being tracked by the tracking unit and a case where the number of subjects selected by the selection unit does not match the number of subjects being tracked by the tracking unit (display a visual clue or indication to indicate that the tracking of the object was lost [0051]). 
One of ordinary skill in the art before the application was filed would have been motivated to supplement the tracking system of Parikh with a user interface to notify the operator of a track-lost state because Parikh is directed to, among other uses, semi-autonomous systems, Column 4 lines 22-47, and informing the operator of lost targets is essential for maintaining integrity and safety of electro-mechanical systems where computer vision fails and manual intervention is required.

Regarding Claim 13, Parikh (US Patent 12,488,595) discloses the information processing apparatus according to claim 1. Parikh does not disclose but Kerofsky (US PG Publication 2025/0392778) teaches wherein the tracking unit is configured to change content of control according to a number of subjects selected by the selection unit (may display a visual clue or indication to indicate that the tracking of the object was lost [0051]).

One of ordinary skill in the art before the application was filed would have been motivated to supplement the tracking system of Parikh with a user interface to notify the operator of a track-lost state because Parikh is directed to, among other uses, semi-autonomous systems, Column 4 lines 22-47, and informing the operator of lost targets is essential for maintaining integrity and safety of electro-mechanical systems where computer vision fails and manual intervention is required.

Regarding Claim 15, Parikh (US Patent 12,488,595) discloses the information processing apparatus according to claim 1. Parikh does not disclose but Kerofsky (US PG Publication 2025/0392778) teaches wherein the notification unit is configured to notify the tracking state by changing a lighting pattern of a tally lamp (display a visual clue or indication to indicate that the tracking of the object was lost [0051]). 
One of ordinary skill in the art before the application was filed would have been motivated to supplement the tracking system of Parikh with a user interface to notify the operator of a track-lost state because Parikh is directed to, among other uses, semi-autonomous systems, Column 4 lines 22-47, and informing the operator of lost targets is essential for maintaining integrity and safety of electro-mechanical systems where computer vision fails and manual intervention is required.

Claim(s) 9-10, 14 are rejected under 35 U.S.C. 103 as being unpatentable over Parikh (US Patent 12,488,595) in view of Liu (US PG Publication 2023/0056334 A1).

Regarding Claim 9, Parikh (US Patent 12,488,595) discloses the information processing apparatus according to claim 1 wherein the notification unit is configured to select a main subject as the tracking target (prioritize objects based on parameters, Column 5 lines 38-40: i.e., the highest ranked detected object). Parikh does not disclose but Liu (US PG Publication 2023/0056334 A1) teaches distinguishably notify of tracking states of the main subject (identification of main photographed object [0078]) and other subjects (callout frame is used to mark out the target object [0044]; identifications of the multiple photographed objects [0078]).

One of ordinary skill in the art before the application was filed would have been motivated to supplement Parikh with the ability to mark and follow a primary target because Liu teaches that automating image capture can reduce the need for manual labor performing redundant tasks [0002], maintaining quality and improving convenience.

Regarding Claim 10, Parikh (US Patent 12,488,595) discloses the information processing apparatus according to claim 1. 
Parikh does not disclose but Liu (US PG Publication 2023/0056334 A1) teaches wherein the notification unit is configured to distinguishably notify (callout frame [0044]) a region of a subject detected by the detection unit (identify these target objects from the video data through a target detection algorithm [0044]) and not selected by the selection unit (the object that needs to be tracked is only several of the multiple targets [0044]) and a region of a subject detected by the detection unit and selected by the selection unit (callout frame is used to mark out the target object [0044]).

One of ordinary skill in the art before the application was filed would have been motivated to supplement Parikh with the ability to mark and follow a primary target because Liu teaches that automating image capture can reduce the need for manual labor performing redundant tasks [0002], maintaining quality and improving convenience.

Regarding Claim 14, Parikh (US Patent 12,488,595) discloses the information processing apparatus according to claim 13. Parikh does not disclose but Liu (US PG Publication 2023/0056334 A1) teaches wherein the tracking unit is configured to use at least one of pan control, tilt control, and zoom control (video acquisition device 23 is controlled to move left and right, to move back and forth and to rotate [0065]) to track the subject (control the video acquisition device 23 to track and shoot the main photographed object [0078]).

One of ordinary skill in the art before the application was filed would have been motivated to supplement Parikh with the ability to mark and follow a primary target because Liu teaches that automating image capture can reduce the need for manual labor performing redundant tasks [0002], maintaining quality and improving convenience.

Claim(s) 11 is rejected under 35 U.S.C. 103 as being unpatentable over Parikh (US Patent 12,488,595) in view of Kim (US 20160140391 A1). 
Regarding Claim 11, Parikh (US Patent 12,488,595) discloses the information processing apparatus according to claim 1, wherein in response to a [] instruction for the subject as the tracking targe (track-management component 108 does not select a detected object, Column 14 lines 1-23), the tracking unit is configured to delete the subject from the tracking targets (tracking component 102 may terminate the current track associated with the detected object, Column 14 lines 1-23). Parikh does not disclose but Kim (US 20160140391 A1) teaches user’s instruction (The user may select from the objects using an input device such as a touch screen that also displays the scene including the objects and optional indicators [0002]).

Kim provides evidence that it was routine in the art to draw boxes around the detected objects, distinguishing them from non-objects [0002], enabling users to recognize which objects are trackable in captured video and select which objects to track [0002], giving users control over the video capture process.

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure:

Feng (US 20240037759 A1) – for new, found, or lost targets, mark the corresponding information on the display device
Aman (US 2016/0008695 A1) – automatic sports video capture with multiple object tracking system
Fujita (US 2011/0019027) – highlight tracked targets and unhighlight track-lost targets

Any inquiry concerning this communication or earlier communications from the examiner should be directed to SHADAN E HAGHANI whose telephone number is (571)270-5631. The examiner can normally be reached M-F 9AM - 5PM.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. 
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Jay Patel, can be reached at 571-272-2988. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free).

If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/SHADAN E HAGHANI/
Examiner, Art Unit 2485

Prosecution Timeline

Dec 19, 2024: Application Filed
Jan 09, 2026: Non-Final Rejection, §102/§103/§112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12604020: VIDEO DECODING METHOD AND DECODER DEVICE (granted Apr 14, 2026; 2y 5m to grant)
Patent 12598323: INTER PREDICTION-BASED VIDEO ENCODING AND DECODING (granted Apr 07, 2026; 2y 5m to grant)
Patent 12586336: WEARABLE DEVICE, METHOD, AND NON-TRANSITORY COMPUTER READABLE STORAGE MEDIUM CONTROLLING LIGHT RADIATION OF LIGHT SOURCE (granted Mar 24, 2026; 2y 5m to grant)
Patent 12574549: CHROMA INTRA PREDICTION WITH FILTERING (granted Mar 10, 2026; 2y 5m to grant)
Patent 12568225: LIMITING A NUMBER OF CONTEXT CODED BINS FOR RESIDUE CODING (granted Mar 03, 2026; 2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 60%
With Interview: 79% (+18.6%)
Median Time to Grant: 2y 11m
PTA Risk: Low
Based on 366 resolved cases by this examiner. Grant probability derived from career allow rate.
