Prosecution Insights
Last updated: April 17, 2026
Application No. 18/469,550

SYSTEM FOR TRACKING SHOOTING PERFORMANCE INCLUDING A LIVE CAMERA FOR CALCULATING SCORING

Status: Non-Final OA, §103
Filed: Sep 18, 2023
Examiner: RATCLIFFE, LUKE D
Art Unit: 3645
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: unknown
OA Round: 1 (Non-Final)

Grant Probability: 87% (Favorable)
Expected OA Rounds: 1-2
Median Time to Grant: 2y 11m
Grant Probability With Interview: 98%

Examiner Intelligence

Career Allow Rate: 87% (above average; 1476 granted / 1690 resolved; +35.3% vs TC avg)
Interview Lift: +10.2% across resolved cases with interview (moderate lift)
Typical Timeline: 2y 11m avg prosecution; 43 currently pending
Career History: 1733 total applications across all art units
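The headline figures in this card can be reproduced from the reported counts. A minimal sketch in Python, assuming the grant probability is simply the career allow rate and that the "with interview" figure adds the reported +10.2-point lift (both are assumptions for illustration, not the tool's documented methodology):

```python
# Reported examiner counts from the dashboard.
granted = 1476
resolved = 1690

# Career allow rate: share of resolved cases that granted.
allow_rate = granted / resolved * 100
print(f"Career allow rate: {allow_rate:.1f}%")   # ≈ 87.3%

# Reported interview lift, in percentage points, among
# resolved cases that included an examiner interview.
interview_lift = 10.2
with_interview = allow_rate + interview_lift
print(f"With interview: {with_interview:.0f}%")  # ≈ 98%
```

The dashboard's 87% and 98% figures round out of these two steps, which matches the footer note that grant probability is derived from the career allow rate.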

Statute-Specific Performance

§101: 2.3% (-37.7% vs TC avg)
§103: 50.2% (+10.2% vs TC avg)
§102: 26.3% (-13.7% vs TC avg)
§112: 13.6% (-26.4% vs TC avg)
Black line = Tech Center average estimate • Based on career data from 1690 resolved cases
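If each "vs TC avg" figure is read as a simple percentage-point difference (an assumption; the page does not state the formula), the implied Tech Center baseline can be recovered by subtracting the delta from each rate:

```python
# Reported per-statute rates and their deltas vs the Tech Center
# average, both in percentage points, as listed above.
stats = {
    "101": (2.3, -37.7),
    "103": (50.2, 10.2),
    "102": (26.3, -13.7),
    "112": (13.6, -26.4),
}

# Implied TC baseline for each statute: rate minus delta.
for statute, (rate, delta) in stats.items():
    baseline = round(rate - delta, 1)
    print(f"§{statute}: implied TC average {baseline}%")
```

Under that reading, all four statutes imply the same 40.0% baseline, which would be consistent with the single "Tech Center average estimate" line mentioned in the caption rather than per-statute averages.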

Office Action

§103
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Rejections - 35 U.S.C. § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The claims below are rejected under 35 U.S.C. 103 as being unpatentable over Stender (20030082502) in view of Chung (20200038743).

Referring to claim 1, Stender shows a firearm analysis system for use with a firearm with a trigger comprising: a shot detection device capable of detecting when a trigger is pulled on a firearm and a projectile is expelled (see figure 1, Ref 30); a camera system that receives shot detection from the shot detection device (see figure 1, Ref 26), wherein: the camera system consists of an image sensor (see figure 2, Ref 26), lens (see the lens, Ref 24), and eyepiece (note the eyepiece attached to the lens); and when a camera system receives information indicating that a firearm has been shot (see the communication wire, Ref 54; also see figure 9, Refs 907 and 910), the camera system stores image data via the image sensor that captures the moment when a projectile strikes the target (see figure 9, Ref 910); the stored images of the target strike are compared to base images of the target that are stored when the camera system is set up (see paragraph 11); the target strike images are compared to the base images to determine whether the target was struck by a projectile (see paragraph 11), wherein: if the comparison determines that no projectile has struck the target, the camera system registers the shot as a miss (see figure 10B, Ref 1049; also see paragraph 72); if a strike is detected, the camera system registers that the shot resulted in a target strike (see figure 9, Ref 918), then the target strike images are compared to a scoring target that assigns a score to each shot that hits the target (see figure 9, Ref 921); a user device for receiving and transmitting information between the shot detection device and the camera system (see figure 1, note the unreferenced computer), wherein: the user device displays information indicating a score for the shot that hit the target (see figure 9, Ref 923) or information that indicates a miss (see paragraph 72: "in step 1047 a message is given to the user that the system is unable to automatically detect the shot").

However, Stender fails to specifically show the camera system storing the moment when a projectile strikes the target. Chung shows a similar projectile image capturing device that includes a camera system that stores the moment when a projectile strikes the target (see paragraph 126). It would have been obvious to include capture of the image at the moment when a projectile strikes the target as shown by Chung because this allows for the transmission of the impact to the user, spectators, and opponents as taught by Chung.

Referring to claim 2, Stender fails to show, but Chung shows, the camera system storing the image data continuously, with the capturing done in a moment before and after the trigger was pulled (see paragraph 126). It would have been obvious to include the continuously captured image data, stored a moment before and after the trigger was pulled, because this allows for a real-time view of the impact of the projectile to the user, opponent, and spectators as taught by Chung. With the modification of the display of the real-time target hits, it would have been obvious to include the trigger pull as the trigger for storing the image when used in a device for tracking projectiles from a firearm.

Referring to claim 4, Stender shows the camera system can automatically recognize, zoom, and focus on a target (see figure 5, Refs 514-520; also see paragraph 66).

Referring to claim 5, Stender shows the target does not contain any generic identifying marks (see figure 1, Ref 16).

Referring to claim 6, Stender shows the shot detection device measures the recoil of a firearm (see paragraph 62, note the motion or shock sensor).

Referring to claim 7, Stender shows the shot detection device detects the sound of a firearm firing (see paragraph 62, note the microphone).

Referring to claim 8, Stender shows the camera system performs image recognition on the target to identify target strikes and misses (see paragraph 72, note the comparison of the pixels in the images to identify an impact by a projectile).

Referring to claim 9, Stender shows the image processing enhances the captured image data (see paragraph 71, note the distinguishing colors that enhance the image).

Referring to claim 10, Stender inherently shows the image processing is performed by a mathematical algorithm (note that any image processing algorithm inherently includes a mathematical algorithm).

Referring to claim 15, Stender shows the comparison step is performed by the user device (note the computer device that performs the comparison shown in figure 10B, Ref 1049).

Referring to claim 16, Stender shows the image data is transferred from the image sensor to the user device for the storing, capturing, and comparison steps, where the user device performs the image processing and scoring based on the image data (see paragraph 73, note the storing and archiving of images and scoring data).

Claim 3 is rejected under 35 U.S.C. 103 as being unpatentable over Stender (20030082502) in view of Chung (20200038743) and Clark (20090277065).

Referring to claim 3, Stender fails to show, but Clark shows, the continuously stored image data being stored in a circular buffer (see paragraph 66). It would have been obvious to include the circular buffer as shown by Clark because it is extremely common to temporarily store video data in a loop for a time needed in the future.

Claim 11 is rejected under 35 U.S.C. 103 as being unpatentable over Stender (20030082502) in view of Chung (20200038743) and Tamir (10551148).

Referring to claim 11, Stender fails to show, but Tamir shows, the image processing mathematical algorithm being performed by AI or machine learning (see column 18, lines 39-43). It would have been obvious to include the AI or machine learning as shown by Tamir because this allows for stronger computational algorithms to determine target impact.

Claims 12-14 and 17-20 are rejected under 35 U.S.C. 103 as being unpatentable over Stender (20030082502) in view of Chung (20200038743) and Valdez (20200398168).

Referring to claim 12, Stender fails to show, but Valdez shows, the scores being shared on a social network (see paragraph 40). It would have been obvious to include the shared social network scores because this allows for competition between remote competitors.

Referring to claims 13 and 14, Stender fails to show, but Valdez shows, the online contest being one of a city, town, county, state, country, or global contest (see paragraph 40). It would have been obvious to include an online contest as shown by Valdez because this allows for remote competition.

Referring to claim 17, Stender shows the camera system connected to the user device displays a real-time video feed from the camera system; when the camera system receives a firing signal, the user device captures an image from the real-time video stream; and the captured image is subsequently processed for scoring (see figure 1, note the connection from the hub, Ref 32, to the camera, Ref 26, with the wired connections 54 and 58; also see paragraph 62, and note the option to have a wireless link as shown in paragraph 62). However, Stender fails to specifically show establishing a point-to-point WiFi connection between the camera system and the user device. Valdez shows a similar device that includes establishing a point-to-point WiFi connection between the camera system and the user device (see paragraph 35). It would have been obvious to include the WiFi link as shown by Valdez because this is an extremely common wireless connection and adds no new or unexpected results.

Referring to claim 18, Stender shows the image captured from the real-time video stream is utilized for the scoring process (see paragraph 72, note the scoring performed by pixel comparison).

Referring to claim 19, the combination of Stender and Valdez shows the point-to-point WiFi connection enables direct communication between the camera system and the user device without the need for intermediary networks (note the point-to-point WiFi of Valdez in paragraph 35). It would have been obvious to remove the need for intermediary networks because this reduces latency between the video and the scoring; this is well known and adds no new or unexpected results.

Referring to claim 20, the combination of Stender and Valdez shows the real-time video stream provides an uninterrupted view of the target, allowing the user to monitor the target in real time prior to and post firing (see Stender, Ref 511; also see figure 9, Ref 923). It would have been obvious to include the prior-to- and post-firing images as shown by Stender because this allows the user to confirm video settings and compare the most recent shots to previous shots.

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Caldwell "Ballistic Precision LR Target Camera System" is considered close prior art. Included is a copy of the website from Caldwell, dated 5/22/2022, which includes live-streaming HD video footage connected to a smartphone or tablet and allows for scoring.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to LUKE D RATCLIFFE, whose telephone number is (571) 272-3110. The examiner can normally be reached M-F 9:00AM-5:00PM EST. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Isam Alsomiri, can be reached at 571-272-6970. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/LUKE D RATCLIFFE/
Primary Examiner, Art Unit 3645

Prosecution Timeline

Sep 18, 2023
Application Filed
Sep 17, 2024
Response after Non-Final Action
Mar 24, 2026
Non-Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12591049: TRANSMIT SIGNAL DESIGN FOR AN OPTICAL DISTANCE MEASUREMENT SYSTEM (2y 5m to grant; granted Mar 31, 2026)
Patent 12590798: Multi-sensor depth mapping (2y 5m to grant; granted Mar 31, 2026)
Patent 12585021: ADDRESSABLE PROJECTOR FOR DOT BASED DIRECT TIME OF FLIGHT DEPTH SENSING (2y 5m to grant; granted Mar 24, 2026)
Patent 12578475: Processing Of Lidar Images (2y 5m to grant; granted Mar 17, 2026)
Patent 12571893: DISTANCE MEASURING APPARATUS AND METHOD OF DETERMINING DIRT ON WINDOW (2y 5m to grant; granted Mar 10, 2026)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 87%
With Interview: 98% (+10.2%)
Median Time to Grant: 2y 11m
PTA Risk: Low
Based on 1690 resolved cases by this examiner. Grant probability derived from career allow rate.
