Prosecution Insights
Last updated: April 19, 2026
Application No. 18/437,291

A System for Tracking, Locating and Calculating the Position of a First Moving Object in Relation to a Second Object

Non-Final OA: §103, §112, Double Patenting
Filed: Feb 09, 2024
Examiner: THOMAS, SOUMYA
Art Unit: 2664
Tech Center: 2600 — Communications
Assignee: Hall Matthew
OA Round: 1 (Non-Final)
Grant Probability: 100% (Favorable)
OA Rounds: 1-2
To Grant: 2y 9m
With Interview: 99%

Examiner Intelligence

Career Allow Rate: 100% (above average; 2 granted / 2 resolved; +38.0% vs TC avg)
Interview Lift: +0.0% (minimal lift; based on resolved cases with interview)
Avg Prosecution: 2y 9m (typical timeline)
Career History: 19 total applications across all art units; 17 currently pending

Statute-Specific Performance

§101: 6.8% (-33.2% vs TC avg)
§103: 64.4% (+24.4% vs TC avg)
§102: 13.6% (-26.4% vs TC avg)
§112: 11.9% (-28.1% vs TC avg)
Compared against the Tech Center average estimate • Based on career data from 2 resolved cases

Office Action

§103, §112, Double Patenting
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Priority

Applicant is reminded of the following requirement: To claim the benefit of a prior-filed application, a continuation or divisional application (other than a continued prosecution application filed under 37 CFR 1.53(d)) must include a specific reference to the prior-filed application in compliance with 37 CFR 1.78. If the application was filed before September 16, 2012, the specific reference must be included in the first sentence(s) of the specification following the title or in an application data sheet; if the application was filed on or after September 16, 2012, the specific reference must be included in an application data sheet. For benefit claims under 35 U.S.C. 120, 121, 365(c), or 386(c), the reference must include the relationship (i.e., continuation, divisional, or continuation-in-part) of the applications. The presentation of a benefit claim may result in an additional fee under 37 CFR 1.17(w)(1) or (2) being required, if the earliest filing date for which benefit is claimed under 35 U.S.C. 120, 121, 365(c), or 386(c) and § 1.78(d) in the application is more than six years before the actual filing date of the application.

Applicant has listed Application No. 18/231,821 as the parent of the instant application in the Application Data Sheet (ADS). Applicant has also listed the same number in the Specification. However, this appears to be a typo. The correct parent application appears to be Application No. 18/213,821, which corresponds to US Patent No. 11,900,678. A corrected and properly signed ADS is required.

Specification

The disclosure is objected to because of the following informalities:
On page 1, under “RELATIONSHIP TO OTHER APPLICATIONS”, “application No. 18/231,821” should read “application No. 18/213,821”.
On page 12, in the fourth paragraph under “Exemplary embodiments include the following.”, “calculous” should read “calculus”.
On page 20, under “Object Detection & Tracking”, “Subsequently” should read “subsequently”.
On page 22, under “An embodiment of the tracking system”, “withing” should read “within”.
On page 23, in the second paragraph under “Pose Detection”, “media pipe” should read “MediaPipe”.
On page 25, remove the last two sentences of paragraph 1 (“I don’t want to intro….”).
Appropriate correction is required.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claims 1, 4, 10 and 11 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention.

Claim 1 recites the limitation "the relative location" in line 1 of page 1. There is insufficient antecedent basis for this limitation in the claim.
Claim 1 recites the limitation "the location of the target object" in line 5 of page 1. There is insufficient antecedent basis for this limitation in the claim.
Claim 1 recites the limitation "the target vertex" in line 6 of page 1. There is insufficient antecedent basis for this limitation in the claim.
Claim 1 recites the limitation "the field of view" in line 11 of page 1. There is insufficient antecedent basis for this limitation in the claim.
Claim 1 recites the limitation "the path of travel" in line 28 of page 1. There is insufficient antecedent basis for this limitation in the claim.
Claim 1 recites the limitation "the classification category data" in line 3 of page 2. There is insufficient antecedent basis for this limitation in the claim.
Claim 4 recites the limitation "the presence" in line 18 of page 2. There is insufficient antecedent basis for this limitation in the claim.
Claim 10 recites the limitation "the entry and exit points" in line 21 of page 3. There is insufficient antecedent basis for this limitation in the claim.
Claim 11 recites the limitation "the motion" in line 30 of page 3. There is insufficient antecedent basis for this limitation in the claim.

Double Patenting

The nonstatutory double patenting rejection is based on a judicially created doctrine grounded in public policy (a policy reflected in the statute) so as to prevent the unjustified or improper timewise extension of the “right to exclude” granted by a patent and to prevent possible harassment by multiple assignees. A nonstatutory double patenting rejection is appropriate where the conflicting claims are not identical, but at least one examined application claim is not patentably distinct from the reference claim(s) because the examined application claim is either anticipated by, or would have been obvious over, the reference claim(s). See, e.g., In re Berg, 140 F.3d 1428, 46 USPQ2d 1226 (Fed. Cir. 1998); In re Goodman, 11 F.3d 1046, 29 USPQ2d 2010 (Fed. Cir. 1993); In re Longi, 759 F.2d 887, 225 USPQ 645 (Fed. Cir. 1985); In re Van Ornum, 686 F.2d 937, 214 USPQ 761 (CCPA 1982); In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970); In re Thorington, 418 F.2d 528, 163 USPQ 644 (CCPA 1969).
A timely filed terminal disclaimer in compliance with 37 CFR 1.321(c) or 1.321(d) may be used to overcome an actual or provisional rejection based on nonstatutory double patenting provided the reference application or patent either is shown to be commonly owned with the examined application, or claims an invention made as a result of activities undertaken within the scope of a joint research agreement. See MPEP § 717.02 for applications subject to examination under the first inventor to file provisions of the AIA as explained in MPEP § 2159. See MPEP § 2146 et seq. for applications not subject to examination under the first inventor to file provisions of the AIA. A terminal disclaimer must be signed in compliance with 37 CFR 1.321(b).

The filing of a terminal disclaimer by itself is not a complete reply to a nonstatutory double patenting (NSDP) rejection. A complete reply requires that the terminal disclaimer be accompanied by a reply requesting reconsideration of the prior Office action. Even where the NSDP rejection is provisional, the reply must be complete. See MPEP § 804, subsection I.B.1. For a reply to a non-final Office action, see 37 CFR 1.111(a). For a reply to a final Office action, see 37 CFR 1.113(c). A request for reconsideration, while not provided for in 37 CFR 1.113(c), may be filed after final for consideration. See MPEP §§ 706.07(e) and 714.13.

The USPTO Internet website contains terminal disclaimer forms which may be used. Please visit www.uspto.gov/patent/patents-forms. The actual filing date of the application in which the form is filed determines what form (e.g., PTO/SB/25, PTO/SB/26, PTO/AIA/25, or PTO/AIA/26) should be used. A web-based eTerminal Disclaimer may be filled out completely online using web-screens. An eTerminal Disclaimer that meets all requirements is auto-processed and approved immediately upon submission.
For more information about eTerminal Disclaimers, refer to www.uspto.gov/patents/apply/applying-online/eterminal-disclaimer.

Claim 1 is rejected on the ground of nonstatutory double patenting as being unpatentable over Claim 1 of U.S. Patent No. 11,900,678 in view of Spivak (US PG Pub No 2021/0245971).

The conflicting claims are not identical. Patent Claim 1 is directed towards “identification of various pitch-types of a ball pitched by a pitcher in a game of baseball”, while the instant application is directed towards the “automatic calculation of the relative location of a first object relative to a target object”. Patent Claim 1 also requires “a baseball field having a home plate, a pitcher's mound, a first base, a second base, and a third base”. However, the instant application is merely broadening the scope of Patent Claim 1. The “first object” disclosed in the instant application is the “ball” disclosed in Patent Claim 1. The “target object” disclosed in the instant application is the “home plate” disclosed in Patent Claim 1. The “classification category data” disclosed in the instant application is the “pitch type data”.

Patent Claim 1 requires “a primary camera is positioned above and within view of the home plate, and a side camera is positioned to the left or right of the home plate wherein each camera captures a continuous video image; wherein the field of view of the primary camera is at least 120 degrees and includes the positions of a batter (while on the home plate) and a catcher and a pitcher”. In contrast, the instant application requires “a first camera and a second camera, positioned in relation to the target object such that the location of the target object defines the target vertex of a triangle and the first and second cameras define the other two vertices of the triangle, such that an internal angle at the target vertex of the triangle is between 45 degrees and 135 degrees”.
However, these limitations are effectively equivalent. By positioning a camera within view of home plate and a secondary camera to the left or right, a triangle is formed between home plate and the two cameras. Additionally, the angle formed at the home plate vertex may be between 45 degrees and 135 degrees. Any angles outside of this range would fail to include the positions of the batter, catcher, and pitcher, as taught in Claim 1. Furthermore, Spivak teaches a similar camera formation (see Fig 1A, shown below, where the target object is a batter located by ‘homeplate’ 104, a triangle is formed by the ‘homeplate’ and cameras 160A and 160B, and the target vertex at ‘homeplate’ 104 has an angle of less than 135 degrees. Also see paragraph [0059], “The video images captured by of each of the cameras 160A, 160B and 160C preferably also include the pitcher's mound 132, so that the cameras are capable of being used to capture video images of a baseball as it travels from the pitcher's mound 132 to home plate 104. More generally, the video images captured by each of the cameras 160A, 160B and 160C preferably include the baseball as it traveling towards home plate, and the baseball bat that is selectively swung at the baseball by a player”).

[Image: media_image1.png (Spivak Fig. 1A)]

Additionally, Patent Claim 1 teaches the additional limitation of “wherein the home plate space is a 3-dimensional shape extrapolated from an NxN 2-dimensional grid centered on the home plate”. However, the instant application teaches these limitations in Claim 10. The instant application teaches the additional limitation of “and wherein the one or more tracking backbone programs detects the first object at a time T=1 and calculates the speed and direction of the first object, and predicts the path of travel of the first object relative to the target object, and thereby predicts a position of the first object relative to the target object at a time T=2”.
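(As a technical aside: the 45-to-135-degree vertex condition at issue here can be checked directly from camera coordinates with the law of cosines. The sketch below uses purely hypothetical coordinates chosen for illustration; they are not positions from the record or from Spivak.)

```python
import math

def vertex_angle(target, cam1, cam2):
    """Internal angle (degrees) at `target` of the triangle formed by
    the target object and the two camera positions (2D coordinates)."""
    ax, ay = cam1[0] - target[0], cam1[1] - target[1]
    bx, by = cam2[0] - target[0], cam2[1] - target[1]
    cos_t = (ax * bx + ay * by) / (math.hypot(ax, ay) * math.hypot(bx, by))
    # Clamp to guard against floating-point drift before acos.
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_t))))

# Hypothetical layout: home plate at the origin, one camera behind the
# plate and one down the third-base side (illustrative numbers only).
angle = vertex_angle((0, 0), (0, -20), (-25, 10))
print(45 <= angle <= 135)  # prints True for this illustrative layout
```

Whether a given installation satisfies the claimed range is then a single geometric test, not a property of baseball fields as such.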
However, Spivak teaches that an object can be detected at a given time (see paragraph [0117], “So, it can be arranged for t==0 when the first sample (image) is taken”, and see paragraph [0120], “Thus, the solving involves solving for the matrix A, which includes the coefficients of the equations of motion, based on matrices S0 . . . SN which include the determined object positions in the images”, where the position of the ball is determined by solving matrix A). Spivak further teaches that the speed and direction can be calculated (see equations 2 and 3), and that, using these equations, a path of travel of a ball can be predicted (see paragraph [0129], “Here, a baseball path 1600 is indicated in the wx-wy plane. Lines of position 1622 and 1624 extend from camera 160A at time points tA0 and tA1, respectively, while a line of position 1632 extends from camera 160C at tC0. Baseball positions, which are not known, are indicated by the circles at tA0 and tA1 and by the square at tC0.”, and see paragraph [0130], “A calculated baseball position 1610, indicated by a triangle, can be determined as a position which meets a distance criteria with respect to the lines of position 1622, 1624 and 1632.”, and see predicted baseball path 1600, with a calculated position at 1610). Thus, a position of a ball at any time t can be determined, and a position of a ball can be predicted at a later time.

Both Spivak and the instant application are from the analogous field of tracking moving objects. Thus, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the location calculation taught by Spivak with the system claimed by Hall in Patent 11,900,678 in order to obtain the invention as claimed in Claim 1.
The motivation for doing so would be to accurately determine the position of the first object, as taught by Spivak in paragraph [0106], “At step 1320, equations of motion of the tracked object (e.g., baseball) are obtained. Equations of motion express the 3D location of the object as a function of time. The equations of motion should be sufficiently accurate over the course of the measured trajectory.” Thus, it would have been obvious to combine the trajectory tracking and camera setup taught by Spivak with the previous Patent in order to obtain the invention as claimed in Claim 1.

The instant application also recites the additional limitation where the classification data is transmitted “to a computer programmed to select a response based on the classification of the first object”. However, since this limitation is preceded by an ‘or’, this limitation carries no patentable weight.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains.
Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

This application currently names joint inventors. In considering patentability of the claims, the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.

Claim 1 is rejected under 35 U.S.C. 103 as being unpatentable over Spivak (US Pub No 2024/0245971) in view of Yerli (US Pub No 2019/0321683).

As to Claim 1, Spivak teaches a computerized system (see Fig. 1, Processing Facility 164) for automatic calculation of the relative location of a first object relative to a target object, wherein at least the first object is in motion (see paragraph [0006], “The method also includes autonomously tracking locations of the ball traveling towards the batter, using computer vision, based on the video images of the ball traveling towards the batter; and autonomously tracking locations of the bat being held by the batter as the ball travels towards the batter”, where the first object is the ball, and the target object is the batter); the system comprising a first camera and a second camera (see Fig 1A, cameras 160A and 160C), positioned in relation to the target object such that the location of the target object defines the target vertex of a triangle and the first and second cameras define the other two vertices of the triangle, such that an internal angle at the target vertex of the triangle is between 45 degrees and 135 degrees (see Fig 1A, shown below, where the target object is a batter located by ‘homeplate’ 104, a triangle is formed by the homeplate and cameras 160A and 160B, and the target vertex at ‘homeplate’ 104 has an angle of less than 135 degrees. Also see paragraph [0059], “The video images captured by of each of the cameras 160A, 160B and 160C preferably also include the pitcher's mound 132, so that the cameras are capable of being used to capture video images of a baseball as it travels from the pitcher's mound 132 to home plate 104. More generally, the video images captured by each of the cameras 160A, 160B and 160C preferably include the baseball as it traveling towards home plate, and the baseball bat that is selectively swung at the baseball by a player”).
[Image: media_image1.png (Spivak Fig. 1A)]

wherein the first camera is positioned within view of the target object, and a side camera is positioned to the left or right of the target object (see Fig 1A, camera 160B in view of Homeplate 104, which is the target object, and see cameras 160A and 160C on the left and right side of the target object), wherein each camera captures a continuous video image (see paragraph [0107], “At a minimum there are two cameras. At step 1410, the cameras capturing images of the moving baseball at different points in time.”), and the first camera includes the target object and the first object once it enters the field of view and the side camera also includes the target object and the first object once it enters the field of view (see paragraph [0079], “Step 406 involves receiving video images, captured using at least two different cameras having different positions, of a ball traveling towards the batter for which the strike zone was determined. Step 408 involves receiving video images, captured using at least two different cameras having different positions, of a bat being held by the batter as the ball travels towards the batter.”, where the ball is the first object and the target object is the batter); wherein both cameras are in functional communication with, and transmit video data to, a computer (see paragraph [0078], “The cameras 160A, 160B and 160C used for tracking the ball and/or the bat can communicate video to Vertical Interval Time Code (VITC) inserters 310A, 310B and 310C, which can be individually referred to as a VITC inserter 310, or collectively as VITC inserters 310. The video from each VITC inserter 310 is sent to a respective tracking computer 314A, 314B and 314C, which can be individually referred to as a tracking computer 314, or collectively as tracking computers 314.
The tracking computers 314 are connected to each other and to the Strike Zone computer”), and transmit video data to a computer programmed with one or more mapping and tracking backbone programs and an AI algorithm (see Fig. 3, tracking computers 314A-C, and see paragraph [0014], “Similarly, an arbitrary point (x, y, z, a) in homogenous coordinates can be mapped back to a 3D point by dividing the first three terms by the fourth (a) term: (x, y, z, a)→(x/a, y/a, z/a)”, and see paragraph [0080], “Still referring to FIG. 4, step 410 involves autonomously tracking locations of the ball traveling towards the batter, using computer vision”, where computer vision is a subset of AI); receiving data from the backbone program, which is trained to classify the first object using the data provided by the one or more backbone programs (see paragraph [0080], “Step 418 involves autonomously determining whether a “strike” or a “ball” occurred, based on the determination of whether at least one location of the ball intersected with the strike zone, and/or the determination of whether the batter made a genuine attempt to swing at the ball”, where the first object is the ball, and the ball is classified as a ‘strike’ or ‘ball’); wherein one or more mapping backbone programs defines and maps 3-dimensional space in real-time (see paragraph [0076], “The tracking system can be used to track the 3D positions a pitched baseball, a bat and/or a strike zone”), centered on the target object (see paragraph [0010], “the method further includes autonomously determining a trajectory of the ball in 3D space as the ball travels towards the batter”, where the batter is the target object), and performs object detection (see paragraph [0108], “For example, as discussed above in connection with FIG. 1C, in a captured image, a location of the detected baseball or other object (e.g., 193 in FIG. 1C) in the image is identified by the pixel coordinates (sx, sy), where sx denotes a horizontal position in the image and sy denotes a vertical position in the image. The baseball can be detected in the image in different ways”), monitors and analyses the video images (see paragraph [0077], “one or more of the cameras 160 provide video for broadcast or at least for recording video of the game for viewing and/or analyzing”), and post-processes the video data from the cameras (see paragraph [0060], “A processing facility 164 receives and processes frames of video images from the cameras 160”), and feeds it into the AI algorithm (see paragraph [0080], “Still referring to FIG. 4, step 410 involves autonomously tracking locations of the ball traveling towards the batter, using computer vision”, where computer vision is a subset of AI), and produces data containing such information, and feeds said data to the AI algorithm which is trained to classify the first object into various categories (see paragraph [0066], “Step 418 involves autonomously determining whether a “strike” or a “ball” occurred, based on the determination of whether at least one location of the ball intersected with the strike zone, and/or the determination of whether the batter made a genuine attempt to swing at the ball”), and wherein one or more tracking backbone programs detects the first object at a time T=1 (see paragraph [0117], “So, it can be arranged for t==0 when the first sample (image) is taken”, and see paragraph [0120], “Thus, the solving involves solving for the matrix A, which includes the coefficients of the equations of motion, based on matrices S0 . . . SN which include the determined object positions in the images”) and calculates the speed and direction of the first object (see paragraph [0106], “The nine parameters x0, y0, z0, vx0, vy0, vz0, ax, ay and az, are coefficients of the equations of motion.
Coefficients x0, y0, z0 denote the position of the object at time t=0, coefficients vx0, vy0, vz0 denote the velocity of the object in the three orthogonal directions at time t=0, and coefficients ax, ay, az denote the acceleration of the object in three orthogonal directions”, where velocity is equivalent to the speed and direction of the object, and these equations can be used for any time t) and predicts the path of travel of the first object relative to the target object and thereby predicts a position of the first object relative to the target object at a time T=2 (see paragraph [0129], “Here, a baseball path 1600 is indicated in the wx-wy plane. Lines of position 1622 and 1624 extend from camera 160A at time points tA0 and tA1, respectively, while a line of position 1632 extends from camera 160C at tC0. Baseball positions, which are not known, are indicated by the circles at tA0 and tA1 and by the square at tC0”, and see paragraph [0130], “A calculated baseball position 1610, indicated by a triangle, can be determined as a position which meets a distance criteria with respect to the lines of position 1622, 1624 and 1632.”, and see predicted baseball path 1600, with a calculated position at 1610, thus showing that position can be calculated at any time, such as T=2), and produces data containing such information (see paragraph [0157], “In accordance, with certain embodiments, an image may be enhanced to depict the path of a baseball as it travels towards a batter”), and wherein the classification category data is transmitted from the computer to a screen viewable by a viewer or to a computer programmed to select a response based on the classification of the first object (see paragraph [0066], “The user interface 176 is one example of an output device that can be used to provide autonomous indications of whether a “strike” or a “ball” occurred, a ball was hit fair or foul, whether a foul tip occurred, and/or the like”).
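(As a technical aside: constant-acceleration equations of motion of the kind quoted above can be illustrated by fitting observed positions to p(t) = p0 + v0·t + 0.5·a·t² per axis and evaluating the fit at a later time such as T=2. The least-squares setup and the synthetic track below are illustrative assumptions, not Spivak's actual matrices or data.)

```python
import numpy as np

def fit_motion(times, positions):
    """Least-squares fit of constant-acceleration equations of motion
    p(t) = p0 + v0*t + 0.5*a*t**2, fitted independently per axis.
    Returns a (3, d) array whose rows are p0, v0, a."""
    t = np.asarray(times, dtype=float)
    # Design matrix: one row per observation, columns [1, t, t^2/2].
    A = np.column_stack([np.ones_like(t), t, 0.5 * t**2])
    coef, *_ = np.linalg.lstsq(A, np.asarray(positions, dtype=float), rcond=None)
    return coef

def predict(coef, t):
    """Evaluate the fitted equations of motion at time t."""
    p0, v0, a = coef
    return p0 + v0 * t + 0.5 * a * t**2

# Synthetic track sampled over t in [0, 1]: a "ball" with known initial
# position, velocity, and gravity-like acceleration on the z axis.
ts = np.linspace(0.0, 1.0, 6)
obs = np.array([[1 + 40*t, 2 - 3*t, 1.8 + 5*t - 4.9*t**2] for t in ts])
coef = fit_motion(ts, obs)
print(np.round(predict(coef, 2.0), 2))  # extrapolated position at T=2
```

Because the synthetic observations are exactly quadratic in t, the fit recovers the generating coefficients and the T=2 extrapolation follows directly; with noisy detections the same least-squares step yields a smoothed trajectory instead.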
Spivak fails to teach that the first camera and the second camera each have a field of view of at least 120 degrees. However, Yerli teaches a system with multiple image capturing devices which each have a field of view of 120 degrees (see paragraph [0056], “The video-recording cameras may capture video footage at a high rate of at least 100 FPS and covering at least 120 degrees of field view”). Yerli is combinable with Spivak since both are from the analogous field of using cameras to detect sports equipment. Thus, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Yerli with Spivak. The motivation for doing so would be to better capture the players by using a greater field of view. Yerli teaches in paragraph [0009], “The stadium camera system includes a plurality of action-focused cameras that, based on the received directional data sent by the data processing server, auto-compensate and regulate rotation, focus, and zoom of cameras in order to generate footage data comprising uniform action coverage covering 360 degrees field of view around a spherical focus zone determined by the location of the sports ball, individual players, or combinations thereof.” Thus, it would have been obvious to combine the teachings of Yerli with the teachings of Spivak in order to obtain the invention as claimed in Claim 1.

As to Claim 8, Spivak in view of Yerli teaches the system of Claim 1 adapted to a sports game involving a ball in motion, wherein the first object is a ball in motion (see abstract of Spivak, “Methods and systems for use in automating or assisting umpiring of a baseball or softball game are described herein”).
As to Claim 9, Spivak in view of Yerli teaches the system of Claim 8 wherein the game is baseball or softball or rounders or cricket, and the target object is a player (see abstract, “Methods and systems for use in automating or assisting umpiring of a baseball or softball game are described herein”, and see paragraph [0006], “The method also includes autonomously tracking locations of the ball traveling towards the batter, using computer vision, based on the video images of the ball traveling towards the batter”, where the target object is the batter).

Claim 2 is rejected under 35 U.S.C. 103 as being unpatentable over Spivak (US Pub No 2024/0245971) in view of Yerli (US Pub No 2019/0321683), and further in view of Lu (US 2024/0119625), hereinafter Lu.

As to Claim 2, Spivak in view of Yerli teaches the computerized system of Claim 1, wherein the one or more mapping backbone programs employs a bounding box (see paragraph [0088], “Prior to the game, an operator can indicate where in the video the ball is expected to be during a pitch. The tracking computer can look in those areas for a cluster of pixels that are in the YUV color space of the ball's color. Pre-set variables can define the minimum and maximum sizes of a cluster in numbers of pixels, as well as acceptable shapes for the cluster's bounding box”). Spivak fails to teach a confidence prediction class probability map.
However, Lu teaches a system for tracking the trajectory of a player in a sports field (see paragraph [0035], “To resolve the issues mentioned above, the disclosed method and system automate the ball carrier detection and tracking by using a convolutional neural network (CNN) based algorithm to detect and track a ball carrier”), and using a probability map to determine a confidence value in order to determine the position of a ball with respect to a player (see paragraph [0053], “The probability map 704 is used to determine which player has the highest confidence values over multiple frames, and is declared the ball carrier on the output 708 image and for a specific time frame (specific interval of frames). The ball carrier is circled on the output image 708.”). Lu is combinable with Spivak and Yerli since all are from the analogous field of image analysis in sports applications. Thus, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the probability prediction taught by Lu with the teachings of Spivak and Yerli. The motivation for doing so would be the ability to determine the position of the ball even if the ball is obstructed by a player. Lu teaches in paragraph [0001], “However, automatic image processing applications with object recognition and object tracking usually cannot adequately track a ball during action on a field of play of a team sport since the ball is often too difficult to see. Thus, conventional tracking applications often do not adequately determine which player is carrying the ball.” Thus, it would have been obvious to combine the confidence and probability map taught by Lu with the teachings of Spivak and Yerli in order to obtain the invention as claimed in Claim 2.

Claims 3 and 4 are rejected under 35 U.S.C. 103 as being unpatentable over Spivak (US Pub No 2024/0245971) in view of Yerli (US Pub No 2019/0321683), and further in view of Tuxen et al.
(US Pub No 2018/0120428), hereinafter Tuxen. As to Claim 3, Spivak teaches that radar can be used to detect an object (see paragraph [0109], “Other various techniques for analyzing images to detect baseballs which will be apparent to those skilled in the art may be used. For example, various pattern recognition techniques can be used. Radar, infra-red and other technologies can also be used”). However, Spivak fails to explicitly teach wherein the one or more tracking backbone programs is in functional communication with a Doppler radar system positioned at the location of the first camera, wherein said Doppler radar system provides real-time velocity information to said one or more tracking backbone programs. However, Tuxen discloses a tracking device (see Fig. 1, tracking device 102), which can include a camera and a Doppler radar device (see paragraph [0003], “The tracking device performs the tracking contactless and preferably without any modifications to the ball. The tracking device may be a Doppler radar, lidar or camera tracking device, or any combination thereof”), and that the tracking device can provide real-time velocity information to a processor (see paragraph [0004], “the processor receives from the first tracking device a signal and calculates from this signal object data including position and velocity values for the object identified”). Tuxen is combinable with Spivak and Yerli since all three are from the analogous field of image analysis for tracking sport accessories. Thus, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the radar system taught by Tuxen with the teachings of Spivak and Yerli. The motivation for doing so would have been to be able to accurately track an object without having to make modifications to the object itself.
Tuxen teaches in paragraph [0001], “Other systems have required special markings on objects to compare a spin rate of a ball to its linear speed to identify a time at which the ball begins to truly roll. However, these systems work only with specialized balls and are unsuitable for situations in which players use balls not including such specialized markings.” Tuxen further teaches in paragraph [0003], “The tracking device performs the tracking contactless and preferably without any modifications to the ball”, where the ball is the object. Thus, it would have been obvious to one of ordinary skill in the art to combine the teachings of Tuxen with the teachings of Spivak and Yerli in order to obtain the invention as claimed in Claim 3. As to Claim 4, Spivak in view of Yerli fails to teach that the computerized system is alerted to the presence of the first object from the Doppler radar system, which sends an alert when it detects an object traveling at a velocity above a set threshold velocity. However, Tuxen teaches a tracking device which can be alerted to the presence of a discontinuity of a first object (see paragraph [0035], “In step 420, discontinuity detection is implemented to detect any discontinuity in the velocity and/or the derivative thereof (acceleration). In an exemplary embodiment, the discontinuity may be detected by the processor for any time at which a change in velocity or derivative jump from one data point to an adjacent data point is greater than a set threshold. For example, an exemplary threshold value for velocity discontinuity may be a change of 0.05 m/s while and exemplary threshold value for acceleration discontinuity may be a change of 0.2 m/s2.”). Thus, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the radar detection system taught by Tuxen with the teachings of Spivak and Yerli.
The motivation for doing so would have been to ensure that only the velocity of the first object is collected. Tuxen teaches in paragraph [0035], “That is, a segment is defined as a time interval during which the velocity is continuous and has a continuous derivative. As would be understood by those skilled in the art, moving items which do not follow the pattern associated with movement of a putted ball (i.e. golf club, golfer, leaves, birds) are detected and eliminated from analysis”. Thus, by analyzing for discontinuities, only segments of continuous velocity are captured, which are more likely to contain the ball or first object that needs to be tracked. Thus, it would have been obvious to one of ordinary skill in the art to combine the teachings of Tuxen with the teachings of Spivak and Yerli in order to obtain the invention as claimed in Claim 4. Claims 5 and 7 are rejected under 35 U.S.C. 103 as being unpatentable over Spivak (US Pub No 2024/0245971) in view of Yerli (US Pub No 2019/0321683), and further in view of Moraites et al. (US Pub No 2015/0377709), hereinafter Moraites. As to Claim 5, Spivak in view of Yerli teaches the computerized system of claim 1 wherein the classification category data is transmitted from the computer to a screen viewable by a viewer (see Spivak, paragraph [0066]). However, Spivak in view of Yerli fails to teach that the classification category data is ranked by threat level and is presented visually. However, Moraites teaches a computerized system for tracking the trajectory of hostile fire, and that the threat level can be displayed to the user (see paragraph [0072], “FIG. 8 shows an example of a display 800 providing hostile fire hit probability information. The display 800 can have indicia associated with the hit/likely hit location on the vehicle and also probability data, such as hit probability percentage”). Moraites is combinable with Spivak since both are from the analogous field of tracking object trajectory.
Thus, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Moraites with the teachings of Spivak and Yerli. The motivation for doing so would have been to better prepare crews for the need for maintenance with respect to the target object. Moraites teaches in paragraph [0072], “Thus, statistical or probabilistic data of a hit or likely hit can be used to generate an indication for air or maintenance crews that the vehicle was hit and an approximation or estimation of where the vehicle was hit or likely hit. Again, such data can be used by air and maintenance crews for damage assessment, inspection, and repair.” Thus, it would have been obvious to combine the threat detection system taught by Moraites with the teachings of Spivak and Yerli. As to Claim 7, Spivak in view of Yerli teaches transmitting data to a computer programmed with one or more backbone programs and an AI algorithm, and receiving data from the backbone programs. However, Spivak in view of Yerli fails to teach further receiving real-time input data from a satellite, drone, or aircraft in functional communication with the computer.
However, Moraites teaches receiving input from an aircraft (see paragraph [0003], “The location corresponding to the electronic location indication can be representative of a location of an airborne vehicle, for instance, a helicopter”, and paragraph [0004], “Additionally, one or more embodiments can include a system operative on an airborne vehicle, for instance a helicopter, that is operative during the day and at night to determine whether the vehicle”), and that this data can be transmitted to a processor, which can be used to determine hit or probable hit locations (see paragraph [0027], “Optionally, the instructions, when executed by the processor, can cause the processor to perform operations comprising: responsive to a retrieval request, output the stored data corresponding to any determined hit or probable hit locations for later retrieval and analysis, the stored data being transformed so as to provide on a display device a visual diagrammatic representation of the flying vehicle and any determined hit locations or probably hit locations to the flying vehicle”). Thus, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the input taught by Moraites with the teachings of Spivak and Yerli in order to obtain the invention as claimed in Claim 7. The motivation for doing so would have been to accurately determine if an aircraft was hit, which would allow for ground crew to prepare for repair. Moraites teaches in paragraph [0072], “Thus, statistical or probabilistic data of a hit or likely hit can be used to generate an indication for air or maintenance crews that the vehicle was hit and an approximation or estimation of where the vehicle was hit or likely hit.
Again, such data can be used by air and maintenance crews for damage assessment, inspection, and repair.” Thus, it would have been obvious to combine the threat detection system taught by Moraites with the teachings of Spivak and Yerli. Claim 6 is rejected under 35 U.S.C. 103 as being unpatentable over Spivak (US Pub No 2024/0245971) in view of Yerli (US Pub No 2019/0321683), further in view of Moraites et al. (US Pub No 2015/0377709), hereinafter Moraites, and further in view of Allen et al. (US Pat No 9,384,645), hereinafter Allen. As to Claim 6, Spivak in view of Yerli and Moraites teaches the screen viewable by a viewer further displays information including a probability of impact and the first object relative to the target object (see paragraph [0072], “FIG. 8 shows an example of a display 800 providing hostile fire hit probability information. The display 800 can have indicia associated with the hit/likely hit location on the vehicle and also probability data, such as hit probability percentage”). Moraites fails to explicitly teach that a ‘time of impact’ is given. However, Allen teaches an optical system (see Col. 5, lines 31-34, “In some embodiments, one or more remote sensors 30 (e.g., remote cameras, etc.) are further utilized to acquire data regarding area 20.”) which can determine a time of impact of an object with respect to another object, and display this information (see Col. 4, lines 8-14, “The warning may be generated based on various data regarding the user, other users, a surrounding area, etc., and may be provided so as to provide an indication of a distance to a potential impact, a time until a potential impact, a direction toward a potential impact, a velocity of an impacting object (e.g., another player, the ground, etc.), and the like”). Allen is combinable with Spivak and Yerli since all three are in the same field of image analysis in sports.
Furthermore, Allen is combinable with Moraites since both are from the analogous field of determining the velocity of objects. Thus, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Allen with the teachings of Spivak, Yerli, and Moraites. The motivation for doing so would have been to provide warning to users of an impending impact. Allen teaches in Col. X, lines X-X, “However, players are not always aware of impending impacts with other players, the ground or a wall, a ball, etc., due to limitations of field of vision, player distractions, etc. The systems disclosed herein in accordance with various embodiments provide players with advance warning (e.g., audible, haptic, visual, etc.) regarding potential impacts involving the user.” Thus, it would have been obvious to combine the time-to-impact warning taught by Allen with the teachings of Spivak, Yerli, and Moraites in order to obtain the invention as claimed in Claim 6. Claim 10 is rejected under 35 U.S.C. 103 as being unpatentable over Spivak (US Pub No 2024/0245971) in view of Yerli (US Pub No 2019/0321683), and further in view of Shin et al. (US Pub No 2020/0238153), hereinafter Shin. As to Claim 10, Spivak in view of Yerli teaches the system of claim 9, wherein the game is baseball on a baseball field (see Spivak, Fig.
1, baseball field 102), having a home plate, a pitcher’s mound, a first base, a second base, and a third base (see Spivak, Fig 1, home plate 104, pitcher’s mound 132, first base 112, second base 118, third base 122), wherein one or more backbone programs define and map a 3-dimensional space (see Spivak, paragraph [0076], “The tracking system can be used to track the 3D positions a pitched baseball, a bat and/or a strike zone”), wherein a home plate space is a 3-dimensional shape extrapolated from an NxN 2-dimensional grid centered on the home plate (see Spivak, Fig 2, K Zone 210, which is centered over a home plate, and see paragraph [0075], “As can be seen from FIG. 2, the left image shows a batter at home plate 104. The image on the right shows a batter at home plate 104 with a strike zone graphic 210 added by a system. In some (but not all) instances, the graphic will include cross hairs and a solid circle to indicate the location where the baseball intersected the front plane of the strike zone.
In accordance with certain embodiments, as different cameras are used and/or panned/tilted, the 3D strike zone can be seen from different perspectives”), and performs object detection (see Spivak, paragraph [0108]), monitors and analyses the video images (see Spivak, paragraph [0077]), and post-processes the video data from the cameras (see Spivak, paragraph [0060]) and feeds it into the AI algorithm, and wherein the one or more backbone programs detects and maps the ball in real-time (see Spivak, paragraph [0076]), and calculates the speed of the ball (see Spivak, paragraph [0106]) and feeds said data to the AI algorithm (see Spivak, paragraph [0080]) which is trained to classify a ball into various pitch types using the data provided by the one or more backbone programs (see Spivak, paragraph [0066]), and wherein the relative location of the ball in motion relative to the player is transmitted from the computer to a screen viewable by a viewer (see Spivak, paragraph [0066]). Spivak in view of Yerli fails to explicitly teach calculating and/or predicting the path of travel of the ball relative to the home plate space and predicting the entry and exit points of the ball relative to the home plate space. However, Shin teaches that the entry and exit of a ball can be predicted with respect to a three-dimensional area over a home plate (see paragraph [0067], “Referring to FIG. 6, the strike zone S may be a space above the home plate according to baseball regulations.
Accordingly, the display 10 may display the three-dimensional strike zone S corresponding to the space above the home plate, and the pitched ball trajectory prediction result B representing a three-dimensional pitched ball trajectory therefor may be provided in a complex manner such that positions of the ball entering and exiting the strike zone S may be grasped, and thereby, a referee may be helped to make an accurate determination.”). Shin is combinable with Spivak and Yerli since all three are from the analogous art of image analysis and tracking sport accessories. Thus, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the entry and exit calculation taught by Shin with the teachings of Spivak and Yerli. The motivation for doing so would have been to assist referees in making accurate determinations, as taught in paragraph [0067]. Thus, it would have been obvious to combine the teachings of Shin with the teachings of Spivak and Yerli in order to obtain the invention as claimed in Claim 10. Claims 11-15 are rejected under 35 U.S.C. 103 as being unpatentable over Spivak (US Pub No 2024/0245971) in view of Yerli (US Pub No 2019/0321683), further in view of Shin et al. (US Pub No 2020/0238153), and further in view of Perry et al. (US Pub No 2020/0334838), hereinafter Perry. As to Claim 11, Spivak in view of Yerli and Shin fails to teach wherein the AI algorithm is trained using information related to the motion of a player. However, Perry teaches “an automated recognition means” (see paragraph [0035]), which can be trained using information related to the motion of a player (see paragraph [0114], “Next, at step 330 the retrieved footages are applied together with retrieved ruler based biometrics of the identified player to a 3D moving model creator”, where the motion of the player is used to create the 3D moving model).
Perry is combinable with Spivak, Yerli, and Shin since all are from the analogous field of image analysis for sports and sports training. Thus, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the model taught by Perry with the teachings of Spivak, Yerli, and Shin. The motivation for doing so would have been to better track players, and use the data to help players train. Perry teaches in paragraph [0014], “A biomechanical model of the moving player can depict interrelated aspects of the player's motions and/or environmental surroundings, thus providing useful statistics regarding possible cross correlations between environment and/or positions and/or motions to corresponding game results where these can be later used for training and performance improvement purposes.” Thus, it would have been obvious to combine the teachings of Perry with the teachings of Spivak, Yerli, and Shin in order to obtain the invention as claimed in Claim 11. As to Claim 12, Spivak in view of Yerli, Shin and Perry teaches the player is a pitcher (see Perry, Fig 2, the pitcher, and see paragraph [0020], “FIG. 2 depicts a generated biomechanics-based model for a baseball pitcher which takes into account muscle masses, bone lengths and joint positionings.”). As to Claim 13, Spivak in view of Yerli and Shin fails to explicitly teach wherein the transmission of the video data from the one or more cameras to the computer programmed with one or more mapping and tracking backbone programs and an AI algorithm is automatically activated upon a defined motion of the pitcher. However, Perry teaches that only data corresponding to a specific motion is kept and used (see paragraph [0033], “Instead a pitch detecting mechanism such as a peak ball speed radar gun 120 is used to determine the approximate time at which the ball 111 is in a play action state, in other words it is being thrown.
A time stamp signal TRIG corresponding to a time point on a common clock is passed to each of the image storage units 131, 132 and 133 to indicate to each, how much (what segment length) of a full length of captured motion footage should be kept for each respective pitch. For example, it may be decided to keep 5 seconds worth of captured frames before the ball 111 is first detected by radar gun 120 and 2 seconds worth after. Thus, for each pitch event, only the relevant clip of motions (e.g., 7 seconds total) is kept for further use and analysis (including rendering of the current 3D biomechanical motion model) and the rest is discarded. This helps to reduce the amount of data storage needed”). As to Claim 14, Spivak in view of Yerli, Shin and Perry teaches that the defined motion of the pitcher is a wind-up prior to a pitch (see Perry, Fig 1, Moving Actor 1 is in a ‘wind up’ position, and see paragraph [0033], where the recording includes time before the ball is thrown, which would include the ‘wind up’ position of the pitcher). As to Claim 15, Spivak in view of Yerli, Shin and Perry teaches that the background may be blocked or masked (see paragraph [0039], “In one embodiment, where initial camera settings do not provide sufficient contrast between one or more focused-upon players and their respective backgrounds, optical spectral filters and/or polarizing filters may be added to the cameras to improve contrast between player and background. More specifically, in one example player uniforms may be specially coated with light polarizing fibers and/or infra-red (IR) absorbing fibers that substantially distinguish the players from natural field materials so that corresponding camera equipment can capture well contrasted images of the players as distinct from background filed imagery”, where a filter is added to ‘mask’ the background and improve contrast between the player and the background).
Conclusion Any inquiry concerning this communication or earlier communications from the examiner should be directed to SOUMYA THOMAS whose telephone number is (571)272-8639. The examiner can normally be reached M-F 8:30-5:00. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Jennifer Mehmood, can be reached at (571) 272-2976. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000. /S.T./Examiner, Art Unit 2664 /JENNIFER MEHMOOD/Supervisory Patent Examiner, Art Unit 2664

Prosecution Timeline

Feb 09, 2024
Application Filed
Jan 15, 2026
Non-Final Rejection — §103, §112, §DP (current)


Prosecution Projections

1-2
Expected OA Rounds
100%
Grant Probability
99%
With Interview (+0.0%)
2y 9m
Median Time to Grant
Low
PTA Risk
Based on 2 resolved cases by this examiner. Grant probability derived from career allow rate.
