Prosecution Insights
Last updated: April 19, 2026
Application No. 18/202,142

SYSTEM AND METHOD FOR CHARACTERIZING AND MONITORING HEALTH OF AN ANIMAL BASED ON GAIT AND POSTURAL MOVEMENTS

Final Rejection (§103)
Filed: May 25, 2023
Examiner: COUPE, ANITA YVONNE
Art Unit: 3619
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Companion Labs Inc.
OA Round: 2 (Final)
Grant Probability: 17% (At Risk)
Expected OA Rounds: 3-4
Time to Grant: 5y 9m
With Interview: 53%

Examiner Intelligence

Career Allow Rate: 17% (27 granted / 161 resolved; -35.2% vs TC avg)
Interview Lift: +36.6% (resolved cases with interview)
Avg Prosecution: 5y 9m
Career History: 162 total applications across all art units (1 currently pending)

Statute-Specific Performance

§101: 21.0% (-19.0% vs TC avg)
§103: 43.7% (+3.7% vs TC avg)
§102: 15.3% (-24.7% vs TC avg)
§112: 18.3% (-21.7% vs TC avg)
Black line = Tech Center average estimate • Based on career data from 161 resolved cases
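As a quick consistency check on the panel above (this check is mine, not from the source): each statute row lists an allow rate and a delta "vs TC avg", and if every delta is measured against the same Tech Center baseline, then rate minus delta should come out constant. A minimal sketch:

```python
# Allow rate and delta per statute, copied from the panel above.
# Assumption: every delta is taken against one shared TC baseline.
stats = {
    "101": (21.0, -19.0),
    "103": (43.7, +3.7),
    "102": (15.3, -24.7),
    "112": (18.3, -21.7),
}

# rate = baseline + delta, so baseline = rate - delta for each row.
implied_baseline = {s: round(rate - delta, 1) for s, (rate, delta) in stats.items()}
print(implied_baseline)  # all four rows imply the same 40.0% baseline
```

All four statutes back out to the same 40.0% baseline, which is consistent with the single "Tech Center average estimate" line the chart note describes.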

Office Action

§103
DETAILED ACTION

Claim Rejections - 35 USC § 103

1. The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

2. Claims 1-3 and 5 are rejected under 35 U.S.C. 103 as being unpatentable over Hanson (US Patent Publication 2016/0316716) in view of Hong (US Patent Publication 2019/0000036) and Grisel (US Patent Publication 2017/0262599).

a. Regarding claim 1, Hanson teaches a method for monitoring health of an animal during autonomous training with a training apparatus [provides automated training by displaying videos and/or projecting audio via the interface device, abstract] comprising during a first training session accessing a video feed recorded by an optical sensor defining a field of view intersecting a working field [capture video with depth information including a depth image that may include depth values [0045]] and integrated into the training apparatus configured to dispense units of a primary reinforcer via a dispenser 210 integrated into the training apparatus [device 100 may comprise a treat dispenser 210 to provide positive reinforcement [0028]]; and in the video feed, detecting the animal in the working field [track 315 the animal to determine the specific position and/or movements of the animal [0058]; device 100 may detect the specific position of the animal, for example if the animal is sitting [0015]; data captured by the cameras and device in the form of the skeletal 
model and movements associated with it may be compared to the gesture filters in the gesture recognizer engine to identify when the animal (as represented by the skeletal) model) has performed one or more gestures [0053]]. Hanson does not specifically teach a training apparatus configured to dispense units of a primary reinforcer toward a range of locations within the working field via a dispenser integrated into the training apparatus; in response to detecting the animal at a first location within the working field calculating a target location for dispensation of a first unit of a primary reinforcer based on the first location and a target pathway defining a target distance and a target orientation of the animal relative the training apparatus; and dispensing the first unit of the primary reinforcer toward the target location. Hong teaches a training apparatus configured to dispense units of a primary reinforcer toward a range of locations within the working field via dispenser 200 integrated into the training apparatus [rotation driving unit 300 rotates the feed throwing unit 200 to adjust a throwing direction of the feed [0099]; since the pet is aware of the feed supply period through the sound source; the user may train the pet [0207]]; in response to detecting the animal at a first location within the working field [camera unit 500 obtains image information of the pet [0217]] calculating a target location B1 for dispensation of a first unit of a primary reinforcer based on the first location and a target pathway defining a target distance and a target orientation of the animal relative the training apparatus [a moving distance calculation part calculating a predicted cumulative moving distance of the pet by cumulatively adding distances (L) between the predicted reach positions of the feeds that are sequentially thrown from the feed throwing unit [0023]]; and dispensing the first unit of the primary reinforcer toward target location B1 [the feed that is 
thrown by the rotary apparatus [0236]; B1 represents a predicted reach position that is calculated by predicting a final reach position of the feed in consideration of rolling of the first falling feed [0231] FIG. 26]] for the purpose of providing for autonomously training an animal with a dispenser integrated into a training apparatus to adjust a throwing direction of the primary reinforcer to the position of the pet according to an input signal from a camera connected to a control unit to train the pet when the user is absent and to improve the quantity of fun and exercise of the pet. It would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to modify the method taught by Hanson to include a training apparatus configured to dispense units of a primary reinforcer toward a range of locations within the working field via a dispenser integrated into the training apparatus; in response to detecting the animal at a first location within the working field calculating a target location for dispensation of a first unit of a primary reinforcer based on the first location and a target pathway defining a target distance and a target orientation of the animal relative the training apparatus; and dispensing the first unit of the primary reinforcer toward the target location as taught by Hong because doing so would have provided for autonomously training an animal with a dispenser integrated into a training apparatus to adjust a throwing direction of the primary reinforcer to the position of the pet according to an input signal from a camera connected to a control unit to train the pet when the user is absent and to improve the quantity of fun and exercise of the pet. 
Hanson in view of Hong does not specifically teach during a first test period, in response to detecting movement of the animal along the target pathway toward the target location in the video feed collecting a first timeseries of position data representing changes in position of a set of anatomical features of the animal during the first test period; deriving a first movement profile for the animal based on the first timeseries of position data, the first movement profile representing movement of the animal along the target pathway during the first test period; accessing a baseline movement profile defined for the animal; and characterizing health of the animal based on a difference between the baseline movement profile and the first movement profile. Grisel teaches during a first test period, in response to detecting movement of the animal along the target pathway toward the target location in the video feed collecting a first timeseries of position data representing changes in position of a set of anatomical features of the animal during the first test period [capture device 162 can be embodied as a camera, video camera, microphone, or other related device or combination thereof [0023]; the capture device 162 can be used to capture images, video, and/or audio of one or more subjects standing, walking, trotting, or running for telemedical evaluation by the computing environment 100. The data can be captured from various angles and/or fields of view over various periods of time [0024]]; deriving a first movement profile for the animal based on the first timeseries of position data, the first movement profile representing movement of the animal along the target pathway during the first test period [As shown in FIG. 
2, the subject 200, which is a horse in this case, is present in image frames 210A-210N of a video file 210 stored in the video content 121 [0048]]; accessing a baseline movement profile defined for the animal and characterizing health of the animal based on a difference between the baseline movement profile and the first movement profile [motion evaluator 146 can compare and evaluate various gait signatures, gait cycle patterns, gait distance differentials, gait cycle periods, gait cycle frequencies, etc., associated with one or more subjects. By evaluating the motion data from the motion tracker 144, the motion evaluator 146 can provide data to the evaluation module 132 for evaluation. As one example, the motion evaluator 146 can identify asymmetry in one or more gait cycle patterns of a subject, which may be an indicator of lameness [0046]; The reference data 125 can further include benchmark analytical metrics, such as benchmark posture, benchmark gait signatures, benchmark biological parameters, etc., for examination by the evaluation module 132 as references for comparison [0035]; The evaluation rules 126 can include, for example, a set of rules and/or metrics for evaluating the health of subjects to determine or diagnosis a likely source of lameness, for example, or other health aspects [0036]] for the purpose of providing for tracking motion of various subjects, including dogs, cats, and other animals walking, trotting, or running over a number of frames of video and provide an evaluation of the subject in the video to more accurately identify lameness or other health aspects in the various subjects. 
It would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to modify the method taught by Hanson in view of Hong to include during a first test period, in response to detecting movement of the animal along the target pathway toward the target location in the video feed collecting a first timeseries of position data representing changes in position of a set of anatomical features of the animal during the first test period; deriving a first movement profile for the animal based on the first timeseries of position data, the first movement profile representing movement of the animal along the target pathway during the first test period; accessing a baseline movement profile defined for the animal; and characterizing health of the animal based on a difference between the baseline movement profile and the first movement profile as taught by Grisel because doing so would have provided for tracking motion of various subjects, including dogs, cats, and other animals walking, trotting, or running over a number of frames of video and provide an evaluation of the subject in the video to more accurately identify lameness or other health aspects in the various subjects. b. 
Regarding claim 2, Hanson in view of Hong and Grisel teaches (references to Hong) the method of claim 1 further comprising, at an initial time during the first training session, dispensing an initial unit of the primary reinforcer toward a first target location within the working field via dispenser 200 occupying a first dispenser position φ1 [φ1 is a throwing azimuth of the first thrown feed [0231]] aligned to first target location B1; wherein calculating the target location based on the first location and the target pathway in response to detecting the animal at the first location comprises, in response to detecting the animal at the first location, falling within a threshold distance of the first target location, calculating a second target location B2 [A2 represents a position to which the second thrown feed falls, B2 represents a predicted reach position of the second falling feed [0232]] for dispensation of the first unit of the primary reinforcer based on the first target location and the target pathway [camera unit 500 obtains image information of the pet [0217]; control unit 400 controls the feed so that the feed is thrown according to the inputted throwing information. That is, the control unit 400 may control the throwing intensity, the throwing angle θ, and the throwing azimuth φ of the feed throwing unit 200 [0245]]; and wherein dispensing the first unit of the primary reinforcer toward the target location comprises dispensing the first unit of the primary reinforcer toward the second target location via the dispenser occupying a second dispenser position φ2 aligned to the second target location [φ2 is a throwing azimuth of the second thrown feed [0232]]. c. 
Regarding claim 3, Hanson in view of Hong and Grisel teaches (references to Grisel) the method of claim 1 wherein characterizing health of the animal based on the difference between the baseline movement profile and the first movement profile comprises characterizing the difference between the baseline movement profile and the first movement profile [motion evaluator 146 can compare and evaluate various gait signatures, gait cycle patterns, gait distance differentials, gait cycle periods, gait cycle frequencies, etc., associated with one or more subjects. By evaluating the motion data from the motion tracker 144, the motion evaluator 146 can provide data to the evaluation module 132 for evaluation. As one example, the motion evaluator 146 can identify asymmetry in one or more gait cycle patterns of a subject, which may be an indicator of lameness [0046]; The evaluation rules 126 can include, for example, a set of rules and/or metrics for evaluating the health of subjects to determine or diagnosis a likely source of lameness, for example, or other health aspects [0036]], the baseline movement profile derived from timeseries of position data collected for the set of anatomical features of the animal during a preceding time period [As shown in FIG. 2, the subject 200, which is a horse in this case, is present in image frames 210A-210N of a video file 210 stored in the video content 121 [0048]; reference data 125 can include various posture reference data and statistics, gait signatures, motion data, and/or motion metrics associated with subjects. 
In that context, gait signatures can include combinations of gait cycle patterns associated with anatomical features of subjects over time [0033]]; in response to the difference exceeding a threshold difference [reference data 125 can further include benchmark analytical metrics, such as benchmark posture, benchmark gait signatures, benchmark biological parameters, etc., for examination by the evaluation module 132 as references for comparison [0035]; The evaluation rules 126 can also include rules that determine a source, severity, and contributing factors (among other relevant characteristics) of lameness in subjects based on an analysis of the motion of anatomical features as compared with other, non-motion related benchmark values for anatomical features, such as deviations from benchmark heights or ranges of motion of subjects when standing, walking, trotting, running in stride, etc [0037]], identifying a first abnormality in the first movement profile for the animal, predicting a first causal pathway, in a set of causal pathways, for the first abnormality based on the difference and characteristics of the first movement profile [To identify potential sources, causes, or locations of lameness and other conditions, the embodiments described herein can automate lameness evaluation using a combination of various forms of telemedical data, such as videos, still images, audio recordings, and other medical data [0015]]; and characterizing a first health score for the animal based on detection of the first abnormality in the first movement profile and prediction of the first causal pathway [evaluation module 132 can determine the nature (e.g., weightbearing or non-weightbearing), severity, and/or grade of lameness or other conditions by considering the similarity to or difference from benchmark levels, for example, and provide that information as part of the evaluation of the subject. 
The grade of lameness can be expressed in a report using a numerical scale, an alphabetical scale, a color scale, or according to any other suitable means of presentation [0101]]; and further comprising selecting a first treatment protocol for the animal based on the first health score [Care can then be administered to the subject, in part, based on the results of the evaluation [0016]]. d. Regarding claim 5, Hanson in view of Hong and Grisel teaches (references to Grisel) the method of claim 1, wherein characterizing health of the animal based on the difference between the baseline movement profile and the first movement profile [motion evaluator 146 can compare and evaluate various gait signatures, gait cycle patterns, gait distance differentials, gait cycle periods, gait cycle frequencies, etc., associated with one or more subjects. By evaluating the motion data from the motion tracker 144, the motion evaluator 146 can provide data to the evaluation module 132 for evaluation. As one example, the motion evaluator 146 can identify asymmetry in one or more gait cycle patterns of a subject, which may be an indicator of lameness [0046]; The reference data 125 can further include benchmark analytical metrics, such as benchmark posture, benchmark gait signatures, benchmark biological parameters, etc., for examination by the evaluation module 132 as references for comparison [0035]; The evaluation rules 126 can include, for example, a set of rules and/or metrics for evaluating the health of subjects to determine or diagnosis a likely source of lameness, for example, or other health aspects [0036]] comprises characterizing a first health score for the animal based on the difference between the baseline movement profile and the first movement profile [evaluation module 132 can determine the nature (e.g., weightbearing or non-weightbearing), severity, and/or grade of lameness or other conditions by considering the similarity to or difference from benchmark levels, for 
example, and provide that information as part of the evaluation of the subject. The grade of lameness can be expressed in a report using a numerical scale, an alphabetical scale, a color scale, or according to any other suitable means of presentation [0101]] and in response to the first health score falling below a threshold health score terminating the first training session and generating a notification indicative of the first health score and comprising a suggestion to further investigate health of the animal and transmitting the notification to a user associated with the animal [Reports generated by the evaluation module 132 at step 712 can be stored in the data store 120, presented on a display for individuals, printed, e-mailed, etc. Individuals, such as veterinarians or other healthcare service providers, can reference the reports to determine an appropriate manner in which to assist subjects [0102]].

Allowable Subject Matter

3. Claims 4 and 6 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to JEFFREY R LARSEN whose telephone number is (313)446-6578. The examiner can normally be reached on increased Flextime. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Peter Poon can be reached on 571-272-6891. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. 
For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/JEFFREY R LARSEN/
Examiner, Art Unit 3643

Prosecution Timeline

May 25, 2023
Application Filed
Nov 15, 2024
Non-Final Rejection — §103
Apr 21, 2025
Response Filed
Dec 17, 2025
Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 9563897
SYSTEMS AND METHODS TO IDENTIFY INTENTIONALLY PLACED PRODUCTS
Granted Feb 07, 2017 (2y 5m to grant)
Patent 9256982
Medical Image Rendering
Granted Feb 09, 2016 (2y 5m to grant)
Patent 9189846
Method and Device for Representing Multichannel Image Data
Granted Nov 17, 2015 (2y 5m to grant)
Patent 9188679
ELECTRONIC APPARATUS AND POWER SUPPLY CONTROL PROGRAM FOR POSITION MEASURING
Granted Nov 17, 2015 (2y 5m to grant)
Patent 9086294
NAVIGATION DEVICE WITH ADAPTIVE NAVIGATION INSTRUCTIONS
Granted Jul 21, 2015 (2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 17%
With Interview: 53% (+36.6%)
Median Time to Grant: 5y 9m
PTA Risk: Moderate
Based on 161 resolved cases by this examiner. Grant probability derived from career allow rate.
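The note above says the grant probability is derived from the career allow rate, and the headline numbers do follow from the career counts (27 granted of 161 resolved). A minimal sketch of that arithmetic, under one assumption of mine: that the "+36.6% interview lift" is added to the baseline in percentage points rather than applied multiplicatively.

```python
# Career counts from the examiner panel.
granted, resolved = 27, 161
interview_lift_pts = 36.6  # percentage points, per the dashboard

career_allow_rate = 100 * granted / resolved     # about 16.8%
grant_probability = round(career_allow_rate)     # displayed as 17%

# Assumption: lift is additive in percentage points.
with_interview = round(career_allow_rate + interview_lift_pts)  # displayed as 53%
print(grant_probability, with_interview)
```

Both rounded figures match the displayed 17% and 53%, which supports the additive reading of the lift.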
