DETAILED ACTION
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Status of Claims
Claims 1-20 are pending.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 1-14 and 16-20 are rejected under 35 U.S.C. 103 as being unpatentable over Kriesel et al. [US 20050257748] in view of Lampe et al. [US 20160073614].
As to claim 1. Kriesel discloses A method for analyzing animal health, the method comprising:
acquiring a sequence of depth images of at least one animal, from a monitoring device located at a facility, [figs. 2-9, 2-14, 0585, 0656] obtain video of target animal from range cameras at a predetermined area;
detecting an animal in the sequence of depth images, [0649, 0659] determine a 3D data set of the animal, and identifying a class of the animal, [0719] animals graded in quality based on measurements of dimensions of the animal, [0715] from the 3D data;
characterized by:
determining a body composition of the animal based on the depth images, [0738] mass/volume ratio for various tissue-types obtained;
determining a classification indication for the animal relating to a set of potential classifications based on the class of the animal, and the body composition of the animal, [0739] USDA grades and classifications calculated based on measurements, [0825] BCS score, using a trained analysis, [0293]; and
outputting a notification based on the classification indication to a computing device associated with at least one of the facility or a buyer, [fig. 4-54, 4-57A, 1288, 1291] display pre-selected calculations, any of the calculations described can be displayed, [fig. 4-65] including yield grade, BCS score, the notification indicating at least one of the following:
a productivity prediction for the animal, [fig. 4-65] yield grade; or
a recommended intervention for the animal.
Kriesel fails to disclose wherein the method further determines a gait of the animal in addition to the body composition; wherein classification is determined further based on gait of the animal; wherein the analysis is a neural network.
Lampe teaches a system and method used for determining lameness of an animal; wherein the system uses a plurality of cameras to determine the gait of an animal, [0065], using machine learning algorithms 513, [fig. 5, 0071]; wherein the system uses depth camera 205 to provide a time series of images, [0070], that can generate a 3D video representation of the animal, [0074]; wherein the system further determines the volume and weight distribution of the animal, [0080], which reads on the claimed body composition.
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Kriesel with those of Lampe so that the system can use the neural network to detect patterns that cannot be easily detected using a conventional algorithm.
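For illustration only, and not as part of the claim mapping, the following minimal Python sketch shows the flow of the claim 1 method as rejected above: acquire depth frames, detect and classify the animal (Kriesel), determine body composition (Kriesel) and gait (Lampe), then run a trained analysis and emit a notification. Every function, threshold, and value here is a hypothetical stand-in, not an implementation from either reference.

```python
def detect_animal(frames):
    # Kriesel [0649, 0659]: determine a 3D data set of the animal; class per [0719]
    return {"class": "bovine", "frames": frames}

def body_composition(animal):
    # Kriesel [0738]: mass/volume ratio for various tissue types -> a BCS-like value
    return {"bcs": 2.8}

def gait(animal):
    # Lampe [0065], [0070]: gait derived from a time series of depth images
    return {"regularity": 0.02}

def classify(animal, body, gait_info):
    # Trained analysis per Kriesel [0293] / Lampe's machine learning [0071]
    lame = gait_info["regularity"] > 0.05
    return {"productivity": "high" if body["bcs"] >= 2.5 and not lame else "reduced",
            "intervention": "hoof exam" if lame else None}

frames = ["depth_frame_%d" % i for i in range(30)]  # stand-in depth-image sequence
animal = detect_animal(frames)
notification = classify(animal, body_composition(animal), gait(animal))
print(notification)  # payload that would be sent to the facility or a buyer
```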
As to claim 2. Kriesel discloses The method of claim 1, wherein the classification indication includes a category selected from a plurality of categories, and wherein the category is determined based on a score within a continuous range of scores, [0825] BCS score ranging continuously from 1 to 5.
As to claim 3. Kriesel discloses The method of claim 1, wherein the classification indication includes a category selected from a plurality of categories, and wherein the classification indication is determined based on previously determined categories on at least one of previous topologies, shapes, gaits, or body compositions, [0833-0835].
As to claim 4. Kriesel discloses The method of claim 2, wherein the category of the plurality of categories is further determined based on a threshold, [0774], to compare the gait of the animal, [1202] the animal needs to be able to walk, and the body composition of the animal with the threshold, [1038].
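For illustration only, a self-contained sketch of the claim 2/claim 4 mapping, in which a continuous body condition score on the 1-5 scale cited at Kriesel [0825] is bucketed into a category by thresholds [0774]. The threshold values and category names below are hypothetical, not taken from either reference.

```python
def categorize_bcs(score: float) -> str:
    """Map a continuous BCS (1.0-5.0) to a discrete category via thresholds."""
    if not 1.0 <= score <= 5.0:
        raise ValueError("BCS must lie in the continuous range 1.0-5.0")
    if score < 2.0:
        return "thin"              # below lower threshold -> possible intervention
    if score < 3.5:
        return "ideal"
    return "over-conditioned"      # above upper threshold

print(categorize_bcs(2.8))  # -> "ideal"
```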
As to claim 5. Kriesel fails to disclose The method of claim 1, wherein the gait of the animal is determined by:
identifying a joint in a first frame of the number of video frames with a mark;
porting the identified joint in the first frame to a second frame of the number of video frames;
determining a time-series relative motion of the joint based on the joint in the first frame and the joint in the second frame; and
determining the gait of the animal based on the time-series relative motion.
Lampe teaches a system and method used for determining lameness of an animal, [0039-0043], wherein lameness is an abnormal gait of an animal, [0043], using machine learning algorithms 513, [fig. 5, 0071]; wherein the system uses depth camera 205 to provide a time series of images, [0070], for creating a point cloud representation of the movement of the animal, [fig. 2, 0071], identifies points of interest to determine joint positions based on past images of similar animals using the machine learning model, [0071], and tracks the points of interest to identify motion over time, [0071], which is used to determine lameness, [0071].
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Kriesel with those of Lampe so that the system can determine the gait of the animal using skeletal features to avoid varying outcomes based on the animal’s weight change.
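For illustration only, a self-contained sketch of the claim 5 steps as mapped above: identify a joint in a first frame, carry ("port") its position through later frames, derive a time series of relative motion, and take a gait measure from it. The joint track and the regularity measure are synthetic and hypothetical; nothing here is code from Kriesel or Lampe.

```python
import math

def relative_motion(track: list[tuple[float, float]]) -> list[float]:
    """Per-frame displacement of one tracked joint across consecutive frames."""
    return [math.dist(track[i], track[i + 1]) for i in range(len(track) - 1)]

def gait_regularity(track: list[tuple[float, float]]) -> float:
    """Crude gait measure: variance of the joint's frame-to-frame displacement.
    A lame animal's joint motion tends to be less regular (higher variance)."""
    steps = relative_motion(track)
    mean = sum(steps) / len(steps)
    return sum((s - mean) ** 2 for s in steps) / len(steps)

# Synthetic hock-joint track: one (x, y) position per video frame.
track = [(0.0, 1.0), (0.1, 1.05), (0.2, 1.0), (0.3, 1.05), (0.4, 1.0)]
print(gait_regularity(track))
```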
As to claim 6. Kriesel fails to disclose The method of claim 5, wherein the gait of the animal is provided to the neural network trained to identify categories of the gait, and wherein the neural network was trained on a dataset comprising previous animal gait information and the categories in connection with the previous animal gait information.
Lampe teaches a system and method used for determining lameness of an animal, [0039-0043], wherein lameness is an abnormal gait of an animal, [0043], using machine learning algorithms 513, [fig. 5, 0071]; wherein the system uses depth camera 205 to provide a time series of images, [0070], for creating a point cloud representation of the movement of the animal, [fig. 2, 0071], identifies points of interest (POI) to determine joint positions based on past images of similar animals using the machine learning model, [0071], and tracks the points of interest to identify motion over time, [0071], which is used to determine lameness, [0071]; wherein the machine learning model is trained with past POI data, [0092]; wherein the system outputs a lameness signal that indicates the presence and severity of lameness, [0082], which reads on the claimed identification of gait categories based on previous animal gait information.
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Kriesel with those of Lampe so that the system can determine the gait of the animal using skeletal features to avoid varying outcomes based on the animal’s weight change.
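For illustration only, a minimal sketch of the kind of training claim 6 recites: a small neural network fit on prior gait data and its categories (cf. Lampe's training on past POI data, [0092]). The gait features (mean stride length, stride-time variance), labels, and scikit-learn model choice are all hypothetical stand-ins, not taken from either reference.

```python
from sklearn.neural_network import MLPClassifier

# Each row: [mean stride length, stride-time variance]; label: 0 = normal, 1 = lame.
X = [[0.80, 0.01], [0.78, 0.02], [0.55, 0.09], [0.50, 0.12],
     [0.82, 0.01], [0.52, 0.10]]
y = [0, 0, 1, 1, 0, 1]

net = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
net.fit(X, y)
print(net.predict([[0.53, 0.11]]))  # expected: [1], i.e., the "lame" gait category
```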
As to claim 7. Kriesel discloses The method of claim 1, further comprising: determining an indicator of the animal's backfat by measuring a region of the animal from the video data, [0802] 3D volume used to determine fat component.
As to claim 8. Kriesel discloses The method of claim 1, further comprising: determining an indicator of the body composition of the animal by determining at least one of a height, shoulder width, estimated weight, and estimated volume of the animal from the video data, [0822-0825] BCS score calculated from 3D data; wherein BCS score represents the fat reserves of the animal.
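For illustration only, a crude sketch of how body-composition indicators of the kind mapped in claims 7-8 (volume, estimated weight) might be derived from a single depth image: integrate the animal's thickness over its silhouette to approximate volume, then scale by an assumed density. The synthetic depth map, the mask, and the density constant are hypothetical, not measurements from Kriesel.

```python
import numpy as np

def estimate_volume(depth_map: np.ndarray, mask: np.ndarray,
                    pixel_area_m2: float, background_m: float) -> float:
    """Approximate volume (m^3): sum of (background depth - surface depth)
    over masked pixels, times the per-pixel footprint area."""
    thickness = np.clip(background_m - depth_map, 0.0, None)
    return float(np.sum(thickness[mask]) * pixel_area_m2)

depth = np.full((4, 4), 2.0)
depth[1:3, 1:3] = 1.5                 # animal surface 0.5 m proud of the background
mask = depth < 2.0                    # silhouette: pixels closer than the background
vol = estimate_volume(depth, mask, pixel_area_m2=0.01, background_m=2.0)
print(vol, vol * 1000.0)  # volume in m^3; naive mass at an assumed 1000 kg/m^3
```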
As to claim 9. Kriesel discloses A precision livestock farming system comprising:
a camera, [figs. 2-9, 2-14, 0585, 0656] range cameras; and
a processor, [0634] processing unit 42, wherein the precision livestock farming system is further characterized by a memory in communication with the processor, having stored thereon a set of instructions which, when executed, cause the processor to:
acquire video data regarding an animal of interest from the camera, [figs. 2-9, 2-14, 0585, 0656] obtain video of target animal from range cameras at a predetermined area, during a given time period, [0734, 1115], the video data comprising a series of image frames capturing a gait of an animal as the animal of interest is moving through a facility, [fig. 4-55, 0757];
determine a set of pose estimations for the animal of interest, [0171, 1217] three-dimensional animal model determined, as the animal of interest is moving through a facility based on the video data acquired from the camera, [fig. 4-55, 0757];
store the pose estimations in a data record associated with the animal of interest, [fig. 4-57A, 1254-1264] animal data, including historical data recorded in a database; and
provide the pose estimations to an algorithm to predict an animal outcome for animals of a similar species to the animal of interest, [fig. 4-63, 1325].
Kriesel fails to disclose wherein the algorithm is a neural network trained to predict an outcome.
Lampe teaches a system and method used for determining lameness of an animal; wherein the system uses a plurality of cameras to determine the gait of an animal, [0065], using machine learning algorithms 513, [fig. 5, 0071]; wherein the system uses depth camera 205 to provide a time series of images, [0070], that can generate a 3D video representation of the animal, [0074].
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Kriesel with those of Lampe so that the system can use the neural network to detect patterns that cannot be easily detected using a conventional algorithm.
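For illustration only, a sketch of the claim 9 data flow as rejected above: per-frame pose estimations stored in a record tied to the animal of interest, then handed to a trained model to predict an outcome. The record layout and the predict_outcome stand-in below are hypothetical, not structures from the references.

```python
from dataclasses import dataclass, field

@dataclass
class AnimalRecord:
    animal_id: str
    poses: list = field(default_factory=list)  # one pose estimate per video frame

    def add_pose(self, landmarks: dict) -> None:
        """Append one frame's landmark set, e.g. {'hock': (x, y, z), ...}."""
        self.poses.append(landmarks)

def predict_outcome(record: AnimalRecord) -> str:
    """Stand-in for the trained network in claim 9's final limitation."""
    return "normal" if len(record.poses) >= 2 else "insufficient data"

rec = AnimalRecord("cow-042")
rec.add_pose({"hock": (0.1, 1.0, 2.3)})
rec.add_pose({"hock": (0.2, 1.05, 2.3)})
print(predict_outcome(rec))  # -> "normal"
```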
As to claim 10. Kriesel discloses The system of claim 9, wherein the camera is a depth camera, [0656].
As to claim 11. Kriesel discloses The system of claim 10, wherein determining the pose estimations comprises determining landmarks of interest in a given depth image frame of the video data of the animal of interest, [0173, 1293].
As to claim 12. Kriesel discloses The system of claim 11, wherein determining landmarks of the animal of interest in the given depth image frame further comprises using a landmark detector to identify the landmarks of the animal of interest in another image of the animal of interest and transposing the landmarks of interest to the depth image, [fig. 4-57A] other views of the animal shown with the same reference.
As to claim 13. Kriesel fails to disclose The system of claim 9, wherein the neural network is trained to predict whether the animal of interest will exhibit an abnormal gait based upon the series of image frames of the video data of the animal of interest.
Lampe teaches a system and method used for determining lameness of an animal, [0039-0043], wherein lameness is an abnormal gait of an animal, [0043], using machine learning algorithms 513, [fig. 5, 0071]; wherein the system uses depth camera 205 to provide a time series of images, [0070], for creating a point cloud representation of the movement of the animal, [fig. 2, 0071], identifies points of interest (POI) to determine joint positions based on past images of similar animals using the machine learning model, [0071], and tracks the points of interest to identify motion over time, [0071], which is used to determine lameness, [0071]; wherein the machine learning model is trained with past POI data, [0092]; wherein the system outputs a lameness signal that indicates the presence and severity of lameness, [0082].
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Kriesel with those of Lampe so that the system can use the neural network to detect patterns that cannot be easily detected using a conventional algorithm.
As to claim 14. Kriesel discloses The system of claim 9, wherein the instructions, when executed, further cause the processor to output a notification to the farming facility identifying a health issue for the animal of interest based upon the output of the neural network, [fig. 4-54, 4-57A, 1288, 1291] display pre-selected calculations, any of the calculations described can be displayed, [fig. 4-65] including yield grade, BCS score.
As to claim 16. Kriesel discloses The method of claim 1, wherein respective ones of the sequence of depth images correspond to frames of a video, [1367, 1368].
As to claim 17. Kriesel discloses The method of claim 1, further comprising: receiving supplementary data indicating a characteristic of an environment in which the animal is located, [fig. 4-70, 1221, 1223].
As to claim 18. Kriesel discloses The method of claim 17, wherein the characteristic is at least one of a temperature or a humidity of the environment, [fig. 4-70, 1221, 1223].
As to claim 19. Kriesel discloses The system of claim 9, wherein the instructions, when executed, further cause the processor to acquire supplementary data from a supplementary sensor, the supplementary data indicating a characteristic of an environment in which the animal is located, [fig. 4-70, 1221, 1223].
As to claim 20. Kriesel discloses The system of claim 19, wherein the characteristic is at least one of a temperature or a humidity of the environment, [fig. 4-70, 1221, 1223].
Claim 15 is rejected under 35 U.S.C. 103 as being unpatentable over Kriesel in view of Lampe as applied to claim 9 above, further in view of Eineren et al. [US 20150302241].
As to claim 15. Kriesel discloses The system of claim 9, wherein:
the camera is a depth camera positioned within a farming facility, [fig. 2-1];
the instructions, when executed, further cause the processor to:
determine body composition scores of the batch of animals based upon at least one of a height, shape, backfat width, or volume of each animal of the batch of animals, [0739] USDA grades and classifications calculated based on measurements, [0825] BCS score;
output the gait abnormality and body composition score determinations to at least one of a network associated with the farming facility or a network associated with potential buyers of the batch of animals, [fig. 4-65] yield grade, BCS score.
The combination of Kriesel and Lampe fails to disclose wherein the camera is a near-infrared depth camera; and wherein the processor is further configured to determine a gait abnormality for a batch of animals from a set of video data of the batch of animals acquired by the near-infrared depth camera.
Eineren teaches a system and method for predicting the outcome of a state of an animal wherein the system monitors the gait of the animal using cameras, [fig. 1, 0123]; wherein the camera is a near-infrared camera, [0017], that can sense a spatially precise image of the animal, [0058]; wherein a sequence of images is used to determine the monitored health outcome, [claim 1].
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of the combination of Kriesel and Lampe with those of Eineren so that the system can use the NIR camera to monitor both gait and physiological parameters without the need for additional sensors.
Response to Arguments
Applicant’s arguments, see page 7, lines 23-26, filed 03/10/2026, with respect to the rejection(s) of claim(s) 1-20 under 35 U.S.C. 103 have been fully considered and are persuasive. Therefore, the rejection has been withdrawn. However, upon further consideration, a new ground of rejection is made in view of the previously presented prior art Lampe et al.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to BENYAM HAILE whose telephone number is (571)272-2080. The examiner can normally be reached 7:00 AM - 5:30 PM, Mon. - Thur.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Steven Lim can be reached at (571)270-1210. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/Benyam Haile/Primary Examiner, Art Unit 2688