Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
DETAILED ACTION
This action is in response to the applicant's communication filed on 07/26/2023. By virtue of this communication, claims 1-30 filed on 07/26/2023 are currently pending in the instant application.
Information Disclosure Statement
The Information Disclosure Statement (IDS) on form PTO-1449, filed on 11/01/2023, is in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosed therein was considered by the examiner.
Drawings
The drawings received on 07/26/2023 have been reviewed by the examiner and are acceptable.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1-30 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more. The independent claims recite receiving image data associated with a set of aquatic animals; determining a set of characteristics associated with the set of aquatic animals based on the image data by a machine learning model; classifying each aquatic animal in the set of aquatic animals based on the set of characteristics; and counting at least a subset of the aquatic animals based on the classification.
Step 1:
With regard to Step 1, the instant claims are directed to an apparatus, a method, and a non-transitory computer-readable medium, all among the statutory categories of invention.
Step 2A — Prong 1:
With regard to Step 2A — Prong 1, for example in method Claim 14, the limitations "receiving image data associated with a set of aquatic animals," "determining a set of characteristics associated with the set of aquatic animals based on the image data by a machine learning model," "classifying each aquatic animal in the set of aquatic animals based on the set of characteristics," and "counting at least a subset of the aquatic animals based on the classification," as recited, describe a method that, under its broadest reasonable interpretation, covers performance of the limitations in the mind or through observation by a person inspecting an image or picture of the aquatic animals for sorting and counting. That is, other than reciting "by a computer," nothing in the claim steps precludes the limitations from practically being performed in the mind or through observation by a person inspecting an image or video. The recited computer is simply a generic device. If a claim limitation, under its broadest reasonable interpretation, covers performance of the limitation in the mind but for the recitation of generic components, then it falls within the "mental processes" grouping of abstract ideas, which includes concepts performed in the human mind, including an observation, evaluation, judgment, or opinion. Accordingly, the claim recites an abstract idea. In addition, the additional components recited in independent Claims 1, 13, and 23, i.e., a memory, a processor, and a non-transitory computer-readable medium, are simply generic computing components; accordingly, these independent claims recite the above-described abstract idea.
Step 2A — Prong 2:
The 2019 PEG defines the phrase "integration into a practical application" to require an additional element or a combination of additional elements in the claim to apply, rely on, or use the judicial exception. In the instant case, the additional elements in the claims do not apply, rely on, or use the judicial exception in this manner.
This judicial exception is not integrated into a practical application because the claims recite only additional elements, such as a computer, a memory, a processor, or a non-transitory computer-readable medium, used to perform the recited elements/functions/steps. These computing components are recited at a high level of generality, and the claims recite no other additional limitations. Accordingly, these additional elements do not integrate the abstract idea into a practical application because they amount to a field-of-use limitation that does not impose any meaningful limit on practicing the abstract idea. Therefore, independent Claims 1, 13, and 23 are directed to an abstract idea.
Step 2B:
Because the claims fail under Step 2A, they are further evaluated under Step 2B. The claims do not include additional elements sufficient to amount to significantly more than the judicial exception because, as discussed above with respect to integration of the abstract idea into a practical application, the additional element of using a computer, a memory, a processor, or a non-transitory computer-readable medium to execute programming instructions to perform the steps amounts to no more than mere instructions to apply the exception using a generic apparatus component. Mere instructions to apply an exception using a generic apparatus component cannot provide an inventive concept. The claims are not patent eligible.
Further, with regard to dependent Claims 2-12, 14-22, and 24-30, viewed individually, their additional limitations, under their broadest reasonable interpretation, cover performance of the limitations in the mind and do not provide meaningful limitations that transform the abstract idea into a patent-eligible application of the abstract idea such that the claims amount to significantly more than the abstract idea itself. Accordingly, Claims 1-30 are rejected under 35 U.S.C. 101.
Claim Rejections - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claim 7 is rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or, for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention.
In Claim 7, the limitation "wherein predicting …" is unclear: it is not clear whether the predicting has already occurred or is a new step introduced in this claim, since the independent claim does not include a predicting limitation.
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.
Claims 1-2, 4-6, 8-14, 16-18, 20-23, 26-27, 29, and 30 are rejected under 35 U.S.C. 102(a)(2) as being anticipated by Muramatsu et al. (JP2019135986A).
As per claim 1, Muramatsu discloses a system, comprising: "an optical sensor configured to generate image data associated with a set of aquatic animals;" "a memory; and a processor operatively coupled to the memory and the optical sensor," (Muramatsu, ¶[0045] discloses imaging unit 4a.)
“the processor configured to: receive the image data associated with the set of aquatic animals;”( Muramatsu, ¶[0045] discloses capture images of the movements of the catch 13 in a tank or fish preserve.)
"determine a set of characteristics associated with the set of aquatic animals based on the image data using a machine learning model;" (Muramatsu, ¶[0046] discloses This fish species characteristic amount 15b is extracted by the fish species characteristic amount extracting unit 6b as fish species characteristic amount image data 5b by analyzing image data of the large amount of landed catch 13 as big data 11d. The process of extracting the fish species characteristic values 15b is carried out mainly by the neural network 11b of the artificial intelligence 11a. Further, see ¶[0033] and ¶[0034] for fish characteristics; further ¶[0070]-[0071].)
"classify each aquatic animal in the set of aquatic animals based on the set of characteristics using the machine learning model;" (Muramatsu, ¶[0019] discloses individual features that can be used to distinguish catches of the same type and assigns an identification number. Further, ¶[0020] discloses the premise is that by utilizing artificial intelligence, it will be possible to assign individual identification numbers to catches. ¶[0043] discloses FIG. 5 shows a schematic configuration of one embodiment of a procedure for identifying catches 13 of the same type and assigning an identification number 10 using a neural network 11b and deep learning 11c of an artificial intelligence 11a. ¶[0048] discloses FIG. 1, a method for sorting landed catches 13 according to fish species 14 is shown. When the catch 13 is transported to a market by a catch transport machine 17 such as a belt conveyor, the image capturing unit 4 captures individual feature image data 5a. Then, the individual feature amount extracting unit 6a extracts the individual feature amount 15a. The imaging unit 4 may capture the individual feature amount image data 5a and simultaneously convert the individual feature amount image data 5a into fish species feature amount image data 5b. The extracted individual characteristic amount 15a is compared by the fish species determination unit 8 with the fish species characteristic amount 15b of each fish species 14 stored in the fish species characteristic amount storage unit 7. Then, the fish species 14 of the catch 13 is determined. This comparison is performed mainly by deep learning 11c of the artificial intelligence 11a. Further, ¶[0049] discloses assigning the same identification number 10 to a group of fish. Further ¶[0052].)
“and count at least a subset of the aquatic animals based on the classification.” (Muramatsu, ¶[0036] discloses a catch quantity calculation unit that calculates the quantity of catch for each fish species based on the identification number. ¶[0074] discloses a catch quantity calculation unit that calculates the quantity of catch for each fish species based on the identification number. )
Claim 13 has been analyzed and is rejected for the reasons indicated in claim 1 above.
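Purely as an illustrative aside (and not part of the claim mapping above), the receive/determine/classify/count sequence recited in claim 1 can be sketched in Python. Every name below is a hypothetical stand-in, and `run_model` is a stub in place of any actual machine learning model of record:

```python
from collections import Counter
from dataclasses import dataclass

@dataclass
class Detection:
    species: str          # classification produced by the (stubbed) model
    length_cm: float      # one example characteristic

def run_model(image_data):
    # Stand-in for the claimed machine learning model: a real system would
    # run a trained classifier on the image data; this stub returns fixed
    # detections so only the pipeline structure is visible.
    return [Detection("squid", 12.0), Detection("eel", 30.0),
            Detection("squid", 9.5)]

def count_by_classification(image_data, target_species=None):
    # Receive image data -> determine characteristics -> classify -> count.
    detections = run_model(image_data)
    counts = Counter(d.species for d in detections)
    if target_species is not None:
        # Count only the subset sharing a common classification.
        return {target_species: counts.get(target_species, 0)}
    return dict(counts)
```

For example, `count_by_classification(raw_bytes)` would tally every classified animal, while passing `target_species="squid"` counts only the subset with that common classification.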
As per claim 2, the system of claim 1, “wherein the machine learning model is at least one of a deep learning model, a faster region-based convolutional neural network (Faster R-CNN), a single shot detector (SSD), and combinations thereof.”( Muramatsu, ¶[0043] discloses FIG. 4 is an explanatory diagram showing a schematic configuration of one embodiment of a procedure for determining the fish species 14 by comparing the feature amount 15 of the catch 13 with the fish species feature amount 15b using deep learning 11c of artificial intelligence 11a. FIG. 5 shows a schematic configuration of one embodiment of a procedure for identifying catches 13 of the same type and assigning an identification number 10 using a neural network 11b and deep learning 11c of an artificial intelligence 11a.)
Claim 14 has been analyzed and is rejected for the reasons indicated in claim 2 above.
As per claim 4, The system of claim 1, “wherein the set of characteristics associated with the set of aquatic animals includes at least one of mortality, health, developmental stage, quantity, size, shape, geometry, weight, and combinations thereof.” (Muramatsu, ¶[0031] discloses it is preferable that the catch management system has a three-dimensional shape measurement unit that calculates body length data of the catch from the three-dimensional shape, sorts the catch into body length ranges, calculates body shape data of the catch from the body length data and weight data of the catch, and the individual catch database stores the body length data and body shape data in association with the identification number of the catch. ¶[0033] discloses In the catch management system, it is preferable that the color imaging unit measures data regarding the eye color of the catch or data regarding the color of the body surface of the catch, the quality prediction unit predicts the freshness of the catch from the data regarding the eye color and body surface color, and the individual catch database stores the eye color data, body surface color data, and freshness data in association with the identification number of the catch. This is because, according to buyers and other "experts," the eyes of fresh catches are clear, while those of older catches are cloudy and deformed. Therefore, the freshness of a catch can be predicted by observing the color of its eyes. That is, the eye color and shape patterns of the fish species are stored in a database, and by pattern matching the eye color and shape of the catch, it is possible to predict the freshness of the catch. further see ¶[0070-0071].)
Claims 16 and 26 have been analyzed and are rejected for the reasons indicated in claim 4 above.
As per claim 5, The system of claim 1, "wherein each aquatic animal in the subset of aquatic animals has a common classification." (¶[0049] discloses Once the fish species 14 of the catch 13 has been determined by the fish species determination unit 8, the identification number assignment unit 9 assigns an identification number 10 to the catch 13. For example, a consecutive number within the same type of catch 13 is assigned along with data such as the year of catch, day of the week, fish species, fishing ground, fishing boat, and fishing port. Further see ¶[0070]-[0071].)
Claim 17 has been analyzed and is rejected for the reasons indicated in claim 5 above.
As per claim 6, The system of claim 1, “wherein the image data represents an image depicting the set of aquatic animals, and determining the set of characteristics associated with the set of aquatic animals includes identifying an aquatic animal depicted along a boundary of the image.” (¶[0045] discloses The catch 13 is photographed by the imaging unit 4 during this transportation. The imaging unit 4 may be provided with not only the imaging camera 4a but also a sensor 4b. This sensor 4b measures the surface temperature, odor, color, etc. of the catch 13. A video camera 4c may also be provided to capture images of the movements of the catch 13. In particular, in the aquaculture industry 3c, it is used to capture images of the movements of the catch 13 in a tank or fish preserve.)
Claim 18 has been analyzed and is rejected for the reasons indicated in claim 6 above.
As per claim 8, The system of claim 1, "wherein at least one aquatic animal in the set of aquatic animals has a size smaller than about 1 centimeters (cm)." (¶[0048] discloses For example, feature amount data 7b of squid, feature amount data 7d of eel, . . . feature amount data 7n of fish species 14 are stored. ¶[0064] discloses The individual catch database 12 stores the individual feature amount image data 5a and the individual feature amount data 2 at the juvenile fish stage in an individual juvenile fish database 12c provided within the individual catch database 12. ¶[0081]. (A juvenile squid or eel is smaller than 1 cm.))
Claim 20 has been analyzed and is rejected for the reasons indicated in claim 8 above.
As per claim 9, the system of claim 1, “wherein the optical sensor includes at least one of a scanner, optical counter, light blocking counter, light scattering counter, direct imaging counter, or camera.” (¶[0045] discloses The imaging unit 4 may be provided with not only the imaging camera 4a but also a sensor 4b. This sensor 4b measures the surface temperature, odor, color, etc. of the catch 13. A video camera 4c may also be provided to capture images of the movements of the catch 13. In particular, in the aquaculture industry 3c, it is used to capture images of the movements of the catch 13 in a tank or fish preserve.)
Claim 21 has been analyzed and is rejected for the reasons indicated in claim 9 above.
As per claim 10, The system of claim 1, “wherein the optical sensor is coupled to a conveyor configured to convey the set of aquatic animals from a collection device to at least one tank.” (¶[0045] discloses at the fishing port, the landed catch 13 is transported to the market by a catch transport machine 17 such as a belt conveyor. The catch 13 is photographed by the imaging unit 4 during this transportation. The imaging unit 4 may be provided with not only the imaging camera 4a.¶[0047-0048].)
As per claim 11, The system of claim 10, "wherein the image data includes multiple images depicting at least a portion of the set of aquatic animals as the set of aquatic animals are moved along the conveyor." (¶[0045] discloses at the fishing port, the landed catch 13 is transported to the market by a catch transport machine 17 such as a belt conveyor. The catch 13 is photographed by the imaging unit 4 during this transportation. The imaging unit 4 may be provided with not only the imaging camera 4a. ¶[0047] and ¶[0048] disclose When the catch 13 is transported to a market by a catch transport machine 17 such as a belt conveyor, the image capturing unit 4 captures individual feature image data 5a. Then, the individual feature amount extracting unit 6a extracts the individual feature amount 15a. The imaging unit 4 may capture the individual feature amount image data 5a and simultaneously convert the individual feature amount image data 5a into fish species feature amount image data 5b. The extracted individual characteristic amount 15a is compared by the fish species determination unit 8 with the fish species characteristic amount 15b of each fish species 14 stored in the fish species characteristic amount storage unit 7. Then, the fish species 14 of the catch 13 is determined.)
As per claim 12, The system of claim 1, “wherein the set of aquatic animals is a set of aquatic animals from an aquaculture system, the system being implemented on a vessel configured to transfer the set of aquatic animals from the aquaculture system to a grading/sorting system of the vessel.” (Muramatsu, ¶[0045] discloses In conventional fishing, the catch 13 caught by a fishing boat is transported to a nearby fishing port and landed. At the fishing port, the landed catch 13 is transported to the market by a catch transport machine 17 such as a belt conveyor. The catch 13 is photographed by the imaging unit 4 during this transportation).
As per claim 22, in view of claim 13, “wherein the image data includes multiple image frames collectively forming a video depicting the set of aquatic animals.”(Muramatsu, ¶[0045] discloses a video camera 4c may also be provided to capture images of the movements of the catch 13. In particular, in the aquaculture industry 3c, it is used to capture images of the movements of the catch 13 in a tank or fish preserve.)
As per claim 23, An apparatus, comprising:
"a collection system configured to engage an aquaculture system to transfer a set of aquatic animals from the aquaculture system to a grading/sorting system of the apparatus configured to sort the set of aquatic animals;" (Muramatsu, ¶[0052] (Sorting work) discloses As shown in Figure 2, the catch 13 determined by the fish species determination unit 8 is transported to a sorting unit 18 by a catch transport device 17, such as a belt conveyor, and sorted.)
“a sensor configured to generate a sensor data associated with a subset of aquatic animals after being sorted by the grading/sorting system; and a controller operatively coupled to the collection system, the grading/sorting system, and the sensor, the controller having a processor and a memory,”(Muramatsu, ¶[0045] discloses the imaging unit 4 may be provided with not only the imaging camera 4a but also a sensor 4b. This sensor 4b measures the surface temperature, odor, color, etc. of the catch 13. A video camera 4c may also be provided to capture images of the movements of the catch 13. In particular, in the aquaculture industry 3c, it is used to capture images of the movements of the catch 13 in a tank or fish preserve.)
"the processor configured to execute a machine learning model to: determine a set of characteristics associated with the subset of aquatic animals based on the sensor data," (Muramatsu, ¶[0046] discloses This fish species characteristic amount 15b is extracted by the fish species characteristic amount extracting unit 6b as fish species characteristic amount image data 5b by analyzing image data of the large amount of landed catch 13 as big data 11d. The process of extracting the fish species characteristic values 15b is carried out mainly by the neural network 11b of the artificial intelligence 11a. Further, see ¶[0033] and ¶[0034] for fish characteristics; further ¶[0070]-[0071].)
"and classify each aquatic animal in the subset of aquatic animals based on the set of characteristics;" (Muramatsu, ¶[0019] discloses individual features that can be used to distinguish catches of the same type and assigns an identification number. Further, ¶[0020] discloses the premise is that by utilizing artificial intelligence, it will be possible to assign individual identification numbers to catches. ¶[0043] discloses FIG. 5 shows a schematic configuration of one embodiment of a procedure for identifying catches 13 of the same type and assigning an identification number 10 using a neural network 11b and deep learning 11c of an artificial intelligence 11a. ¶[0048] discloses FIG. 1, a method for sorting landed catches 13 according to fish species 14 is shown. When the catch 13 is transported to a market by a catch transport machine 17 such as a belt conveyor, the image capturing unit 4 captures individual feature image data 5a. Then, the individual feature amount extracting unit 6a extracts the individual feature amount 15a. The imaging unit 4 may capture the individual feature amount image data 5a and simultaneously convert the individual feature amount image data 5a into fish species feature amount image data 5b. The extracted individual characteristic amount 15a is compared by the fish species determination unit 8 with the fish species characteristic amount 15b of each fish species 14 stored in the fish species characteristic amount storage unit 7. Then, the fish species 14 of the catch 13 is determined. This comparison is performed mainly by deep learning 11c of the artificial intelligence 11a. Further, ¶[0049] discloses assigning the same identification number 10 to a group of fish. Further ¶[0052].)
“and the processor further configured to count at least a portion of the subset of aquatic animals based on the classification.” (Muramatsu, ¶[0036] discloses a catch quantity calculation unit that calculates the quantity of catch for each fish species based on the identification number. ¶[0074] discloses a catch quantity calculation unit that calculates the quantity of catch for each fish species based on the identification number. )
As per claim 27, in view of claim 23, “wherein the set of characteristics is a first set of characteristics, the apparatus further comprising: the grading/sorting system, the grading/sorting system includes a sorting device configured to sort the set of aquatic animals received from the collection system based at least in part on a second set of characteristics.” (Muramatsu, ¶[0034] discloses sorting based on sex, ¶[0048] discloses sorting based on fish species, ¶[0052] discloses sorting based on fish species, male/female, adult fish /juvenile fish, diseased, deformed , deep-sea fish, or whether or not they have eggs.)
As per claim 29, in view of claim 27, "wherein the subset of aquatic animals has at least one common characteristic from the second set of characteristics." (Muramatsu, ¶[0020] discloses the premise is that by utilizing artificial intelligence, it will be possible to assign individual identification numbers to catches. ¶[0043] discloses FIG. 5 shows a schematic configuration of one embodiment of a procedure for identifying catches 13 of the same type and assigning an identification number 10 using a neural network 11b and deep learning 11c of an artificial intelligence 11a. ¶[0048] discloses FIG. 1, a method for sorting landed catches 13 according to fish species 14 is shown. When the catch 13 is transported to a market by a catch transport machine 17 such as a belt conveyor, the image capturing unit 4 captures individual feature image data 5a. Then, the individual feature amount extracting unit 6a extracts the individual feature amount 15a. The imaging unit 4 may capture the individual feature amount image data 5a and simultaneously convert the individual feature amount image data 5a into fish species feature amount image data 5b. The extracted individual characteristic amount 15a is compared by the fish species determination unit 8 with the fish species characteristic amount 15b of each fish species 14 stored in the fish species characteristic amount storage unit 7. Then, the fish species 14 of the catch 13 is determined. This comparison is performed mainly by deep learning 11c of the artificial intelligence 11a. Further, ¶[0049] discloses assigning the same identification number 10 to a group of fish. Further ¶[0052].)
As per claim 30, in view of claim 29, "wherein the portion of the subset of aquatic animals has a common classification." (Muramatsu, ¶[0020] discloses the premise is that by utilizing artificial intelligence, it will be possible to assign individual identification numbers to catches. ¶[0036] discloses a catch quantity calculation unit that calculates the quantity of catch for each fish species based on the identification number. ¶[0043] discloses FIG. 5 shows a schematic configuration of one embodiment of a procedure for identifying catches 13 of the same type and assigning an identification number 10 using a neural network 11b and deep learning 11c of an artificial intelligence 11a. ¶[0048] discloses FIG. 1, a method for sorting landed catches 13 according to fish species 14 is shown. When the catch 13 is transported to a market by a catch transport machine 17 such as a belt conveyor, the image capturing unit 4 captures individual feature image data 5a. Then, the individual feature amount extracting unit 6a extracts the individual feature amount 15a. The imaging unit 4 may capture the individual feature amount image data 5a and simultaneously convert the individual feature amount image data 5a into fish species feature amount image data 5b. The extracted individual characteristic amount 15a is compared by the fish species determination unit 8 with the fish species characteristic amount 15b of each fish species 14 stored in the fish species characteristic amount storage unit 7. Then, the fish species 14 of the catch 13 is determined. This comparison is performed mainly by deep learning 11c of the artificial intelligence 11a. Further, ¶[0049] discloses assigning the same identification number 10 to a group of fish. Further ¶[0052].)
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 3 and 15 are rejected under 35 U.S.C. 103 as being unpatentable over Muramatsu et al. (JP2019135986A), further in view of Kozachenok et al. (US 2022/0004760).
As per claim 3, the system of claim 2, "wherein the aquatic animals are mollusks, receive training image data representing multiple images of mollusks." (Muramatsu, ¶[0047] discloses the characteristic amount of each species, for example, feature amount data 7b of squid, feature amount data 7c of scallop; ¶[0048]; ¶[0051] discloses the individual catch database 12 stores the individual feature amount image data 5a and the individual feature amount data 2 together with the identification number 10 and the fish species 14. That is, for each catch 13 caught at a fishing port, data such as the year of catch, day of the week, fish species 14, fishing ground, fishing boat, and fishing port are stored using an identification number 10, and individual feature image data 5a on which the individual feature 15a is determined, and individual feature data 2, which is a collective data compilation of the multiple individual feature values 15a, are also stored. Data on aquaculture 3c and fish farming 3d is then added to this individual catch database 12. Furthermore, data from the catch management system 40 and the catch logistics system 50 is added to complete a single database, which becomes big data 11d and enables more accurate analysis by the artificial intelligence 11a; further ¶[0081].)
However, Muramatsu is silent on the following, which would have been obvious in view of Kozachenok, from a similar field of endeavor: "the processor is further configured to: train, using the training image data, the machine learning model for high recall." (Kozachenok, ¶[0035] discloses the trained forecast model 132 is trained using training data including at least a portion of the data sets 108 and various machine-learning methods. The training data may include various images of fish 112 and/or images of views of the water surface 106. For example, the training data may include images of fish having varying features and properties, such as fins, tails, shape, size, color, and the like. The training data may also include images with variations in the locations and orientations of fish within each image, including images of the fish captured at various camera viewing angles. ¶[0040] discloses by observing fish behavior via machine/computer vision and processing the captured imagery using a trained forecast model to generate one or more quantifications associated with fish activity proximate the water surface, the system 100 is able to more objectively and accurately quantify fish activity as a measure of appetite in an advantageous way. The resulting surface splash score therefore indicates for multiple feeding parameters and may have an improved prediction accuracy. For example, using multiple image-based quantifications of fish activity and computing a surface splash score based on such quantifications provides for an appetite forecast with an increased accuracy (and also at a lesser human capital cost) relative to human-based observations or experiential determinations.)
Before the effective filing date of the claimed invention, it would have been obvious to a person of ordinary skill in the art to combine Kozachenok's fish activity detection technique with the fish product identification and management technique of Muramatsu, to provide the known and expected uses and benefits of Kozachenok's technique. The proposed combination would have constituted a mere arrangement of old elements, with each performing its known function, and the combination would have yielded no more than one would expect from such an arrangement.
Therefore, it would have been obvious to a person of ordinary skill in the art to incorporate Kozachenok into Muramatsu in order to improve the precision and accuracy of feeding appetite determination and feeding strategies. (Refer to Kozachenok, paragraph [0012].)
Claim 15 has been analyzed and is rejected for the reasons indicated in claim 3 above.
Claim(s) 7 and 19 is/are rejected under 35 U.S.C. 103 as being unpatentable over Muramatsu et al. (JP2019135986A), further in view of Okamoto et al. (WO 2019198332 A1).
As per claim 7, the system of claim 1, “wherein predicting the set of characteristics is based on each of the image data and the sensor data.” (Muramatsu, ¶[0045] discloses that the imaging unit 4 may be provided with not only the imaging camera 4a but also a sensor 4b. This sensor 4b measures the surface temperature, odor, color, etc. of the catch 13. A video camera 4c may also be provided to capture images of the movements of the catch 13. In particular, in the aquaculture industry 3c, it is used to capture images of the movements of the catch 13 in a tank or fish preserve. Further see ¶[0047].)
However, Muramatsu is silent on the following, which would have been obvious in view of Okamoto, from a similar field of endeavor: “a contact sensor configured to generate contact data associated with the set of aquatic animals, the contact sensor operatively coupled to the processor, the processor further configured to receive the contact data from the contact sensor, wherein predicting the set of characteristics is based on each of the image data and the contact data.” (Examiner notes the claim has been rejected under the 35 U.S.C. 112(b) rejection above. Okamoto, page 2, in the section describing Figure 1, discloses that these appetite sensors 25 detect the appetite of farmed fish in the fish farming facility and use contact sensors. In addition, a known sensor such as an infrared sensor, an ultrasonic sensor, an optical sensor, or a pressure sensor can be used as the appetite sensor 25.)
Before the effective filing date of the claimed invention, it would have been obvious to a person of ordinary skill in the art to combine Okamoto's fish activity detection technique with the fish product identification and management technique of Muramatsu, to provide the known and expected uses and benefits of Okamoto's technique. The proposed combination would have constituted a mere arrangement of old elements, with each performing its known function, and the combination would have yielded no more than one would expect from such an arrangement.
Therefore, it would have been obvious to a person of ordinary skill in the art to incorporate Okamoto into Muramatsu in order to improve feeding systems by scattering feed over a wider range and at a longer distance. (Refer to Okamoto, page 1.)
Claim 19 has been analyzed and is rejected for the reasons indicated in claim 7 above.
Claim(s) 24 and 25 is/are rejected under 35 U.S.C. 103 as being unpatentable over Muramatsu et al. (JP2019135986A), further in view of Bertoson (US 2019/0053476).
As per claim 24, the apparatus of claim 23, Muramatsu is silent on the following, which would have been obvious in view of Bertoson, from a similar field of endeavor: “wherein the collection system is configured to transfer the set of aquatic animals from a bin of the aquaculture system to a hopper of the grading/sorting system.” (Bertoson, ¶[0054] discloses that the lifting device 114 can be a lifting chute, a hopper belt, or any other lifting device or motorized lifting device known in the art for loading and unloading products. In a further example, the lifting device 114 can be a clam transfer tube. The transfer tube carries the clams collected by the clam scoop 106 vertically or substantially vertically towards the platform 108 using an impeller-less pump. A dewatering pump can be configured at the top of the clam transfer tube to remove water from the clams as the clams move from the transfer tube or impeller-less pump to the sorting area 116.)
Before the effective filing date of the claimed invention, it would have been obvious to a person of ordinary skill in the art to combine Bertoson's clam transferring and sorting technique with the fish product identification and management technique of Muramatsu, to provide the known and expected uses and benefits of Bertoson's technique. The proposed combination would have constituted a mere arrangement of old elements, with each performing its known function, and the combination would have yielded no more than one would expect from such an arrangement.
Therefore, it would have been obvious to a person of ordinary skill in the art to incorporate Bertoson into Muramatsu in order to increase the quantity and speed of sorting and harvesting clams. (Refer to Bertoson, ¶[0030].)
As per claim 25, in view of claim 23, Muramatsu is silent on the following, which would have been obvious in view of Bertoson, from a similar field of endeavor: “wherein the collection system includes at least one of an arm, an arm support, a crane, an actuator, an end effector, and combinations thereof.” (Bertoson, ¶[0051] discloses that the first support axle 140 can be coupled to one end of a link arm 138a and one end of a link arm 138b. The bucket 130 can be coupled to another end of the link arm 138a and another end of link arm 138b. In another example, the link arms 138a and 138b are pivotally connected to flat plates 148a and 148b, respectively, that protrude from the sides 133a and 133b of the bucket 130, respectively. In one configuration, the first support axle 140 and link arms 138a and 138b are locked and do not pivot about the first pivot axis 141. In another configuration, the first support axle 140 and link arms 138a and 138b can rotate about the first pivot axis 141 to allow the bucket to move to various positions. In another configuration, the first support axle 140 and link arms 138a and 138b can pivot to allow height adjustment of the bucket 130. The connecting assembly can include a second support axle 144 at a second pivot axis 145. The second support axle 144 can be pivotally coupled to one end of a second link arm or rocker arm 142a and one end of a rocker arm 142b. The rocker arms 142a and 142b are pivotally connected to flat plates 148a and 148b, respectively. The second support axle 144 and rocker arms 142a and 142b can rotate about the second pivot axis 145 to allow the bucket to move to various positions. In another example, the rocker arms 142a and 142b extend beyond the pivot axle 144 and are connected by a crossbar 146. The crossbar 146 can be configured as a handlebar for a user to manually move the bucket 130 from one position to another position.
In another example, the connecting assembly of the clam scoop 106 can be mounted to the frame 102 with one or more hydraulic arms. In another example, at least one of pivot axles 140 and 144 is coupled to at least one motor that turns the pivot axle or axles and moves the bucket 130 from one position to another position. In one example, the clam scoop 106 can be mounted to the front portion of the frame 102 with a pair of fixed arms and a pair of hydraulic rotators pivotally connecting the pair of fixed arms and the clam scoop 106.)
Before the effective filing date of the claimed invention, it would have been obvious to a person of ordinary skill in the art to combine Bertoson's clam transferring and sorting technique with the fish product identification and management technique of Muramatsu, to provide the known and expected uses and benefits of Bertoson's technique. The proposed combination would have constituted a mere arrangement of old elements, with each performing its known function, and the combination would have yielded no more than one would expect from such an arrangement.
Therefore, it would have been obvious to a person of ordinary skill in the art to incorporate Bertoson into Muramatsu in order to increase the quantity and speed of sorting and harvesting clams. (Refer to Bertoson, ¶[0030].)
Claim(s) 28 is/are rejected under 35 U.S.C. 103 as being unpatentable over Muramatsu et al. (JP2019135986A), further in view of Loeffelman (US 4955005 A).
As per claim 28, in view of claim 27, “wherein the grading/sorting system includes an isolator element configured to dampen vibrations generated by the grading/sorting system during operation.” (Loeffelman, Col. 6, line 25, discloses that cables 11 and straps 12 also dampen vibrations from sources outside the chamber, preventing or materially reducing any such external vibrations from being imparted onto bag 9 and into chamber 15. See further claim 5, which discloses supporting cables and elastomeric isolators to dampen vibration from sources outside the container.)
Before the effective filing date of the claimed invention, it would have been obvious to a person of ordinary skill in the art to combine Loeffelman's vibration isolation technique with the fish product identification and management technique of Muramatsu, to provide the known and expected uses and benefits of Loeffelman's technique. The proposed combination would have constituted a mere arrangement of old elements, with each performing its known function, and the combination would have yielded no more than one would expect from such an arrangement.
Therefore, it would have been obvious to a person of ordinary skill in the art to incorporate Loeffelman into Muramatsu in order to effectively, efficiently, and easily guide animals underwater. (Refer to Loeffelman, Col. 1, lines 10-15.)
Contact
Any inquiry concerning this communication or earlier communications from the examiner should be directed to SHAGHAYEGH AZIMA whose telephone number is (571)272-1459. The examiner can normally be reached Monday-Friday, 9:30-6:30.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Vincent Rudolph can be reached at (571)272-8243. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/SHAGHAYEGH AZIMA/Examiner, Art Unit 2671