Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Objections
Claims 1-7 are objected to because of the following informalities:
Claim 1 recites in part “a random number generator which generates a set of a first random number and a second random number while the second transmitter and receiver receiving the information set”. Please amend to include “are” between “receiver” and “receiving”. Further, claim 1 recites in part “and converts the second random number to the code to be output”. Please remove the bolding from “the code” in the claim.
Claim 3 recites in part “wherein the remote electronic device 3 includes a mobile phone”. Please remove the “3” from the claim.
Claim 7 recites in part “with audio data built-in a database of the storage unit”. Please amend the claim to recite instead “the data storage unit” for consistency.
Appropriate correction is required.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1-7 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.
Step 1: Claims 1-7 recite a system (machine), and therefore fall into a statutory category.
Step 2A – Prong 1 (Is a Judicial Exception Recited?):
Referring to claims 1-7, the claims recite a manner of organizing acquisition information regarding animals used in making a product for sharing with users, which, under its broadest reasonable interpretation, covers concepts falling under the Certain Methods of Organizing Human Activity grouping.
The abstract idea portion of the claims is as follows:
[A real food honesty display system comprising: at least one on-site recording device, a central electronic device, at least one remote electronic device, and a network used for connecting the central electronic device with both the on-site recording device and the remote electronic device in a wireless manner]; [wherein the on-site recording device is disposed in at least one ecological environment and provided with an image-sound capture device used for] capturing a dynamic video of at least one animal, [a positioning device] for generating a geographic information showing a position coordinate of the image-sound capture device, [a first transmitter and receiver for] transmission of a plurality of information sets; [a first processor coupled to the image-sound capture device, the positioning device, and the first transmitter and receiver]; [wherein the first processor is used to activate the image-sound capture device, make the positioning device] generate the geographic information, and combine the dynamic video with the geographic information to form the information set; [wherein the central electronic device includes a second transmitter and receiver for] receiving the information set [from the on-site recording device] and transmitting the information set corresponding to a code [to the remote electronic device]; [a random number generator which] generates a set of a first random number and a second random number while [the second transmitter and receiver] receiving the information set; [a data storage unit for] storing the set of the first random number and the second random number, the information set corresponding to the first random number, and the code corresponding to the second random number; [a second processor coupled to the second transmitter and receiver, the random number generator, and the data storage unit; wherein the second processor] corresponds the first random number to the information set and converts the second random number to the code 
to be output; wherein when the remote electronic device gets access to the code, [the second processor] finds out the second random number corresponding to the code, retrieves the set of the first random number and the second random number containing the second random number corresponding to the code, and gets the corresponding information set according to the first random number in the set of the first random number and the second random number containing the second random number corresponding to the code; then the information set is output [to the remote electronic device]; [wherein the remote electronic device used in combination with the code disposed and displayed on food products or food packaging in a consumer scene includes a display device, a reader for reading the code, a third transmitter and receiver able to be connected with the central electronic device by the network for receiving the information set; a third processor coupled to the display device, the reader, and the third transmitter and receiver]; wherein after [the reader] reading the code, [the third processor] obtains the information set corresponding to the code and displays the information set [on the display device].
Wherein the portions not bracketed recite the abstract idea.
The recited abstract idea above describes a manner of managing personal behavior or relationships or interactions between people (following rules or instructions) but for the recitation of generic computing components. In the present application, the claims recite a manner of organizing acquisition information regarding animals used in making a product for sharing with users. (See paragraphs 1, 5, and 23.)
If a claim limitation, under its broadest reasonable interpretation, covers concepts capable of being performed in managing personal behavior or relationships or interactions between people (including following rules or instructions), it falls under the Certain Methods of Organizing Human Activity grouping of abstract ideas. See MPEP 2106.04.
Step 2A-Prong 2 (Is the Exception Integrated into a Practical Application?):
The examiner views the following as the additional elements:
At least one on-site recording device. (See paragraphs 18-19 and Fig. 1 el. 1)
A central electronic device. (See paragraph 21 and Fig. 1 el. 2)
At least one remote electronic device. (See paragraph 23 and Fig. 1 el. 3)
A network. (See paragraph 17 and Fig. 1 el. 4).
An image sound capture device. (See paragraph 19 and Fig. 2 el. 11)
A positioning device. (See paragraph 19 and Fig. 2 el. 13)
A first transmitter and receiver. (See paragraph 19 and Fig. 2 el. 14)
A first processor. (See paragraph 19 and Fig. 2 el. 15)
A second transmitter and receiver. (See paragraph 22 and Fig. 3 el. 21)
A random number generator. (See paragraph 22 and Fig. 3 el. 22)
A data storage unit. (See paragraph 22 and Fig. 3 el. 23)
A second processor. (See paragraph 22 and Fig. 3 el. 24)
A display device. (See paragraph 25 and Fig. 4 el. 31)
A reader. (See paragraph 24 and Fig. 4 el. 32)
A third transmitter and receiver. (See paragraph 25 and Fig. 4 el. 33)
A third processor. (See paragraph 25 and Fig. 4 el. 34)
These additional elements are recited at a high level of generality such that they act to merely “apply” the abstract idea using generic computing components and do not integrate the abstract idea into a practical application. (See MPEP 2106.05(f))
Referring to “network used for connecting the central electronic device with both the on-site recording device and the remote electronic device in a wireless manner”, “wherein the first processor is used to activate the image-sound capture device, make the positioning device”, and “wherein the remote electronic device used in combination with the code disposed and displayed on food products or food packaging in a consumer scene includes a display device, a reader for reading the code, a third transmitter and receiver able to be connected with the central electronic device by the network for receiving the information set”, the examiner views these limitations as a results-oriented solution lacking details and therefore equivalent to merely applying the abstract idea. (See id. and paragraphs 17, 20, and 23 of the Specification.)
The combination of these additional elements and/or results-oriented steps is no more than mere instructions to apply the exception using generic computing components. (See MPEP 2106.05(f)) Accordingly, even in combination these additional elements do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea. Therefore, the claim is directed to an abstract idea.
Step 2B (Does the claim recite additional elements that amount to Significantly More than the Judicial Exception?):
As noted above, the claims as a whole merely describe a system that generally “applies” the concepts discussed in Prong 1 above. (See MPEP 2106.05(f)(II)) In particular, applicant has recited the computing components at a high level of generality such that they amount to no more than mere instructions to apply the exception using generic computer components. As the court stated in TLI Communications LLC v. AV Automotive, LLC, 823 F.3d 607, 613 (Fed. Cir. 2016), merely invoking generic computing components or machinery that perform their functions in their ordinary capacity to facilitate the abstract idea amounts to mere instructions to implement the abstract idea within a computing environment and does not add significantly more to the abstract idea. Accordingly, these additional computer components do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea. Therefore, even when viewed as a whole, nothing in the claim adds significantly more (i.e., an inventive concept) to the abstract idea, and as a result the claim is not patent eligible.
Dependent claim 2 further generally links the abstract idea to a field of use in which to apply the abstract idea by further defining what the central electronic device includes, and does not integrate the abstract idea into a practical application or add significantly more to the abstract idea. (See paragraph 21 and Fig. 1 el. 2). Therefore claim 2 is considered to be patent ineligible.
Dependent claim 3 further generally links the abstract idea to a field of use in which to apply the abstract idea by further defining what the remote electronic device includes, and does not integrate the abstract idea into a practical application or add significantly more to the abstract idea. (See paragraph 23 and Fig. 1 el. 3). Therefore claim 3 is considered to be patent ineligible.
Dependent claim 4 further generally links the abstract idea to a field of use in which to apply the abstract idea by further defining what the network includes, and does not integrate the abstract idea into a practical application or add significantly more to the abstract idea. (See paragraph 17 and Fig. 1 el. 4). Therefore claim 4 is considered to be patent ineligible.
Dependent claim 5 further defines the abstract idea as identified. Additionally, the claim recites the generic second processor (See paragraph 22 and Fig. 3 el. 24) for merely implementing the abstract idea using generic computing components, which does not integrate the abstract idea into a practical application or add significantly more. Therefore claim 5 is considered to be patent ineligible.
Dependent claim 6 further defines the abstract idea as identified. Additionally, the claim recites the generic on-site recording device (See paragraphs 18-19 and Fig. 1 el. 1) and at least one sensor (See paragraph 32 and Fig. 2 el. 12) for merely implementing the abstract idea using generic computing components, which does not integrate the abstract idea into a practical application or add significantly more. Therefore claim 6 is considered to be patent ineligible.
Dependent claim 7 further defines the abstract idea as identified. Additionally, the claim recites the generic second processor (See paragraph 22 and Fig. 3 el. 24), a vibration circuit (See paragraph 32), a database (See paragraph 33 and Fig. 3 el. 231), and a storage unit (See paragraph 22 and Fig. 3 el. 23) for merely implementing the abstract idea using generic computing components, which does not integrate the abstract idea into a practical application or add significantly more. Therefore claim 7 is considered to be patent ineligible.
In conclusion, the claims do not provide an inventive concept, because the claims do not recite additional elements or a combination of elements that amount to significantly more than the judicial exception of the claims. There is no indication that the combination of elements improves the functioning of a computer or improves any other technology, and the collective functions merely provide conventional computer implementation. Therefore, whether taken individually or as an ordered combination, the claims are nonetheless rejected under 35 U.S.C. 101 as being directed to non-statutory subject matter.
No Prior Art Applied
The prior art of record fails to explicitly disclose or teach at least the following limitations of the independent claims:
a random number generator which generates a set of a first random number and a second random number while the second transmitter and receiver receiving the information set; a data storage unit for storing the set of the first random number and the second random number, the information set corresponding to the first random number, and the code corresponding to the second random number;
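For illustration only, the two-random-number indirection recited in this limitation may be modeled as follows. This is a hypothetical sketch prepared to clarify the claimed data flow; the class and attribute names are the examiner's own, and the code does not represent the applicant's disclosed implementation or any prior art of record.

```python
# Hypothetical model of the claimed limitation: on receipt of an
# information set, generate a set of (first, second) random numbers;
# store the information set against the first and a code against the
# second; later resolve code -> second -> stored set -> first -> info.
import secrets

class CentralDevice:
    def __init__(self):
        self.pairs = []            # stored sets of (first_rn, second_rn)
        self.info_by_first = {}    # information set keyed by first random number
        self.code_by_second = {}   # output code keyed by second random number

    def receive(self, information_set):
        # Generate the random-number set while receiving the information set.
        first, second = secrets.randbits(64), secrets.randbits(64)
        self.pairs.append((first, second))
        self.info_by_first[first] = information_set
        # "Converts the second random number to the code to be output"
        # (here, simply its zero-padded hex form as a stand-in conversion).
        code = format(second, "016x")
        self.code_by_second[second] = code
        return code

    def lookup(self, code):
        # Find the second random number corresponding to the code, retrieve
        # the stored set containing it, then get the information set via the
        # first random number of that set.
        second = next(s for s, c in self.code_by_second.items() if c == code)
        first, _ = next(p for p in self.pairs if p[1] == second)
        return self.info_by_first[first]
```

Under this model, the code printed on the food packaging never directly indexes the information set; retrieval always passes through the stored random-number pair.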
Spoor (US 20200184486) -directed to tracking animal protein for consumption. Spoor paragraph 6 teaching A method and apparatus for tracking an edible consumable includes maintaining a database of parameters related to edible consumables (e.g. animal protein obtained from animals). The parameters related to the edible consumables can include animal husbandry parameters, processing parameters, and transportation parameters. The edible consumables can be from a plurality of sources. Animals can be processed by a plurality of processors to produce edible consumables. A unique identifier associated with a particular edible consumable (e.g., a cut of meat) is received from a user device. The user device generates the unique identifier based on a quick response (QR) code that is scanned by the user device. Parameters associated with the particular edible consumable are retrieved from a database of edible consumables. The retrieved parameters are transmitted in response to receiving the unique identifier. The animal husbandry parameters can include general farm parameters, general herd parameters, general flock parameters, sourcing parameters, animal specific husbandry practices, feed parameters, healthcare parameters, animal management parameters, water management parameters, environment parameters, and finishing parameters. In one embodiment, a location of a device that transmitted the unique identifier is determined and can be used to determine a distance from a one of the plurality of sources associated with the particular edible consumer. An identifier of the distance can be transmitted in response to receiving the unique identifier. Spoor paragraph 35 teaching FIG. 1 depicts parameters related to raising animal 10. Animal 10, in one embodiment, is a farm animal, such as a cow, pig, goat, lamb, turkey, chicken, etc. The parameters related to raising animal 10 include farm 12 which identifies where the animal was raised. 
General herd information 14 (also referred to as general herd parameters) specifies certain cyclical practices of farm 12 at which animal 10 was bred. In one embodiment, general herd information 14 can also pertain to general flock information (also referred to as general flock parameters). Sourcing practices 16 (also referred to as sourcing parameters) can include an identification of the origination of certain animals in a herd or flock, or the entirety of a herd or flock. Husbandry practices 18 (also referred to as animal specific husbandry parameters) indicate what practices on a farm govern mating, pregnancy, birth and care of offspring. In one embodiment, husbandry practices pertain to a particular animal and are referred to as animal specific husbandry practices Diet and feeding practices 20 (also referred to as feed parameters) identify the inputs (e.g., foods) that are administered to the animals by mouth and, in some embodiments, the regularity with which they receive such food as may be distributed. Healthcare practices 22 (also referred to as healthcare parameters) specify the substances that are administered to animals either to prevent illness and infection or as a result of illness and infection. Water practices 24 (also referred to as water management parameters) specify the sourcing, distribution and availability of water to the herd or flock. Environment 26 (also referred to as environment parameters) indicates the size, type, coverage and composition of the living area of the animals raised by the farm. Finishing (also referred to as finishing parameters) and processing 28 identifies the procedures that may be in place to prepare the animal for eventual processing by making significant changes to diet, feeding, exercise or other inputs. Distance from consumer to farm 30 identifies the distance from a consumer to the location of farm 12 where the animal protein was raised. 
In one embodiment, distance from consumer to farm 30 is used to determine if the animal protein intended for consumption deserves to be referred to as local insofar as the farm of origin is operating within geographical parameters that any reasonable person would consider to be proximate. In one embodiment, the steps from point-of-origin to the point-of-service are inclusive of pick-up at the farm, drop off and pick up from the processing facility, and delivery to either the consumer or the grocery store, delicatessen, restaurant or other location of the end user of the animal protein intended for consumption. Spoor paragraph 44 teaching a farm 302 is shown having animals. At step 304, a person associated with a farm completes farm questionnaire 400 and animal questionnaire 500 which are transmitted to administrator database 504. At step 306, a server on which administrator database 504 is located compares data from farm questionnaire 400 and animal questionnaire 500 to standards 600 which are stored in administrator database 504. In one embodiment, this comparison is performed using an algorithm. Alternatively, the comparison can also be performed by a management team or a combination of an algorithm and a management team. In one embodiment, the management team comprises individuals associated with maintaining server and/or administrator database 504 and are referred to as the management. If the information provided in farm questionnaire 400 and animal questionnaire 500 satisfies standards 600, that information is transmitted to and stored in administrator database 504. 
At step 308, a unique QR code 650 is assigned to the individual animal protein intended for consumption based on the answers to farm questionnaire 400 and animal questionnaire 500, combined with pricing information, portion information, the distance from the farm to the consumer, and every step in the supply chain, in a standardized, easy-to-read format, which is then listed on a website and made accessible to consumers. In one embodiment, the website is controlled by management. Customers in locations serviced by management, for instance, local restaurants, grocery stores and delicatessens, can scan the QR code 650 as shown at step 310 and review multiple data points regarding the animal protein intended for consumption as shown at step 312, and purchase as shown at step 314. Spoor paragraph 103-105 teaching FIG. 16 illustrates how a user, such as a consumer, can use a device, such as a smart phone, to obtain information concerning an animal protein intended for consumption. User device 1602 scans QR code 650 that, in one embodiment, is a unique identifier associated with an animal protein intended for consumption. QR code 650 contains data that user device 1602 can use to retrieve information from administrator database 504 (shown in FIG. 3) in which data pertaining to animal protein intended for consumption is stored. For example, QR code 650 can be located on a menu in a restaurant and scanned by user device 1602. QR code 650 identifies information pertaining to an animal protein that can be retrieved from the administrator database 504 in order to be displayed to the user. QR code 650, in one embodiment is generated by a server on which administrator database 504 is stored. QR code 650, in one embodiment, is generated in response to a request from the server. Generated QR codes can be (i) displayed on the digital platform, (ii) sent to grocery stores, delicatessens or restaurants, (iii) viewed by shoppers on their personal devices 1602.
Display 1604 of user device 1602 depicts information displayed to a user in response to the user scanning QR code 650 using a scanning device (e.g., a camera) of user device 1602 according to an embodiment. Name 1606 identifies a name of an entity (e.g., a farm) that provided the animal protein intended for consumption. Image 1610 is an image selected for display associated with the farm identified by name 1606. In one embodiment, various text shown on display 1604 of user device 1602 can be selected (e.g., highlighted or hovered over using a mouse and clicked on or selected by touch using a touch screen) in order to display detailed information about the farm identified by name 1606. Location 30 identifies the user's current distance from the farm which can be determined as described in detail below. Complete supply chain information 1612 is text that can be selected to display information concerning a supply chain as described in further detail below in connection with FIG. 18. Farm information 12 is text shown on display 1604 of user device 1602 that can be selected in order to display information pertaining to the farm identified by name 1606. General information 14 is text shown on display 1604 of user device 1602 that can be selected in order to display information pertaining to the farm identified by name 1606. Sourcing practices 16 is text shown on display 1604 of user device 1602 that can be selected in order to display information pertaining to the sourcing practices of the farm identified by name 1606. Husbandry practices 18 is text shown on display 1604 of user device 1602 that can be selected in order to display information pertaining to the husbandry practices of the farm identified by name 1606. Diet and feeding practices 20 is text shown on display 1604 of user device 1602 that can be selected in order to display information pertaining to the diet and feeding practices of the farm identified by name 1606. 
Health care practices 22 is text shown on display 1604 of user device 1602 that can be selected in order to display information pertaining to health care of animals on the farm identified by name 1606. Water practices 24 is text shown on display 1604 of user device 1602 that can be selected in order to display information pertaining to the watering practices of the farm identified by name 1606. Environment 26 is text shown on display 1604 of user device 1602 that can be selected in order to display information pertaining to the environment of the farm identified by name 1606. Finishing Practices 28 is text shown on display 1604 of user device 1602 that can be selected in order to display information pertaining to the finishing of animals on the farm identified by name 1606.
Mindel et al. (US 20210153479) -directed to monitoring livestock in an agricultural pen. Mindel paragraphs 54-56 teaching in some embodiments, non-transient computer-readable storage device 114 (which may include one or more computer readable storage mediums) is used for storing, retrieving, comparing, and/or annotating captured image frames. Image frames may be stored on storage device 114 based on one or more attributes, or tags, such as a time stamp, a user-entered label, or the result of an applied image processing method indicating the association of the frames, to name a few. The software instructions and/or components operating hardware processor 110 may include instructions for receiving and analyzing multiple image frames captured by imaging device 118. For example, hardware processor 110 may comprise image processing module 110 a, which receives one or more images and/or image streams from imaging device 118 and applies one or more image processing algorithms thereto. In some embodiments, image processing module 110 a and/or machine learning module 110 b comprise one or more algorithms configured to perform object detection, segmentation, recognition, identification, and/or classification in images captured by imaging device 118, using any suitable image processing technique. The image streams received by the image processing module 110 a may vary in resolution, frame rate (e.g., between 15 and 35 frames per second), format, and protocol according to the characteristics and purpose of their respective source device. Depending on the embodiment, the image processing module 110 a can route image streams through various processing functions, or to an output circuit that sends the processed image stream for presentation, e.g., on a display 116 a, to a recording system, across a network, or to another logical destination. 
In image processing module 110 a, the image stream processing algorithm may improve the visibility and reduce or eliminate distortion, glare, or other undesirable effects in the image stream provided by an imaging device. An image stream processing algorithm may reduce or remove fog, smoke, contaminants, or other obscurities present in the image stream. The image stream processing module 110 a may apply image stream processing algorithms alone or in combination. In some embodiments, system 100 comprises a communications module (or a set of instructions), a contact/motion module (or a set of instructions), a graphics module (or a set of instructions), a text input module (or a set of instructions), a Global Positioning System (GPS) module (or a set of instructions), voice recognition and/or and voice replication module (or a set of instructions), and one or more applications (or sets of instructions). Mindel paragraph 107 teaching assessment module 230 may assess the health physical state of an animal from its behavior and locations over time as captured by the images. For example, if an animal does not change its posture and/or location in a predetermined number of consecutive images, assessment module 230 may assess that the animal is sick or dead. If an animal is not detected near the feeding or the water equipment in the pen, in a predetermined number of consecutive images, assessment module 230 may assess that the animal is not in good health.
Biffert et al. (US 20220192151) -directed to livestock management system. Biffert paragraphs 95-96 teaching the microphone 66 and the camera 72 provide information about sounds and still and/or moving video in the environment around and external to the tag 20 and in some cases the livestock 12 to which the tag 20 is attached. The microphone 66 and camera 72 each may be accessed by the processor 50 in response to either a determination made locally by the processor 50 or a command or instruction received from the management system platform 140. The processor 50 may access each of the microphone 66 and the camera 72 on a periodic schedule or on an on-demand basis. In addition, each or either of the microphone 66 and the camera 72 may include separate control circuitry that enables the microphone 66 and/or the camera 72 to automatically respond to a sound and/or video stimulus independent of the processor 50 and to communicate data regarding the sound and/or video stimulus to the processor 50. Alternatively, the microphone 66 and the camera 72 may share common control circuitry for that purpose. For example, the microphone 66 may be controlled to automatically respond to a sound having loudness above a threshold level or to a sound of a particular type, such as a particular type of sound occurring in the external environment or made by the livestock 12 to which the tag 20 is attached. Similarly, the camera 72 may be controlled to automatically respond to certain detected shapes or movements in the external environment. The audio and video data provided by the microphone 66 and/or the camera 72 are helpful in determining the occurrence of certain events external to the livestock 12 such as a nearby gunshot, or the approach of a predator or vehicle. They are also helpful in determining certain activities, behaviors and health-related and other physical conditions of the livestock 12. 
For example, detection of repeated or continuous bawling or mooing sounds by the microphone 66 may indicate the livestock 12 is ill or injured, has become separated from a calf, or that a predator is nearby. The microphone can also be used to detect coughing or other sounds, to aid in determining if a livestock 12 is ill. Similarly, detection of certain video may indicate the livestock 12 is down, ill or injured, trapped, etc. or that the livestock 12 is involved in certain behavior such as mating.
Siedenberg (US 20200111106) -directed to authenticating products. Siedenberg paragraphs 6-8 teaching the embodiments of the present disclosure can improve traceability technology by adding multi-factor authentication, creating layers of security to the supply chain, guaranteeing the integrity of every product such as a fish from harvest to the point of sale. After a fish is harvested, it can be tagged with a unique code and photographed with a timestamp. This data can be transmitted to a server, where it can be joined with a host of other relevant data specific to the fish, such as feed formula and harvest time, as well as consumer friendly data about the species, including popular recipes and nutritional information. As the product moves through its supply chain, its data can be verified and updated at each stop along the way creating a history of the product, from the processor, transporter, wholesaler, or consumer, each update can be sent and saved to the server, ensuring the integrity of the product, as well as its handlers. A quick reading or scan with an electronic device, or a mobile software application on the electronic device, allows customers to validate the source of the product being scanned. The exemplary embodiments provide seafood customers with the information necessary to make informed decisions about purchases, and create a transparent supply chain including authentication of a product. 
For example, according to some exemplary embodiments of the present disclosure, a method of authentication of a food product, the method comprising tagging a food product with a unique code, recording vital information associated with the food product, the vital information including one or more of an image of the food product, a date and time the food product was tagged, a weight of the food product, a location the food product was obtained and a type of the food product, uploading the vital information to a computer server, and associating the vital information with the unique code so that the vital information is specific to each unique code. Siedenberg paragraph 23 teaching a code 130 (e.g., “RT89JVM2N8”) can be provided that is specific to the actual produce item labeled, which once scanned by a consumer or other party, can provide information that will be more fully described below. The code 130 can be unique to the produce, and can be an alphanumeric code including a combination of letters (upper or lower case) and numbers, which can include symbols as well as other special characters. The code 130 according to the exemplary embodiments of the present disclosure is not limited to an alphanumeric code and can be any code for a type of tag, such as a barcode, Quick Response (QR) code, matrix code, readable code image, radio-frequency identification code for an RFID tag, near field communication code for NFC tags, or any type of code(s) and tag that can be read by, e.g., an electronic device. The code 130 can be provided with precision serial printing so easily readable by an electronic device. Siedenberg paragraph 26 teaching FIG. 2 illustrates a system by which produce can be authenticated according to the exemplary embodiments of the present disclosure. 
Once a fish 210 or other produce is harvested, a tag 100 containing a code 130 as described above can be applied to the produce 210 at the point of harvest (e.g., when a fish is caught or brought onto or off of a boat). In some exemplary embodiments, only a code 130 can be provided initially to the fish 210, and a tag 100 can be applied later containing the remaining information described above. A photo of the fish 210 can be taken using a camera 220 (e.g., a digital camera, or a digital camera provided on glasses 225 or a helmet), and other information can also be recorded and uploaded to a computer server 230, such as but not limited to the photo, time, date, code, birth location, location of harvest (e.g., fish farm or ocean), type of fish and weight. All this information can be stored on the computer server 230 in a record related to the code 130, which can be unique to the particular fish 210.
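The code-keyed record flow Siedenberg describes (tag at harvest, upload vital information, update at each supply-chain stop, consumer scan) can be sketched as follows. This is an illustrative reading only; every function name and record field here is hypothetical rather than taken from the reference:

```python
import datetime

# In-memory stand-in for the computer server 230; a real system would
# persist records in a database keyed by the unique code 130.
server_records = {}

def tag_and_record(code, photo_ref, species, weight_kg, harvest_location):
    """Create a server record for a harvested fish, keyed by its unique code."""
    if code in server_records:
        raise ValueError("code already assigned to another product")
    server_records[code] = {
        "photo": photo_ref,
        "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "species": species,
        "weight_kg": weight_kg,
        "harvest_location": harvest_location,
        "history": [],  # supply-chain updates appended at each stop
    }

def add_supply_chain_update(code, handler, note):
    """Append a handler update, building the product's history."""
    server_records[code]["history"].append({"handler": handler, "note": note})

def scan(code):
    """Consumer-side lookup: return the record for a scanned code, if any."""
    return server_records.get(code)

tag_and_record("RT89JVM2N8", "img_0042.jpg", "Atlantic salmon", 4.2, "fish farm A")
add_supply_chain_update("RT89JVM2N8", "processor", "filleted and packed")
record = scan("RT89JVM2N8")
```

A scan of an unknown code simply returns no record, which corresponds to the consumer being unable to validate the product's source.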
Kim et al. (WO2018012676A1) -directed to counterfeit detection. Kim page 1 line 30 to page 2 line 8 teaching a method comprising: a first random key generation step in which the activation server generates a first random key having a predetermined number of digits, the first random key being key data randomly generated without a specific generation rule being set; confirming whether the activation server has previously issued the first random key and assigned it an activation code; an activation code generation step in which the activation server generates an activation code using the first random key if the first random key corresponds to a newly generated key; if the first random key corresponds to a key already generated, re-executing the first random key generation step; confirming whether the extraction key extracted from the comparison authentication code acquired by the purchaser client matches the first random key assigned to the activation code; and a first counterfeit determination step in which, when the extraction key is a key not assigned to an activation code, the activation server indicates that the article to which the comparison authentication code is attached is a counterfeit product. 
In the activation code generation step, the activation server obtains a product classification code corresponding to the product to which the first random key is to be assigned and combines the first random key with the product classification code. The counterfeit product determination step includes, when the extracted key matches the specific first random key given to the product authentication code, providing product information to the purchaser client, wherein the product information may be identification data for a product corresponding to the goods classification code combined with the first random key. Kim page 2 lines 17-26 teaching in addition, the activation server may generate a second random key that is different from the first random key, and a sale code generation step in which the activation server generates the second random key as a sale code corresponding to a sale product composed of one or more articles; when a request is received to confirm whether the product bearing the sale code is genuine, the genuine status is indicated without limitation on the number of times. In addition, upon receipt from the purchaser client of the sale code given to the merchandise and the one or more activation codes included in the merchandise, the activation server receives the first extraction random key included in each activation code and the second extraction random key contained in the sale code, and indicates that a matching relationship is not established between the sale item and the article when the first extraction random key and the second extraction random key do not match each other. 
Kim page 7 line 45 to page 8 line 18 teaching the activation server generates a second random key that is distinguished from the first random key (S800). The sale code itself is also generated from a random key to prevent it from being arbitrarily made by a counterfeit manufacturer. The method of generating the second random key is the same as the method of generating the first random key used for generating the activation code. The activation server generates the second random key so as to be distinguished from the first random key. In one embodiment of generating the first random key and the second random key so as to be distinguished, the activation server restricts generation of a second random key identical to any key already generated as a first random key. Further, in another embodiment, the activation server combines the second random key with a classification code corresponding to the sale item of a specific commodity. Thereafter, the activation server further performs a step (S900) of generating, from the second random key, a sale code corresponding to a sale item composed of one or more commodities. In addition, the activation server can additionally perform encryption and obfuscation in the process of generating the sale code from the second random key. The sale code may be formed as an identification mark (i.e., a sale identification mark) such as a QR code. The sale identification label is affixed to the sale item, not the product itself. In one embodiment, if the sale item is composed of one article packaged in a case, the seller may attach the sale code to the case. Further, in another embodiment, when a plurality of articles are bundled and sold as a single sale item, the seller attaches a sale code to the corresponding sale item unit. 
Further, in another embodiment, the activation server forms and stores a connection relationship between the sale code (or the second random key) and the at least one activation code (or the first random key). If a counterfeiter obtains only the genuine case bearing the sale code and distributes a counterfeit in the original case, the buyer can mistake the goods contained in the case for genuine articles when confirming through the sale code. To prevent this, in the process of manufacturing a sale item (i.e., the process of packaging one or more articles), the activation server matches the sale code of the sale item with the activation codes of the articles contained in it.
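Kim's two-key scheme (a first random key behind each activation code, a distinct second random key behind each sale code, and a counterfeit flag on any mismatch) can be sketched as follows. The function names, key length, and alphabet are illustrative assumptions, not Kim's implementation:

```python
import secrets
import string

issued_first_keys = set()    # first random keys already assigned activation codes
issued_second_keys = set()   # second random keys used as sale codes
sale_links = {}              # sale code -> set of activation codes it packages

ALPHABET = string.ascii_uppercase + string.digits

def _random_key(n=10):
    """Key data generated with no fixed rule (cryptographically random)."""
    return "".join(secrets.choice(ALPHABET) for _ in range(n))

def generate_activation_code(product_class):
    """Generate a fresh first random key, re-drawing on collision, and
    combine it with the product classification code."""
    while True:
        first_key = _random_key()
        if first_key not in issued_first_keys:  # newly generated key?
            issued_first_keys.add(first_key)
            return product_class + "-" + first_key

def generate_sale_code(activation_codes):
    """Generate a second random key distinct from every first random key,
    and link the resulting sale code to the packaged activation codes."""
    while True:
        second_key = _random_key()
        if second_key not in issued_first_keys and second_key not in issued_second_keys:
            issued_second_keys.add(second_key)
            sale_links[second_key] = set(activation_codes)
            return second_key

def check_article(activation_code, sale_code):
    """Counterfeit checks: an unknown extraction key, or an activation code
    not linked to the scanned sale code, flags a counterfeit."""
    extraction_key = activation_code.split("-", 1)[1]
    if extraction_key not in issued_first_keys:
        return "counterfeit: unknown activation key"
    if activation_code not in sale_links.get(sale_code, set()):
        return "counterfeit: article does not match sale package"
    return "genuine"
```

In use, an article packaged in a genuine case only validates when its activation code was linked to that case's sale code at packaging time, which is the repackaging attack the reference aims to defeat.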
Kwak (US 20130018761) – directed to consumer-level food source information tracking and management. Kwak paragraph 7 teaching in one embodiment of the invention, a consumer-level food source information tracking and management system is disclosed. This consumer-level food source information tracking and management system comprises: static and dynamic food source information directly or indirectly provided by a producer to the consumer-level food source information tracking and management server, wherein the dynamic food source information includes at least one of a real-time webcam view, GPS location info, and livestock RFID tag info of the producer's onsite asset; the consumer-level food source information tracking and management server with a CPU and a memory unit executing one or more programs to process and store the static and dynamic food source information periodically or continuously from the producer; a food package attached with a consumer-level food source information label containing a unique identifying code, wherein the unique identifying code is associated with the static and dynamic food source information separately stored in the consumer-level food source information tracking and management server; a consumer's user interface device configured to retrieve the static and dynamic food source information stored in the consumer-level food source information tracking and management server by entering, scanning, or using the unique identifying code in the consumer's user interface; and one or more data network transceivers operatively connecting the producer, the consumer-level food source information tracking and management server, and the consumer's user interface device for wireless and wired data communication. 
Kwak paragraph 25 teaching in addition, for the purpose of describing the invention, a term “consumer-level food source information label” is defined as a printed label, an RFID product label tag, or another device containing a unique identifying code, wherein the unique identifying code is used to access producer-specific, package-specific, and/or other consumer-level food source information associated with a particular food package. The unique identifying code may be a bar code, an alphanumeric code, QR code, or another piece of identifying information. In a preferred embodiment of the invention, the consumer-level food source information label is attached to the particular food package, and a consumer is able to retrieve producer-specific, package-specific, and/or other consumer-level food source information if the unique identifying code is entered into a user interface device operatively connected to a consumer-level food source information tracking and management server. If a bar code or a QR code is present in the unique identifying code, it may be desirable to use a bar code scanner or an image scanner to enter the scanned barcode, the scanned QR code, or the scanned image information directly into a consumer's user interface device. The bar code scanner or the image scanner may be an integrated application to the consumer's user interface device, or a standalone unit. Similarly, alphanumeric or numeric codes may be scanned and recognized by an image scanner and an optical character recognition (OCR) program internal or external to the user interface device. Kwak paragraph 38 teaching in a preferred embodiment of the invention, an in-house label-generation entity (103) within the producer (101), or an external label-generation entity (103) within the distribution channel (107) receives information (i.e. 115) from the producer (101) to associate a particular food package with a unique identifying code on a consumer-level food source information label (e.g. 203 of FIG. 2).
The unique identifying code is typically a unique alphanumeric/numeric code, a bar code, and/or a QR code. If a bar code or a QR code is present in the unique identifying code, it may be desirable to use a bar code scanner or an image scanner to enter the scanned barcode information or the scanned QR code information directly into a consumer's user interface device. The bar code scanner or the image scanner may be an integrated application to the consumer's user interface device, or a standalone unit. Similarly, alphanumeric or numeric codes may be scanned and recognized by an image scanner and an optical character recognition (OCR) program internal or external to the user interface device. Kwak paragraph 44 teaching in some embodiments of the invention, at least some portions of the producer-specific information provided by the producer (101) may be dynamic and include a real-time webcam view, GPS location info, livestock RFID tag info, and/or other information of a producer's onsite assets. Examples of a producer's onsite assets include, but are not limited to, livestock, fishery, plants, trees, farmland, farm buildings, and/or other properties associated with the producer, which may enhance a retail consumer's appreciation and understanding of the producer. Sharing such dynamic and visual information of a producer (101) with a retail consumer (111) may be particularly helpful for building a positive brand image, a high reputation, and a resilient consumer loyalty for a premium-grade, organic, and/or local producer. Kwak paragraph 50 teaching in general, both producer-specific information and package-specific information are static and/or dynamic food source information provided by a producer. A portion of the food source information which does not change dynamically over a short period of time is considered “static” food source information. 
On the other hand, a portion of the food source information which typically receives frequent or periodic updates over the course of processing, distribution, and sale of a food package is considered “dynamic” food source information. Examples of the static food source information may include, but are not limited to, name of the producer, current location of the producer, and weight of the food package. Examples of the dynamic food source information may include, but are not limited to, real-time webcam view, GPS location info, livestock RFID tag info, and other information of the producer's dynamically-moving or changing onsite assets (e.g. livestock, fishery, and etc.).
While the prior art teaches concepts related to monitoring animals using image capture devices, including information regarding the location (Spoor paragraphs 6 and 35, Mindel paragraphs 54-56 and 107, Biffert paragraphs 95-96, Siedenberg paragraphs 6-8, and Kwak paragraphs 7 and 44), sharing the content with a consumer through use of a code (Spoor paragraphs 44 and 103-105, Siedenberg paragraphs 23 and 26, and Kwak paragraphs 7, 25, and 38), and the use of first and second random numbers in conjunction for validating a product (Kim page 1 line 30 to page 2 line 8, Kim page 2 lines 17-26, and Kim page 7 line 45 to page 8 line 18), the prior art fails to disclose or teach the particular manner of generating a set of a first random number and a second random number while the second transmitter and receiver are receiving the information set of the dynamic video and positioning information, where one of the random numbers is associated with the information set and the other random number is associated with the code of the product, as claimed by Applicant.
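The claimed arrangement as characterized above (a pair of random numbers generated on receipt of the information set, the first associated with the information set and the second converted into the product code) can be sketched as follows. This is an illustrative reading of the claim language only, not Applicant's implementation, and all identifiers and the particular number-to-code conversion are hypothetical:

```python
import secrets

# Stand-in for the claimed data storage unit: the (first, second) pairs,
# the information set keyed by the first random number, and the code
# keyed by the second random number.
data_storage = {"pairs": [], "info_by_r1": {}, "code_by_r2": {}}

def on_receive_information_set(information_set):
    """While the information set (dynamic video plus geographic
    information) is received, generate a set of a first and a second
    random number; associate the first with the information set and
    convert the second into the product code."""
    r1 = secrets.randbits(32)
    r2 = secrets.randbits(32)
    data_storage["pairs"].append((r1, r2))
    data_storage["info_by_r1"][r1] = information_set
    code = format(r2, "08X")  # one illustrative conversion of r2 to a code
    data_storage["code_by_r2"][r2] = code
    return code

def lookup_by_code(code):
    """Remote-device side: resolve a code back to its information set via
    the stored (first, second) random-number pair."""
    for r1, r2 in data_storage["pairs"]:
        if data_storage["code_by_r2"][r2] == code:
            return data_storage["info_by_r1"][r1]
    return None
```

The point of distinction is that neither random number alone identifies the product: the code is derived from the second number, while the recorded video and position are reachable only through the paired first number.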
Accordingly, no prior art rejection is applied to claims 1-7; however, the examiner notes the outstanding rejection under 35 U.S.C. § 101 and the claim objections for claims 1-7. Therefore, the claims are not indicated as allowable at this time.
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
McGlone et al. (US 20150289478) -directed to livestock identification and monitoring.
Hicks et al. (US 20190335715) -directed to tracking, authenticating, and evaluating a food supply.
Sarzen et al. (US 20210289755) -directed to using sound data to analyze health condition and welfare states in collections of farm animals.
Molloy et al. (US 20220159934) – directed to animal health and safety monitoring.
Amat Roldan (US 20230096439) -directed to traceability of living specimens.
Bourke-Borrowes et al. (US 20240096501) -directed to determination of the welfare of an animal population.
Spears et al. (US 20220104463) -directed to counting livestock.
Nagano et al. (US 20200250712) -directed to animal habitat value evaluation.
Deliou (US 20200367471) -directed to tracking a plurality of animals with a portable computing device.
Li et al. (US 20210271885) -directed to video analytics to detect animal abuse.
Beckham et al. (US 20180218057) – directed to monitoring and analyzing animal related data.
Canning et al. (US 20230260327) -directed to autonomous livestock monitoring.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to MICHAEL J MONAGHAN whose telephone number is (571)270-5523. The examiner can normally be reached on Monday-Friday, 8:30 am - 5:30 pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at
http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Sarah Monfeldt, can be reached on (571) 270-1833. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/Michael J. Monaghan/Examiner, Art Unit 3629