DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Drawings
The drawings are objected to as failing to comply with 37 CFR 1.84(p)(5) because they do not include the following reference sign(s) mentioned in the description: “sensor 112” in paragraph [00015]. Additionally, in paragraph [00049], a reference is made to figure 2 with the elements “requesting user 1 405…user 1 content 415…user 2 content 435…requesting user 2 425…user 2 content 435… an administrator 465…permission level 400…users 105.” However, figure 2 does not have corresponding labels for the elements.
Corrected drawing sheets in compliance with 37 CFR 1.121(d) are required in reply to the Office action to avoid abandonment of the application. Any amended replacement drawing sheet should include all of the figures appearing on the immediate prior version of the sheet, even if only one figure is being amended. Each drawing sheet submitted after the filing date of an application must be labeled in the top margin as either "Replacement Sheet" or "New Sheet" pursuant to 37 CFR 1.121(d). If the changes are not accepted by the examiner, the applicant will be notified and informed of any required corrective action in the next Office action. The objection to the drawings will not be held in abeyance.
Specification Objections
The specification is objected to because of the following informalities:
In paragraph [00018], line 14, “fingerprint s136aved” should read “fingerprint 136 saved”.
In paragraph [00039], line 11, “which a. digital code” should read “which a digital code”.
Appropriate correction is required.
Claim Objections
Claims 1, 4, 13, and 20 are objected to because of the following informalities:
In claim 1, line 9, the term “pattern form said” should be changed to “pattern from said” to correct a typographical error.
In claim 1, line 2, the term “emit a pattern,” should be changed to “emit a pattern;” to correct a typographical error.
In claim 1, line 3, the term “containing said pattern,” should be changed to “containing said pattern;” to correct a typographical error.
In claim 1, line 5, the term “manipulation of said camera,” should be changed to “manipulation of said camera;” to correct a typographical error.
In claim 1, line 7, the term “device, and” should be changed to “device; and” to correct a typographical error.
In claim 1, line 8, the term “and said camera,” should be changed to “and said camera;” to correct a typographical error.
In claim 1, line 9, the term “said image data,” should be changed to “said image data;” to correct a typographical error.
In claim 13, line 3, the term “wildlife,” should be changed to “wildlife;” to correct a typographical error.
In claim 13, line 4, the term “emit a pattern,” should be changed to “emit a pattern;” to correct a typographical error.
In claim 13, line 5, the term “said camera,” should be changed to “said camera;” to correct a typographical error.
In claim 13, line 7, the term “measuring device, and” should be changed to “measuring device; and” to correct a typographical error.
In claim 13, line 8, the term “said camera,” should be changed to “said camera;” to correct a typographical error.
In claim 13, line 9, the term “said light,” should be changed to “said light;” to correct a typographical error.
In claim 13, line 11, the term “life or wildlife,” should be changed to “life or wildlife;” to correct a typographical error.
In claim 20, line 5, the term “computing device,” should be changed to “computing device;” to correct a typographical error.
In claim 20, line 7, the term “wildlife,” should be changed to “wildlife;” to correct a typographical error.
In claim 20, line 8, the term “computing device,” should be changed to “computing device;” to correct a typographical error.
In claim 20, line 10, the term “data is associated,” should be changed to “data is associated;” to correct a typographical error.
In claim 20, line 11, the term “said image data, and” should be changed to “said image data; and” to correct a typographical error.
In claim 4, line 1; claim 5, line 2; and claim 13, lines 2, 4, and 7, the term “measuring device” should be changed to “measurement device” in order to keep the terminology consistent with the specification (see specification paragraphs [00026] and [00028]) and to prevent 112(a) and 112(b) rejections arising from the 112(f) claim interpretation.
Appropriate correction is required.
Claim Interpretation
The following is a quotation of 35 U.S.C. 112(f):
(f) Element in Claim for a Combination. – An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.
The following is a quotation of pre-AIA 35 U.S.C. 112, sixth paragraph:
An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.
The claims in this application are given their broadest reasonable interpretation using the plain meaning of the claim language in light of the specification as it would be understood by one of ordinary skill in the art. The broadest reasonable interpretation of a claim element (also commonly referred to as a claim limitation) is limited by the description in the specification when 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is invoked.
As explained in MPEP § 2181, subsection I, claim limitations that meet the following three-prong test will be interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph:
(A) the claim limitation uses the term “means” or “step” or a term used as a substitute for “means” that is a generic placeholder (also called a nonce term or a non-structural term having no specific structural meaning) for performing the claimed function;
(B) the term “means” or “step” or the generic placeholder is modified by functional language, typically, but not always linked by the transition word “for” (e.g., “means for”) or another linking word or phrase, such as “configured to” or “so that”; and
(C) the term “means” or “step” or the generic placeholder is not modified by sufficient structure, material, or acts for performing the claimed function.
Use of the word “means” (or “step”) in a claim with functional language creates a rebuttable presumption that the claim limitation is to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites sufficient structure, material, or acts to entirely perform the recited function.
Absence of the word “means” (or “step”) in a claim creates a rebuttable presumption that the claim limitation is not to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is not interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites function without reciting sufficient structure, material or acts to entirely perform the recited function.
Claim limitations in this application that use the word “means” (or “step”) are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action. Conversely, claim limitations in this application that do not use the word “means” (or “step”) are not being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action.
This application includes one or more claim limitations that do not use the word “means” or “step” but are nonetheless being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, because the claim limitation(s) use(s) a generic placeholder coupled with functional language without reciting sufficient structure to perform the recited function.
Claims 4 and 13 recite limitations that use a generic placeholder coupled with functional language and therefore invoke 35 U.S.C. 112(f):
Claim 4 recites the limitation “a measuring device configured to measure…” [line 1].
Claim 13 recites the limitation “a measuring device configured to measure…” [line 2].
Because this/these claim limitation(s) is/are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, it/they is/are being interpreted to cover the corresponding structure described in the specification as performing the claimed function, and equivalents thereof.
After a careful analysis, as disclosed above, and a careful review of the specification, the following limitation in claims 4 and 13 was considered:
“measuring device”
The term “measuring device” does not appear in the specification. The specification instead describes a “measurement device”: paragraph [00026] discloses that the measurement device 107 comprises a platform 205. The platform 205 preferably has markings 225 thereon that extend from a proximal end to a distal end of the platform 205, which may be used to measure a length of wildlife/marine life 102. In a preferred embodiment, the measurement device 107 further comprises a head piece 230 at the proximal end of the platform 205. The markings 225 of the platform 205 preferably begin at the point at which the head piece 230 attaches to the platform 205. In a preferred embodiment, the markings 225 allow a user 105 to measure the length of wildlife/marine life 102. For instance, a user 105 may place one end of the wildlife/marine life 102 against said head piece 230 as a starting point to begin a measurement. In some preferred embodiments, the measurement device 107 may comprise a second head piece 231 at the distal end of said platform 205. At least one of the head pieces 230, 231 may be configured to move about the platform 205 in a way such that its position about the platform 205 may be changed, as illustrated in FIG. 2. Paragraph [00028] discloses that a plurality of light sensors 215 is placed in line about the length of the measurement device 107.
Accordingly, the claimed “measuring device” does not have sufficient structure associated with it, whereas the disclosed “measurement device” does have sufficient structure as stated above (platform, markings, head pieces, and light sensors). Please note the claim objection on page 4.
If applicant does not intend to have this/these limitation(s) interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, applicant may: (1) amend the claim limitation(s) to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph (e.g., by reciting sufficient structure to perform the claimed function); or (2) present a sufficient showing that the claim limitation(s) recite(s) sufficient structure to perform the claimed function so as to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph.
Claim Rejections - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claims 4 and 13, along with their dependent claims, are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for pre-AIA, the applicant) regards as the invention.
Claim 4 recites the limitation “a measuring device configured to measure…” [line 1].
Claim 13 recites the limitation “a measuring device configured to measure…” [line 2].
Claims 4 and 13 each invoke 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. However, the written description fails to disclose the corresponding structure, material, or acts for performing the entire claimed function and to clearly link the structure, material, or acts to the function. The specification is devoid of adequate structure to perform the claimed function and does not provide sufficient detail such that one of ordinary skill in the art would understand which structure performs the claimed function.
Therefore, the claims are indefinite and are rejected under 35 U.S.C. 112(b) or pre-AIA 35 U.S.C. 112, second paragraph.
Applicant may:
(a) Amend the claim so that the claim limitation will no longer be interpreted as a limitation under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph;
(b) Amend the written description of the specification such that it expressly recites what structure, material, or acts perform the entire claimed function, without introducing any new matter (35 U.S.C. 132(a)); or
(c) Amend the written description of the specification such that it clearly links the structure, material, or acts disclosed therein to the function recited in the claim, without introducing any new matter (35 U.S.C. 132(a)).
If applicant is of the opinion that the written description of the specification already implicitly or inherently discloses the corresponding structure, material, or acts and clearly links them to the function so that one of ordinary skill in the art would recognize what structure, material, or acts perform the claimed function, applicant should clarify the record by either:
(a) Amending the written description of the specification such that it expressly recites the corresponding structure, material, or acts for performing the claimed function and clearly links or associates the structure, material, or acts to the claimed function, without introducing any new matter (35 U.S.C. 132(a)); or
(b) Stating on the record what the corresponding structure, material, or acts, which are implicitly or inherently set forth in the written description of the specification, perform the claimed function. For more information, see 37 CFR 1.75(d) and MPEP §§ 608.01(o) and 2181.
The following is a quotation of the first paragraph of 35 U.S.C. 112(a):
(a) IN GENERAL.—The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor or joint inventor of carrying out the invention.
The following is a quotation of the first paragraph of pre-AIA 35 U.S.C. 112:
The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor of carrying out his invention.
Claims 4 and 13, along with their dependent claims, are rejected under 35 U.S.C. 112(a) or 35 U.S.C. 112 (pre-AIA), first paragraph, as failing to comply with the written description requirement. The claims contain subject matter which was not described in the specification in such a way as to reasonably convey to one skilled in the relevant art that the inventor or a joint inventor (or for pre-AIA, the inventor(s)), at the time the application was filed, had possession of the claimed invention. As described above, the disclosure does not provide adequate structure to perform the claimed function in the recited limitation.
Claim 4 recites the limitation “a measuring device configured to measure…” [line 1].
Claim 13 recites the limitation “a measuring device configured to measure…” [line 2].
The specification does not demonstrate that applicant has made an invention that achieves the claimed function because the invention is not described with sufficient detail such that one of ordinary skill in the art can reasonably conclude that the inventor had possession of the claimed invention.
Double Patenting
The non-statutory double patenting rejection is based on a judicially created doctrine grounded in public policy (a policy reflected in the statute) so as to prevent the unjustified or improper timewise extension of the “right to exclude” granted by a patent and to prevent possible harassment by multiple assignees. A non-statutory double patenting rejection is appropriate where the conflicting claims are not identical, but at least one examined application claim is not patentably distinct from the reference claim(s) because the examined application claim is either anticipated by, or would have been obvious over, the reference claim(s). See, e.g., In re Berg, 140 F.3d 1428, 46 USPQ2d 1226 (Fed. Cir. 1998); In re Goodman, 11 F.3d 1046, 29 USPQ2d 2010 (Fed. Cir. 1993); In re Longi, 759 F.2d 887, 225 USPQ 645 (Fed. Cir. 1985); In re Van Ornum, 686 F.2d 937, 214 USPQ 761 (CCPA 1982); In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970); In re Thorington, 418 F.2d 528, 163 USPQ 644 (CCPA 1969).
A timely filed terminal disclaimer in compliance with 37 CFR 1.321(c) or 1.321(d) may be used to overcome an actual or provisional rejection based on non-statutory double patenting provided the reference application or patent either is shown to be commonly owned with the examined application, or claims an invention made as a result of activities undertaken within the scope of a joint research agreement. See MPEP § 717.02 for applications subject to examination under the first inventor to file provisions of the AIA as explained in MPEP § 2159. See MPEP § 2146 et seq. for applications not subject to examination under the first inventor to file provisions of the AIA. A terminal disclaimer must be signed in compliance with 37 CFR 1.321(b).
The USPTO Internet website contains terminal disclaimer forms which may be used. Please visit www.uspto.gov/patent/patents-forms. The filing date of the application in which the form is filed determines what form (e.g., PTO/SB/25, PTO/SB/26, PTO/AIA/25, or PTO/AIA/26) should be used. A web-based eTerminal Disclaimer may be filled out completely online using web-screens. An eTerminal Disclaimer that meets all requirements is auto-processed and approved immediately upon submission. For more information about eTerminal Disclaimers, refer to www.uspto.gov/patents/process/file/efs/guidance/eTD-info-I.jsp.
Claims 1 and 4-19 are rejected on the ground of non-statutory double patenting as being unpatentable over claims 1, 8, 9, 11, 13, 14, 16, and 17 of US Patent No.: US 11,951,404 B2 in view of HLATKY, JR. et al. (US 9195370 B1).
Claim 20 is rejected on the ground of non-statutory double patenting as being unpatentable over claims 14, 15, and 16 of US Patent No.: US 11,951,404 B2.
Although claims 1-20 of this Application No. 18/596,044 and the claims at issue are not identical, they are not patentably distinct from each other because the instant application and the conflicting patent are claiming common subject matter, as follows:
This Application No. 18/596,044
US Patent No.: US 11,951,404 B2
Claim 1: A system for confirming image data comprising:
a light configured to emit a pattern,
a camera configured to capture image data containing said pattern,
a computing device operably connected to said camera and having a user interface configured to allow manipulation of said camera,
wherein said computing device contains user data pertaining to a user of said computing device, and
a processor operably connected to said computing device and said camera,
wherein said processor extracts said pattern form said image data,
wherein said processor creates tagged image data using said image data, user data, and said pattern.
Claim 13: A system for confirming image data comprising:
a measuring device configured to measure physical attributes of at least one of marine life or wildlife,
a light secured to said measuring device and configured to emit a pattern,
a computing device having a camera and a user interface configured to manipulate said camera,
wherein said camera is configured to capture image data containing said pattern, at least one of marine life or wildlife, and measuring device, and
a processor operably connected to said computing device and said camera,
wherein said processor extracts said pattern from said image data containing said light,
wherein said processor extracts measurement data from said image data pertaining to said physical attributes of said at least one of marine life or wildlife,
wherein said processor creates tagged image data using said image data, pattern, and said measurement data.
Claim 20: A system for confirming image data comprising:
a non-transitory computer-readable medium coupled to a processor and having instructions stored thereon, which, when executed by said processor, cause said processor to perform operations comprising:
receiving image data from a camera operably connected to a computing device,
wherein said image data contains a light and at least one of marine life or wildlife,
receiving user data from said computing device,
wherein said user data instructs said processor which user of a plurality of users captured said image data is associated,
extracting a pattern emitted by said light from said image data, and
creating tagged image data using said image data, pattern, and said user data.
Claim 1: A system for confirming image data comprising:
a user identifier containing user data,
a light configured to emit a pattern,
a computing device having a camera,
wherein said computing device obtains said user data (wherein user data is said pattern) via said user identifier,
wherein said camera captures image data containing said pattern,
Claim 1: a processor operably connected to said at least one sensor and said computing device,
wherein said processor converts said user data, pattern, and environmental data into a digital fingerprint, and
wherein said processor creates tagged image data using said digital fingerprint (wherein digital fingerprint is said pattern).
Claim 14: A system for confirming image data comprising:
a user identification (UID) pattern containing user data,
a light configured to emit a light pattern,
a computing device having a camera,
wherein said camera captures image data containing said user identification (UID) pattern and said light pattern (wherein UID is user data),
a processor operably connected to said computing device and said at least one sensor,
a non-transitory computer-readable medium coupled to said processor and having instructions stored thereon, which, when executed by said processor, cause said processor to perform operations comprising:
receiving said image data from said computing device,
extracting said user data from said user identification (UID) pattern,
determining said light pattern emitted by said light in said image data,
converting said light pattern into a digital code,
checking said digital code against a digital lock, and
authenticating said image data when said digital code matches said digital lock.
Claim 14: A system for confirming image data comprising:
Claim 16: wherein markings of said measuring device communicate measurement data to said computing device via said image data,
wherein said measurement data pertains to physical attributes of at least one of marine life and wildlife.
Claim 14: a light configured to emit a pattern
Claim 16: wherein said light and said user identification (UID) pattern are secured to said measuring device,
Claim 14: a computing device having a camera
Claim 14: wherein said camera captures image data containing said user identification (UID) pattern and said light pattern
Claim 16: wherein markings of said measuring device communicate measurement data to said computing device via said image data,
wherein said measurement data pertains to physical attributes of at least one of marine life and wildlife.
Claim 14: a computing device having a camera,
a processor operably connected to said computing device and said at least one sensor
a non-transitory computer-readable medium coupled to said processor and having instructions stored thereon, which, when executed by said processor, cause said processor to perform operations comprising:
receiving said image data from said computing device,
extracting said user data from said user identification (UID) pattern,
determining said light pattern emitted by said light in said image data,
converting said light pattern into a digital code,
checking said digital code against a digital lock, and
authenticating said image data when said digital code matches said digital lock.
Claim 16: wherein markings of said measuring device communicate measurement data to said computing device via said image data,
wherein said measurement data pertains to physical attributes of at least one of marine life and wildlife
Claim 16: wherein markings of said measuring device communicate measurement data to said computing device via said image data,
wherein said measurement data pertains to physical attributes of at least one of marine life and wildlife.
Claim 17: when executed by said processor, cause said processor to perform additional operations comprising:
receiving said environmental data from said at least one sensor, and
creating a digital fingerprint using said user data, digital code, environmental data, and measurement data, and
creating said tagged image data using said digital fingerprint and said image data.
Claim 14: A system for confirming image data comprising:
a non-transitory computer-readable medium coupled to said processor and having instructions stored thereon, which, when executed by said processor, cause said processor to perform operations comprising:
receiving said image data from said computing device,
Claim 14: determining said light pattern emitted by said light in said image data,
converting said light pattern into a digital code
Claim 16: wherein markings of said measuring device communicate measurement data to said computing device via said image data,
wherein said measurement data pertains to physical attributes of at least one of marine life and wildlife.
Claim 14: receiving said image data from said computing device,
extracting said user data from said user identification (UID) pattern,
Claim 14: wherein said user data instructs said processor which user of a plurality of users captured said image data using said camera,
wherein at least one of said image data and tagged image data is saved within a user profile of said user,
determining said light pattern emitted by said light in said image data,
converting said light pattern into a digital code,
(wherein determining and converting is extracting)
checking said digital code against a digital lock, and
authenticating said image data when said digital code matches said digital lock.
Claim 14: determining said light pattern emitted by said light in said image data, converting said light pattern into a digital code
Claim 15: receiving said environmental data from said at least one sensor,
creating a digital fingerprint using said user data, digital code, and environmental data, and
creating said tagged image data using said digital fingerprint and said image data.
Although U.S. Patent No. US 11,951,404 B2, claim 1, teaches a system for confirming image data comprising: a light configured to emit a pattern, a camera configured to capture image data containing said pattern, a computing device operably connected to said camera and wherein said computing device contains user data pertaining to a user of said computing device, and a processor operably connected to said computing device and said camera, wherein said processor creates tagged image data using said image data, user data, and said pattern, U.S. Patent No. US 11,951,404 B2, claim 1, as stated in the table above with respect to claim 1, fails to clearly disclose having a user interface configured to allow manipulation of said camera, wherein said processor extracts said pattern form said image data.
However, HLATKY, JR. et al. (US 9195370 B1) explicitly teaches having a user interface configured to allow manipulation of said camera (Fig. 1; col. 2, lines 26-29: HLATKY, JR. discloses that client devices 120, 130, and 140 can be devices suitable for use by individuals, e.g., mobile devices such as cell phones, PDAs, tablets, and netbooks, as well as laptops or personal computers. Further, col. 16, lines 26-30: HLATKY, JR. discloses that any functionality described herein as associated with the client device can be associated with a fishing and/or hunting application that can control a camera of the client device to obtain images for submitted reports (wherein the application is a user interface).), wherein said processor extracts said pattern form said image data (col. 25, lines 57-67: HLATKY, JR. discloses that the tags may have visual identifying indicia (e.g., a serial number) or can use other techniques such as active/passive RF and/or barcodes (including QR codes) or other tag identifying indicia. In such implementations, client devices can be configured to read the identifying indicia (e.g., by RF scan, optical character recognition of serial numbers, or a bar code reader) and auto-populate the catch reports based on the tag (wherein the pattern is the identifying indicia).).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of U.S. Patent No. US 11,951,404 B2, claim 1 (a system for confirming image data comprising: a light configured to emit a pattern, a camera configured to capture image data containing said pattern, a computing device operably connected to said camera and wherein said computing device contains user data pertaining to a user of said computing device, and a processor operably connected to said computing device and said camera, wherein said processor creates tagged image data using said image data, user data, and said pattern) with the teachings of HLATKY, JR. et al. (US 9195370 B1) of having a user interface configured to allow manipulation of said camera, wherein said processor extracts said pattern form said image data.
The resulting combination would provide U.S. Patent No. US 11,951,404 B2, claim 1, with a user interface configured to allow manipulation of said camera, wherein said processor extracts said pattern form said image data.
The motivation for the modification would have been to obtain a more desirable system that makes reporting a catch as easy as possible for a user, thereby encouraging users to provide fishing data.
Although U.S. Patent No. US 11,951,404 B2 claim 14 teaches a system for confirming image data comprising: a light configured to emit a pattern, a camera configured to capture image data containing said pattern, a computing device operably connected to said camera, wherein said computing device contains user data pertaining to a user of said computing device, and a processor operably connected to said computing device and said camera, wherein said processor extracts said pattern from said image data, wherein said processor creates tagged image data using said image data, user data, and said pattern, U.S. Patent No. US 11,951,404 B2 claim 14, as stated in the table above with respect to claim 1, fails to clearly disclose having a user interface configured to allow manipulation of said camera.
However, HLATKY, JR. et al. (US 9195370 B1) explicitly teaches having a user interface configured to allow manipulation of said camera (Fig. 1. Col. 2. Lines [26-29]- HLATKY, JR. discloses client devices 120, 130, and 140 can be devices suitable for use by individuals, e.g., mobile devices such as cell phones, PDAs, tablets, netbooks, as well as laptops or personal computers. Further in Col. 16. Lines [26-30]- HLATKY, JR. discloses that any functionality described herein as associated with the client device can be associated with a fishing and/or hunting application that can control a camera of the client device to obtain images for submitted reports (wherein the application is a user interface).).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of claim 14 of U.S. Patent No. US 11,951,404 B2, of a system for confirming image data comprising: a light configured to emit a pattern, a camera configured to capture image data containing said pattern, a computing device operably connected to said camera, wherein said computing device contains user data pertaining to a user of said computing device, and a processor operably connected to said computing device and said camera, wherein said processor extracts said pattern from said image data, wherein said processor creates tagged image data using said image data, user data, and said pattern, with the teachings of HLATKY, JR. et al. (US 9195370 B1) of having a user interface configured to allow manipulation of said camera.
The combination results in U.S. Patent No. US 11,951,404 B2 claim 14 having a user interface configured to allow manipulation of said camera.
The motivation behind the modification would have been to obtain a more desirable system that makes reporting a catch as easy as possible for a user, to encourage users to provide fishing data.
Although U.S. Patent No. US 11,951,404 B2 claim 14 and its dependent claims 16 and 17 teach a system for confirming image data comprising: a measuring device configured to measure physical attributes of at least one of marine life or wildlife, a light secured to said measuring device and configured to emit a pattern, a computing device having a camera, wherein said camera is configured to capture image data containing said pattern, said at least one of marine life or wildlife, and said measuring device, and a processor operably connected to said computing device and said camera, wherein said processor extracts said pattern from said image data containing said light, wherein said processor extracts measurement data from said image data pertaining to said physical attributes of said at least one of marine life or wildlife, wherein said processor creates tagged image data using said image data, pattern, and said measurement data, U.S. Patent No. US 11,951,404 B2 claims 14 and 16-17, as stated in the table above with respect to claim 13, fail to clearly disclose a user interface configured to manipulate said camera.
However, HLATKY, JR. et al. (US 9195370 B1) explicitly teaches having a user interface configured to allow manipulation of said camera (Fig. 1. Col. 2. Lines [26-29]- HLATKY, JR. discloses client devices 120, 130, and 140 can be devices suitable for use by individuals, e.g., mobile devices such as cell phones, PDAs, tablets, netbooks, as well as laptops or personal computers. Further in Col. 16. Lines [26-30]- HLATKY, JR. discloses that any functionality described herein as associated with the client device can be associated with a fishing and/or hunting application that can control a camera of the client device to obtain images for submitted reports (wherein the application is a user interface).).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of claim 14 of U.S. Patent No. US 11,951,404 B2, of a system for confirming image data comprising: a measuring device configured to measure physical attributes of at least one of marine life or wildlife, a light secured to said measuring device and configured to emit a pattern, a computing device having a camera, wherein said camera is configured to capture image data containing said pattern, said at least one of marine life or wildlife, and said measuring device, and a processor operably connected to said computing device and said camera, wherein said processor extracts said pattern from said image data containing said light, wherein said processor extracts measurement data from said image data pertaining to said physical attributes of said at least one of marine life or wildlife, wherein said processor creates tagged image data using said image data, pattern, and said measurement data, with the teachings of HLATKY, JR. et al. (US 9195370 B1) of having a user interface configured to allow manipulation of said camera.
The combination results in U.S. Patent No. US 11,951,404 B2 claim 14 having a user interface configured to allow manipulation of said camera.
The motivation behind the modification would have been to obtain a more desirable system that makes reporting a catch as easy as possible for a user, to encourage users to provide fishing data.
The further limitations of the dependent claims are similar, as indicated in the table below:
This Application No. 18/956,044
US Patent No.: US 11,951,404 B2
Claim 4: The system of claim 1, further comprising a measuring device configured to measure attributes of at least one of marine life or wildlife.
Claim 5: The system of claim 4, wherein said processor extracts measurement data from said image data containing said measuring device and said at least one of marine life or wildlife.
Claim 6: The system of claim 5, wherein said processor creates said tagged image data using said user data, pattern, and measurement data.
Claim 7: The system of claim 1, further comprising a user identification (UID) pattern, wherein said UID pattern is associated with said user.
Claim 8: The system of claim 7, wherein said processor extracts said UID pattern from said image data containing said UID pattern.
Claim 9: The system of claim 8, wherein said processor creates said tagged image data using said user data, pattern, and UID pattern.
Claim 10: The system of claim 1, further comprising at least one sensor operably connected to said processor and configured to obtain environmental data.
Claim 11: The system of claim 10, wherein said processor creates said tagged image data using said user data, pattern, and environmental data.
Claim 12: The system of claim 1, further comprising one of hunting equipment, fishing equipment, or card on which said light is secured.
Claim 14: The system of claim 13, further comprising a user identification (UID) pattern, wherein said UID pattern is associated with a user.
Claim 15: The system of claim 14, wherein said processor extracts said UID pattern from said image data containing said UID pattern.
Claim 16: The system of claim 15, wherein said processor creates said tagged image data using said measurement data, pattern, and UID pattern.
Claim 17: The system of claim 13, further comprising at least one sensor operably connected to said processor and configured to obtain environmental data.
Claim 18: The system of claim 17, wherein said processor creates said tagged image data using said measurement data, pattern, and environmental data.
Claim 19: The system of claim 13, further comprising one of hunting equipment, fishing equipment, or card on which a UID pattern is secured, wherein said UID pattern is associated with a user.
Claim 16: wherein markings of said measuring device communicate measurement data to said computing device via said image data,
wherein said measurement data pertains to physical attributes of at least one of marine life and wildlife.
Claim 16: wherein markings of said measuring device communicate measurement data to said computing device via said image data,
wherein said measurement data pertains to physical attributes of at least one of marine life and wildlife.
Claim 1: wherein said processor converts said user data, pattern, and environmental data into a digital fingerprint, and
wherein said processor creates tagged image data using said digital fingerprint.
Claim 8: wherein said processor creates said digital fingerprint using said user data, digital code, environmental data, and measurement data.
Claim 14: a user identification (UID) pattern containing user data
Claim 14: a non-transitory computer-readable medium coupled to said processor and having instructions stored thereon, which, when executed by said processor, cause said processor to perform operations comprising:
receiving said image data from said computing device,
extracting said user data from said user identification (UID) pattern,
wherein said user data instructs said processor which user of a plurality of users captured said image data using said camera,
Claim 1: wherein said processor converts said user data, pattern, and environmental data into a digital fingerprint, and
wherein said processor creates tagged image data using said digital fingerprint.
Claim 14: a non-transitory computer-readable medium coupled to said processor and having instructions stored thereon, which, when executed by said processor, cause said processor to perform operations comprising:
receiving said image data from said computing device, extracting said user data from said user identification (UID) pattern,
wherein said user data instructs said processor which user of a plurality of users captured said image data using said camera,
Claim 1: at least one sensor configured to obtain environmental data, and
a processor operably connected to said at least one sensor and said computing device,
Claim 1: wherein said processor converts said user data, pattern, and environmental data into a digital fingerprint, and
wherein said processor creates tagged image data using said digital fingerprint.
Claim 9: wherein said light and said user identifier are secured to said hunting equipment
Claim 11: wherein said light and said user identifier are secured to said fishing equipment
Claim 13: wherein said light and said user identifier are secured to said card
Claim 14: a user identification (UID) pattern containing user data
Claim 14: a non-transitory computer-readable medium coupled to said processor and having instructions stored thereon, which, when executed by said processor, cause said processor to perform operations comprising:
receiving said image data from said computing device,
extracting said user data from said user identification (UID) pattern,
Claim 14: a non-transitory computer-readable medium coupled to said processor and having instructions stored thereon, which, when executed by said processor, cause said processor to perform operations comprising:
receiving said image data from said computing device,
extracting said user data from said user identification (UID) pattern
Claim 15: creating a digital fingerprint using said user data, digital code, and environmental data, and
creating said tagged image data using said digital fingerprint and said image data.
Claim 17: further comprising additional instructions, which, when executed by said processor, cause said processor to perform additional operations comprising:
receiving said environmental data from said at least one sensor,
Claim 14: determining said light pattern emitted by said light in said image data,
converting said light pattern into a digital code
Claim 17: receiving said environmental data from said at least one sensor, and
creating a digital fingerprint using said user data, digital code, environmental
data, and measurement data, and
creating said tagged image data using said digital fingerprint and said image data.
Claim 14: a user identification (UID) pattern containing user data
Claim 16: wherein said light and said user identification (UID) pattern are secured to said measuring device
Claims 4-12 and 14-19 of this application contain the same limitations as claims 1, 8-9, 11, and 13-17, respectively, of US Patent No. US 11,951,404 B2. Given that claims 4-12 and 14-19 depend from independent claim 1 and independent claim 13, respectively, and that claims 1, 8-9, 11, and 13-17 of the patent correspond to independent claims 1 and 14, claims 4-12 and 14-19 are rejected for the same reasons set forth in the rejections of the independent claims above.
Claim 2 is rejected on the ground of non-statutory double patenting as being unpatentable over claim 1 of US Patent No.: US 11,951,404 B2 in view of HLATKY, JR. et al. (US 9195370 B1) and in further view of BABA et al. (DOI: 10.1109/IMTC.1999.776046).
Regarding claim 2, US Patent No.: US 11,951,404 B2 claim 1 in view of HLATKY, JR. et al. (US 9195370 B1) teaches the system of claim 1. US Patent No.: US 11,951,404 B2 claim 1 in view of HLATKY, JR. et al. (US 9195370 B1) fails to explicitly teach wherein said pattern is specific to a particular event.
However, BABA explicitly teaches wherein said pattern is specific to a particular event. (Fig. 1. Col. 6. Lines [6-9]-BABA discloses this analysis demonstrates that three lights reflected from objects located in different positions of the same measuring region can focus into the image sensor of the region in question. (Wherein the lights reflected form said pattern and the particular event is the action of lights reflected from objects.).).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of US Patent No.: US 11,951,404 B2 claim 1 in view of HLATKY, JR. et al. (US 9195370 B1) of a system for confirming image data, with the teachings of BABA of wherein said pattern is specific to a particular event.
The combination results in US Patent No.: US 11,951,404 B2 claim 1’s system for confirming image data wherein said pattern is specific to a particular event.
The motivation behind the modification would have been to obtain a system for confirming image data that provides an additional level of authentication to access the application/system.
Claim 3 is rejected on the ground of non-statutory double patenting as being unpatentable over claim 2 of US Patent No.: US 11,951,404 B2 in view of HLATKY, JR. et al. (US 9195370 B1) and in further view of BABA et al. (DOI: 10.1109/IMTC.1999.776046).
Regarding claim 3, US Patent No.: US 11,951,404 B2 claim 1 in view of HLATKY, JR. et al. (US 9195370 B1) and in further view of BABA et al. (DOI: 10.1109/IMTC.1999.776046) explicitly teaches the system of claim 2. US Patent No.: US 11,951,404 B2 claim 1 fails to explicitly teach wherein said tagged image data is associated with said particular event.
However, HLATKY, JR. et al. (US 9195370 B1) explicitly teaches wherein said tagged image data is associated with said particular event (Figs. 1 and 3. Col. 16. Lines [48-50]- HLATKY, JR. discloses in some implementations, a client-side application is provided that automatically time and location stamps pictures with the time & location where the pictures are taken. This data can be provided to the analysis device as separate data, e.g., apart from the picture, can be included in a filename of the picture, can be embedded as a watermark in the picture, as picture metadata (e.g., exchangeable image file format). Watermarking pictures in this fashion can be one way to help ensure users report their catches honestly. Other implementations may use encrypted pictures or fishing data to ensure security, digital signatures, etc. (wherein the tagged image data is the watermarked image and the particular event is the pictures taken).).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of US Patent No.: US 11,951,404 B2 claim 1 in view of HLATKY, JR. et al. (US 9195370 B1) and in further view of BABA et al. (DOI: 10.1109/IMTC.1999.776046) of a system for confirming image data, with the teachings of HLATKY, JR. et al. (US 9195370 B1) of wherein said tagged image data is associated with said particular event.
The combination results in US Patent No.: US 11,951,404 B2 claim 1’s system for confirming image data wherein said tagged image data is associated with said particular event.
The motivation behind the modification would have been to obtain a system for confirming image data that provides an additional level of authentication to access the application/system and makes reporting a catch as easy as possible for a user, to encourage users to provide fishing data.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-3 and 10-12 are rejected under 35 U.S.C. 103 as being unpatentable over HLATKY, JR. et al. (US 9,195,370 B1), hereinafter referenced as HLATKY, JR., in view of BABA et al. (DOI: 10.1109/IMTC.1999.776046), hereinafter referenced as BABA.
Regarding claim 1, HLATKY, JR explicitly teaches a system for confirming image data comprising (Fig. 1. Col. 4. Lines [60-67]-HLATKY, JR. discloses client devices 120, 130, and/or 140 can include an outdoors application (e.g., a hunting or fishing “app”) that performs various functionality discussed herein such as obtaining pictures of fish/animals and sending catch reports and/or animal reports to analysis device 150. Devices 160 and/or 170 can include functionality to register to receive fishing/hunting reports and/or results of analysis from the client devices and/or analysis device, perhaps by providing criteria to the client and/or analysis devices indicating what reports/analyses they would like to receive.):
a light configured to emit a pattern (Fig. 1. Col. 20. Line [4-11]-HLATKY, JR. discloses the fishing rod can also include several indicating LEDs, an LED, LCD, or other display screen, or other type of interface to provide feedback indicating whether the user's retrieval speed matches the recommendation. Thus, for example, the user may see a red light on the fishing rod when the retrieval speed is too fast, a blue light when the retrieval speed is too slow, and a green light when the retrieval speed matches the recommendation (e.g., within some threshold) (Wherein the LEDs emit a light pattern.).),
a camera configured to capture image data containing said pattern (Fig. 1. Col. 16. Lines [26-30]-HLATKY, JR. discloses that any functionality described herein as associated with the client device can be associated with a fishing and/or hunting application that can control a camera of the client device to obtain images for submitted reports. Further in Col. 25. Lines [57-67]- HLATKY, JR. discloses in some cases, the tags may have visual identifying indicia (e.g., a serial number) or can use other techniques such as active/passive RF and/or barcodes (including QR codes) or other tag identifying indicia. In such implementations, client devices can be configured to read the identifying indicia (e.g., by RF scan, optical character recognition of serial numbers, or a bar code reader) and auto-populate the catch reports based on the tag (wherein the pattern is the identifying indicia).),
a computing device (Fig. 1. #120, 130, 140, called a client device) operably connected to said camera (Fig. 1. Col. 2. Lines [26-29]- HLATKY, JR. discloses client devices 120, 130, and 140 can be devices suitable for use by individuals, e.g., mobile devices such as cell phones, PDAs, tablets, netbooks, as well as laptops or personal computers.) and having a user interface configured to allow manipulation of said camera (Fig. 1. Col. 16. Lines [26-30]- HLATKY, JR. discloses that any functionality described herein as associated with the client device can be associated with a fishing and/or hunting application that can control a camera of the client device to obtain images for submitted reports (wherein the application is a user interface).),
wherein said computing device contains user data pertaining to a user of said computing device (Fig. 1. Col. 5. Line [11-14]- HLATKY, JR. discloses first client device 120 and second client device 130 can send fishing data identifying various fish that are reported as having been caught by a user of client device 120 and a user of client device 130. (wherein fishing data is user data)), and
a processor operably connected to said computing device and said camera (Fig. 1. Col. 4. Line [57-59]- HLATKY, JR. discloses client devices 120, 130, and/or 140 as well as devices 160 and/or 170 can also include hardware such as the aforementioned memory, processor, etc.),
wherein said processor extracts said pattern from said image data (Col. 25. Lines [57-67]- HLATKY, JR. discloses the tags may have visual identifying indicia (e.g., a serial number) or can use other techniques such as active/passive RF and/or barcodes (including QR codes) or other tag identifying indicia. In such implementations, client devices can be configured to read the identifying indicia (e.g., by RF scan, optical character recognition of serial numbers, or a bar code reader) and auto-populate the catch reports based on the tag (wherein the pattern is the identifying indicia).),
wherein said processor creates tagged image data using said image data, user data, (Fig. 1. Col. 16. Lines [48-50]- HLATKY, JR. discloses in some implementations, a client-side application is provided that automatically time and location stamps pictures with the time & location where the pictures are taken. This data can be provided to the analysis device as separate data, e.g., apart from the picture, can be included in a filename of the picture, can be embedded as a watermark in the picture, as picture metadata (e.g., exchangeable image file format). Watermarking pictures in this fashion can be one way to help ensure users report their catches honestly. Other implementations may use encrypted pictures or fishing data to ensure security, digital signatures, etc. (wherein time and location data is user data and the watermarked image is tagged image data).).
HLATKY, JR. fails to explicitly teach and said pattern.
However, BABA explicitly teaches and said pattern (Fig. 1. Col. 2. Lines [14-16]-BABA discloses a highly reliable, fast, accurate range imaging system based on a novel multiplexed structured light approach that uses only a single light pattern. Further in Col. 2. Lines [30-36]-BABA discloses systems using classical multiplexed structured light approaches project a light pattern to encode measuring space as shown in Fig. 1. The light pattern might, for example, include multiple stripes of the same color, color-coded stripes, a grid, and a circle [1]. Any of these light patterns is associated with a given set of projecting points, and the depth can be calculated by triangulation.).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of HLATKY, JR. of a system for confirming image data comprising: a light configured to emit a pattern, a camera configured to capture image data containing said pattern, a computing device operably connected to said camera and having a user interface configured to allow manipulation of said camera, wherein said computing device contains user data pertaining to a user of said computing device, and a processor operably connected to said computing device and said camera, wherein said processor extracts said pattern from said image data, wherein said processor creates tagged image data using said image data and user data, with the teachings of BABA of said pattern.
The combination results in HLATKY, JR.’s system for confirming image data with said pattern.
The motivation behind the modification would have been to obtain a system for confirming image data that provides an additional level of authentication to access the application. Both HLATKY, JR. and BABA relate to systems for obtaining data: HLATKY, JR. provides a system that makes reporting a catch as easy as possible for a user, to encourage users to provide fishing data, while BABA discloses a light-based data-obtaining system that gives the user the ability to encode data with a highly reliable, fast, accurate range imaging system. Please see HLATKY, JR. et al. (US 9195370 B1), Col. 16, Lines [58-67], and BABA et al. (DOI: 10.1109/IMTC.1999.776046), Col. 2, Lines [14-16].
Regarding claim 2, HLATKY, JR. in view of BABA explicitly teaches the system of claim 1. HLATKY, JR. fails to explicitly teach wherein said pattern is specific to a particular event.
However, BABA explicitly teaches wherein said pattern is specific to a particular event (BABA discloses this analysis demonstrates that three lights reflected from objects located in different positions of the same measuring region can focus into the image sensor of the region in question (wherein the lights reflected form said pattern and the particular event is the action of lights reflected from objects.).).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of HLATKY, JR. of a system for confirming image data, with the teachings of BABA of wherein said pattern is specific to a particular event.
The combination results in HLATKY, JR.’s system for confirming image data wherein said pattern is specific to a particular event.
The motivation behind the modification would have been to obtain a system for confirming image data that provides an additional level of authentication to access the application. Both HLATKY, JR. and BABA relate to systems for obtaining data: HLATKY, JR. provides a system that makes reporting a catch as easy as possible for a user, to encourage users to provide fishing data, while BABA discloses a light-based data-obtaining system that gives the user the ability to encode data with a highly reliable, fast, accurate range imaging system. Please see HLATKY, JR. et al. (US 9195370 B1), Col. 16, Lines [58-67], and BABA et al. (DOI: 10.1109/IMTC.1999.776046), Col. 2, Lines [14-16].
Regarding claim 3, HLATKY, JR. in view of BABA explicitly teaches the system of claim 2. HLATKY, JR. further teaches wherein said tagged image data is associated with said particular event (Figs. 1 and 3. Col. 16. Lines [48-50]- HLATKY, JR. discloses in some implementations, a client-side application is provided that automatically time and location stamps pictures with the time & location where the pictures are taken. This data can be provided to the analysis device as separate data, e.g., apart from the picture, can be included in a filename of the picture, can be embedded as a watermark in the picture, as picture metadata (e.g., exchangeable image file format). Watermarking pictures in this fashion can be one way to help ensure users report their catches honestly. Other implementations may use encrypted pictures or fishing data to ensure security, digital signatures, etc. (wherein the tagged image data is the watermarked image and the particular event is the pictures taken).).
Regarding claim 10, HLATKY, JR. in view of BABA explicitly teaches the system of claim 1. HLATKY, JR. further teaches further comprising at least one sensor operably connected to said processor and configured to obtain environmental data (Fig. 1. Col. 19. Lines [39-49]- HLATKY, JR. discloses the fishing data can include data obtained from one or more sensors on a user's fishing rod. For example, optical, magnetic, or other triggering techniques can be used to measure the speed with which a user retrieves a lure (e.g., possibly taking into account mechanical characteristics of the reel, e.g., diameter, gear ratio, etc.) by mounting an optical, magnetic, or other sensor on the fishing rod. In some implementations, the retrieval speed is transmitted to the client device (e.g., by programmable processor, ASIC, FPGA, etc.) via wireless, which in turn reports the retrieval speed with the fishing data. Further in Col. 19. Lines [53-60]- HLATKY, JR. discloses note that since the analysis engine inputs also can include water temperature, the analysis engine may learn, for example, that relatively faster retrieval speeds are effective at warmer temperatures and slower retrieval speeds are more effective at colder temperatures. Moreover, the analysis engine may learn particular speeds that are best suited for catching particular species at particular temperatures, e.g., with particular lures (wherein the environmental data is retrieval speed).).
Regarding claim 11, HLATKY, JR. in view of BABA explicitly teaches the system of claim 10. HLATKY, JR. further teaches wherein said processor creates said tagged image data using said user data, pattern, and environmental data (Fig. 1. Col. 16. Lines [48-50]- HLATKY, JR. discloses in some implementations, a client-side application is provided that automatically time and location stamps pictures with the time & location where the pictures are taken. This data can be provided to the analysis device as separate data, e.g., apart from the picture, can be included in a filename of the picture, can be embedded as a watermark in the picture, as picture metadata (e.g., exchangeable image file format) (wherein location stamps are environmental data and time is user data). Col. 25. Lines [57-67]- HLATKY, JR. discloses in some cases, the tags may have visual identifying indicia (e.g., a serial number) or can use other techniques such as active/passive RF and/or barcodes (including QR codes) or other tag identifying indicia. In such implementations, client devices can be configured to read the identifying indicia (e.g., by RF scan, optical character recognition of serial numbers, or a bar code reader) and auto-populate the catch reports based on the tag (wherein the pattern is the identifying indicia).).
Regarding claim 12, HLATKY, JR. in view of BABA explicitly teaches the system of claim 1. HLATKY, JR. further teaches further comprising one of hunting equipment, fishing equipment, or card on which said light is secured (Fig. 1. Col. 20. Line [4-7]-HLATKY, JR. discloses the fishing rod can also include several indicating LEDs, an LED, LCD, or other display screen, or other type of interface to provide feedback indicating whether the user's retrieval speed matches the recommendation (wherein the fishing rod is fishing equipment).).
Claims 7-9 are rejected under 35 U.S.C. 103 as being unpatentable over HLATKY, JR. et al. (US 9,195,370 B1), hereinafter referenced as HLATKY, JR., in view of BABA et al. (DOI: 10.1109/IMTC.1999.776046), hereinafter referenced as BABA, and further in view of TOMLINSON et al. (US 20200119923 A1), hereinafter referenced as TOMLINSON.
Regarding claim 7, HLATKY, JR. in view of BABA explicitly teaches the system of claim 1. HLATKY, JR. in view of BABA fails to explicitly teach further comprising a user identification (UID) pattern, wherein said UID pattern is associated with said user.
However, TOMLINSON explicitly teaches further comprising a user identification (UID) pattern (Fig. 1, #17, called a 2D code; Paragraph [0022]), wherein said UID pattern is associated with said user (Fig. 1. Paragraph [0022]-TOMLINSON discloses the terminal 3 and the user device 9 may be configured for data communication with the server 11 via a data network 13. The identification data 7, which in this implementation takes the form of an electronic pass, is stored in a memory 15 of the user device 9 and includes data defining a two-dimensional (2D) barcode as 2D code 17. The 2D code 17 may be a machine-readable optical label that consists of graphical elements, typically pixels and/or patterns arranged in a square grid. The graphical elements encode source information, such as a user identifier 19 associated with the user to whom the electronic pass was originally issued, who may or may not be the same user presenting the associated identification data 7 to the terminal 3.).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of HLATKY, JR. in view of BABA of a system for confirming image data with the teachings of TOMLINSON of further comprising a user identification (UID) pattern, wherein said UID pattern is associated with said user.
The combination would have yielded HLATKY, JR.'s system for confirming image data further comprising a user identification (UID) pattern, wherein said UID pattern is associated with said user.
The motivation behind the modification would have been to obtain a system for confirming image data that provides an additional level of authentication to access the application. Both HLATKY, JR. and TOMLINSON relate to systems for data verification: HLATKY, JR. obtains a more desirable system by making reporting a catch as easy as possible for a user, to encourage users to provide fishing data, while TOMLINSON has the advantage of preventing fraud with the 2D code 17, which may include a user-issued PIN or information input by the user. Please see HLATKY, JR. et al. (US 9195370 B1), Col. 16, Lines [58-67], and TOMLINSON et al. (US 20200119923 A1), Paragraph [0069].
Regarding claim 8, HLATKY, JR. in view of BABA and further in view of TOMLINSON explicitly teaches the system of claim 7. HLATKY, JR. in view of BABA fails to explicitly teach wherein said processor extracts said UID pattern from said image data containing said UID pattern.
However, TOMLINSON explicitly teaches wherein said processor extracts said UID pattern from said image data containing said UID pattern (Fig. 1 and 3A-C. Paragraph [0022]-TOMLINSON discloses the lower resolution image 37 is inserted or placed into the blank rectangle of FIG. 3A, to produce the output result of a 2D code 17-2 shown in FIG. 3C. Preferably, a marker or demarcation such as a border of white pixels is also inserted to the 2D code 17-1, for ease of subsequent image processing to extract the data of the lower resolution image 37 from the captured data of the 2D code 17. Further in Paragraph [0033]-TOMLINSON discloses the data verifier 53-1 processes the received captured image data to locate and extract the pixels corresponding to the low-resolution image 37′ of the user's face, for example by determining the marker or demarcation inserted to the 2D code 17.).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of HLATKY, JR. in view of BABA of a system for confirming image data with the teachings of TOMLINSON of wherein said processor extracts said UID pattern from said image data containing said UID pattern.
The combination would have yielded HLATKY, JR.'s system for confirming image data wherein said processor extracts said UID pattern from said image data containing said UID pattern.
The motivation behind the modification would have been to obtain a system for confirming image data that provides an additional level of authentication to access the application. Both HLATKY, JR. and TOMLINSON relate to systems for data verification: HLATKY, JR. obtains a more desirable system by making reporting a catch as easy as possible for a user, to encourage users to provide fishing data, while TOMLINSON has the advantage of preventing fraud with the 2D code 17, which may include a user-issued PIN or information input by the user. Please see HLATKY, JR. et al. (US 9195370 B1), Col. 16, Lines [58-67], and TOMLINSON et al. (US 20200119923 A1), Paragraph [0069].
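As a purely illustrative aside (not part of TOMLINSON's disclosure), the extraction step mapped above, cropping an embedded low-resolution image out of the captured 2D-code pixels once a demarcation marker fixes its position, can be sketched like this; the function and the list-of-rows grid representation are assumptions for illustration only:

```python
def extract_embedded_region(code_pixels, top, left, height, width):
    """Crop the embedded low-resolution image out of a captured
    2D-code pixel grid (a list of rows), given the bounding box
    located via the demarcation marker (e.g., a white-pixel
    border surrounding the embedded region)."""
    return [row[left:left + width]
            for row in code_pixels[top:top + height]]

# Hypothetical 4x4 code with a 2x2 embedded region at (1, 1)
code = [[0, 0, 0, 0],
        [0, 7, 8, 0],
        [0, 9, 6, 0],
        [0, 0, 0, 0]]
embedded = extract_embedded_region(code, 1, 1, 2, 2)
```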
Regarding claim 9, HLATKY, JR. in view of BABA and further in view of TOMLINSON explicitly teaches the system of claim 8. HLATKY, JR. in view of BABA fails to explicitly teach wherein said processor creates said tagged image data using said user data, pattern, and UID pattern.
However, TOMLINSON explicitly discloses wherein said processor creates said tagged image data using said user data, pattern, and UID pattern (Fig. 3A-C and Fig. 4. Paragraph [0033]- TOMLINSON discloses the data verifier 53-1 processes the received captured image data to locate and extract the pixels corresponding to the low-resolution image 37′ of the user's face, for example by determining the marker or demarcation inserted to the 2D code 17. Paragraph [0034]- TOMLINSON discloses the data verifier 53-1 may also receive image data of the face of the user presenting the identification data 7 to the terminal 3, for example from another camera (not shown) of the terminal 3, and perform image processing on the received image data to detect and verify recognisable visible features of the user's face in the low-resolution, high-resolution and/or captured image data. Further in Paragraph [0034]- TOMLINSON discloses that after positive verification of the received 2D code 17 by the data verifier 53-1, an authentication token 55 may be generated and output, for example to the access controller 31 of the terminal 3 (wherein user data is image data of the face, the pattern is the low-resolution image 37, the UID pattern is the 2D code, and the tagged image data is the authentication token).).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of HLATKY, JR. in view of BABA of a system for confirming image data with the teachings of TOMLINSON of wherein said processor creates said tagged image data using said user data, pattern, and UID pattern.
The combination would have yielded HLATKY, JR.'s system for confirming image data wherein said processor creates said tagged image data using said user data, pattern, and UID pattern.
The motivation behind the modification would have been to obtain a system for confirming image data that provides an additional level of authentication to access the application. Both HLATKY, JR. and TOMLINSON relate to systems for data verification: HLATKY, JR. obtains a more desirable system by making reporting a catch as easy as possible for a user, to encourage users to provide fishing data, while TOMLINSON has the advantage of preventing fraud with the 2D code 17, which may include a user-issued PIN or information input by the user. Please see HLATKY, JR. et al. (US 9195370 B1), Col. 16, Lines [58-67], and TOMLINSON et al. (US 20200119923 A1), Paragraph [0069].
Claim 20 is rejected under 35 U.S.C. 103 as being unpatentable over HLATKY, JR. et al. (US 9,195,370 B1), hereinafter referenced as HLATKY, JR., in view of UNGER (US 20140168430 A1), hereinafter referenced as UNGER, and further in view of BABA et al. (DOI: 10.1109/IMTC.1999.776046), hereinafter referenced as BABA.
Regarding claim 20, HLATKY, JR. explicitly teaches a system for confirming image data comprising (Fig. 1. Col. 4. Lines [60-67]-HLATKY, JR. discloses client devices 120, 130, and/or 140 can include an outdoors application (e.g., a hunting or fishing “app”) that performs various functionality discussed herein such as obtaining pictures of fish/animals and sending catch reports and/or animal reports to analysis device 150. Devices 160 and/or 170 can include functionality to register to receive fishing/hunting reports and/or results of analysis from the client devices and/or analysis device, perhaps by providing criteria to the client and/or analysis devices indicating what reports/analyses they would like to receive.):
a non-transitory computer-readable medium (Fig. 2. #152 called memory) coupled to a processor (Fig. 2. #151 called processor) and having instructions stored thereon, which, when executed by said processor, cause said processor to perform operations comprising (Fig. 2. Col. 4. Lines [56-58]- HLATKY, JR. discloses client devices 120, 130, and/or 140 as well as devices 160 and/or 170 can also include hardware such as the aforementioned memory, processor, etc.):
receiving image data from a camera operably connected to a computing device (Fig. 1. Col. 2. Lines [26-29]- HLATKY, JR. discloses client devices 120, 130, and 140 can be devices suitable for use by individuals, e.g., mobile devices such as cell phones, PDAs, tablets, netbooks, as well as laptops or personal computers. Further in Col. 16. Lines [26-30]-HLATKY, JR. discloses that any functionality described herein as associated with the client device can be associated with a fishing and/or hunting application that can control a camera of the client device to obtain images for submitted reports (wherein images are user data).),
at least one of marine life or (Fig. 1. Col. 4. Lines [60-67]-HLATKY, JR. discloses client devices 120, 130, and/or 140 can include an outdoors application (e.g., a hunting or fishing “app”) that performs various functionality discussed herein such as obtaining pictures of fish/animals and sending catch reports and/or animal reports to analysis device 150.),
receiving user data from said computing device (Fig. 1. Col. 16. Lines [26-30]-HLATKY, JR. discloses that any functionality described herein as associated with the client device can be associated with a fishing and/or hunting application that can control a camera of the client device to obtain images for submitted reports (wherein images are user data).),
wherein said user data instructs said processor which user of a plurality of users captured said image data is associated (Fig. 1 and 3. Col. 5. Lines [11-14]-HLATKY, JR. discloses first client device 120 and second client device 130 can send fishing data identifying various fish that are reported as having been caught by a user of client device 120 and a user of client device 130.),
creating tagged image data using said image data, pattern, and said user data (Fig. 1. Col. 16. Lines [48-50]- HLATKY, JR. discloses in some implementations, a client-side application is provided that automatically time and location stamps pictures with the time & location where the pictures are taken. This data can be provided to the analysis device as separate data, e.g., apart from the picture, can be included in a filename of the picture, can be embedded as a watermark in the picture, or as picture metadata (e.g., exchangeable image file format). Watermarking pictures in this fashion can be one way to help ensure users report their catches honestly. Other implementations may use encrypted pictures or fishing data to ensure security, digital signatures, etc. (wherein time and location data is user data and the watermarked image is tagged image data). Further in Col. 25. Lines [61-64]- HLATKY, JR. discloses the tags may have visual identifying indicia (e.g., a serial number) or can use other techniques such as active/passive RF and/or barcodes (including QR codes) or other tag identifying indicia. In such implementations, client devices can be configured to read the identifying indicia (e.g., by RF scan, optical character recognition of serial numbers, or a bar code reader) and auto-populate the catch reports based on the tag (wherein identifying indicia is a pattern).).
HLATKY, JR. fails to explicitly teach wherein said image data contains a light and wildlife.
However, UNGER explicitly teaches wherein said image data contains a light and wildlife (Fig. 1. Paragraph [0041]-UNGER discloses one or more illuminators 112 may be included as well. In general, an illuminator 112 will be configured to project light when an image is captured to illuminate the wildlife being photographed. An illuminator 112 may be configured to project various wavelengths of light. In one or more embodiments, the light generated by an illuminator 112 may be tailored or configured to match the capabilities of the image capture device 144. For instance, an illuminator 112 may generate light that the image capture device 144 is configured to capture (e.g., the illuminator generates visible light for an image capture device configured to capture visible light images).).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of HLATKY, JR. of a system for confirming image data comprising: a non-transitory computer-readable medium coupled to a processor and having instructions stored thereon, which, when executed by said processor, cause said processor to perform operations comprising: receiving image data from a camera operably connected to a computing device, at least one of marine life or, receiving user data from said computing device, wherein said user data instructs said processor which user of a plurality of users captured said image data is associated, and creating tagged image data using said image data, pattern, and said user data, with the teachings of UNGER of wherein said image data contains a light and wildlife.
The combination would have yielded HLATKY, JR.'s system for confirming image data wherein said image data contains a light and wildlife.
The motivation behind the modification would have been to obtain a system for confirming image data that provides an additional level of authentication to access the application. Both HLATKY, JR. and UNGER relate to systems for imaging wildlife: HLATKY, JR. obtains a more desirable system by making reporting a catch as easy as possible for a user, to encourage users to provide fishing data, while UNGER shows the multipurpose trail camera is advantageous in that it is quickly and easily configured with particular features/capabilities that a user desires or needs for a particular purpose or particular environment. Please see HLATKY, JR. et al. (US 9195370 B1), Col. 16, Lines [58-67], and UNGER (US 20140168430 A1), Paragraph [0008].
HLATKY, JR. in view of UNGER fails to explicitly teach extracting a pattern emitted by said light from said image data.
However, BABA explicitly teaches extracting a pattern emitted by said light from said image data (Fig. 1. Col. 2. Lines [14-16]-BABA discloses a highly reliable, fast, accurate range imaging system based on a novel multiplexed structured light approach that uses only a single light pattern. Further in Col. 4. Lines [1-8]-BABA discloses multiple light stripes are projected simultaneously within an arrangement of lenses and image sensors equal in number to the number of lights projected. As shown in Fig. 2, we adjust the arrangement of the components of the optical system in such a manner that the reflected light from the object in the i-th encoded region reaches only the i-th image sensor via only the i-th lens. As a result, the proposed method can encode the measuring space without any ambiguity.).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of HLATKY, JR. in view of UNGER of a system for confirming image data comprising: a non-transitory computer-readable medium coupled to a processor and having instructions stored thereon, which, when executed by said processor, cause said processor to perform operations comprising: receiving image data from a camera operably connected to a computing device, wherein said image data contains a light and at least one of marine life or wildlife, receiving user data from said computing device, wherein said user data instructs said processor which user of a plurality of users captured said image data is associated, and creating tagged image data using said image data, pattern, and said user data, with the teachings of BABA of extracting a pattern emitted by said light from said image data.
The combination would have yielded HLATKY, JR.'s system for confirming image data with extracting a pattern emitted by said light from said image data.
The motivation behind the modification would have been to obtain a system for confirming image data that provides an additional level of authentication to access the application. Both HLATKY, JR. and BABA relate to systems for obtaining data: HLATKY, JR. obtains a more desirable system by making reporting a catch as easy as possible for a user, to encourage users to provide fishing data, while BABA discloses a light data obtaining system that provides the user the ability to encode data with a highly reliable, fast, accurate range imaging system. Please see HLATKY, JR. et al. (US 9195370 B1), Col. 16, Lines [58-67], and BABA et al. (DOI: 10.1109/IMTC.1999.776046), Col. 2, Lines [14-16].
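As a purely illustrative aside (not part of BABA's method, which relies on a multi-lens, multi-sensor optical arrangement), the basic step of locating the pixels of a projected light pattern within captured image data can be sketched as a simple brightness threshold; the function and threshold value are assumptions for illustration only:

```python
def extract_pattern_pixels(image, threshold=200):
    """Return (row, col) coordinates of pixels brighter than the
    threshold -- a crude stand-in for extracting a projected
    light pattern from image data containing the light. `image`
    is a list of rows of grayscale intensities (0-255)."""
    return [(r, c)
            for r, row in enumerate(image)
            for c, value in enumerate(row)
            if value > threshold]
```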
Allowable Subject Matter
Claim 4, along with its dependent claims 5-6, is objected to as being dependent upon a rejected base claim (claim 1), but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims, once the specification and claim objections, along with the 112(a) and 112(b) rejections due to the 112(f) claim interpretation and the double patenting rejections, are overcome.
Claim 13, an independent claim, contains allowable subject matter; therefore, once the 112(a) and 112(b) rejections due to 112(f) and the double patenting rejection are overcome, claim 13 and its dependent claims 14-19 would be allowable.
The following is a statement of reasons for the indication of allowable subject matter:
Regarding claim 4, the prior arts fail to explicitly teach further comprising a measuring device configured to measure attributes of at least one of marine life or wildlife.
With regards to independent claim 13, the cited prior arts fail to explicitly teach the following limitation in combination with all claim limitations:
Regarding claim 13, the prior arts fail to explicitly teach a measuring device configured to measure physical attributes of at least one of marine life or wildlife (the Office understands the term “measuring device” as “measurement device” once the claim objection is addressed; because the term “measuring device configured to measure” is interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, based on the functional language, the measuring device has a structure associated with it, namely a sensor, a light sensor, a container with mounted sides, and head pieces, as described in applicant's disclosure dated 03/05/2024 in Fig. 2 and Paragraphs [00026] and [00028]), as claimed in claim 13.
Regarding claim 13, HLATKY, JR explicitly teaches a system for confirming image data comprising (Fig. 1. Col. 4. Lines [60-67]-HLATKY, JR. discloses client devices 120, 130, and/or 140 can include an outdoors application (e.g., a hunting or fishing “app”) that performs various functionality discussed herein such as obtaining pictures of fish/animals and sending catch reports and/or animal reports to analysis device 150. Devices 160 and/or 170 can include functionality to register to receive fishing/hunting reports and/or results of analysis from the client devices and/or analysis device, perhaps by providing criteria to the client and/or analysis devices indicating what reports/analyses they would like to receive.):
a computing device (Fig. 1. #120, 130, 140 called a client device) having a camera and a user interface configured to manipulate said camera (Fig. 1. Col. 2. Lines [26-29]- HLATKY, JR. discloses client devices 120, 130, and 140 can be devices suitable for use by individuals, e.g., mobile devices such as cell phones, PDAs, tablets, netbooks, as well as laptops or personal computers. Further in Col. 16. Lines [26-30]- HLATKY, JR. discloses that any functionality described herein as associated with the client device can be associated with a fishing and/or hunting application that can control a camera of the client device to obtain images for submitted reports (wherein the application is a user interface).),
a processor operably connected to said computing device and said camera (Fig. 1. Col. 4. Lines [57-59]- HLATKY, JR. discloses client devices 120, 130, and/or 140 as well as devices 160 and/or 170 can also include hardware such as the aforementioned memory, processor, etc.),
wherein said processor creates tagged image data using said image data, pattern, and said measurement data (Fig. 1. Col. 3. Lines [38-41]- HLATKY, JR. discloses the term “submitted report” refers to a catch report or animal report submitted by a user of a client device, trail camera, etc. Submitted reports often include pictures of the fish caught and/or animals sighted. Further in Col. 17. Lines [35-41]- HLATKY, JR. discloses a reporting application on the client device can automatically begin voice recording whenever a user takes a picture, and the user can simply speak the fish species, size, and/or lure or other fishing data. Speech recognition can be performed on the client device or the analysis device to convert the spoken fishing data into a representation suitable for processing by the analysis engine. Further in Col. 17. Lines [57-64]- HLATKY, JR. discloses the tags may have visual identifying indicia (e.g., a serial number) or can use other techniques such as active/passive RF and/or barcodes (including QR codes) or other tag identifying indicia. In such implementations, client devices can be configured to read the identifying indicia (e.g., by RF scan, optical character recognition of serial numbers, or a bar code reader) and auto-populate the catch reports based on the tag (wherein the catch report is the tagged image data, the pattern is identifying indicia, and measurement data is size of fish).).
HLATKY, JR. fails to explicitly teach wherein said processor extracts said pattern from said image data containing said light.
However, BABA explicitly teaches wherein said processor extracts said pattern from said image data containing said light (Fig. 1. Col. 2. Lines [14-16]-BABA discloses a highly reliable, fast, accurate range imaging system based on a novel multiplexed structured light approach that uses only a single light pattern. Further in Col. 4. Lines [1-8]-BABA discloses multiple light stripes are projected simultaneously within an arrangement of lenses and image sensors equal in number to the number of lights projected. As shown in Fig. 2, we adjust the arrangement of the components of the optical system in such a manner that the reflected light from the object in the i-th encoded region reaches only the i-th image sensor via only the i-th lens. As a result, the proposed method can encode the measuring space without any ambiguity.).
HLATKY, JR. in view of BABA fails to explicitly teach wherein said processor extracts measurement data from said image data pertaining to said physical attributes of said at least one of marine life.
However, KITAGAWA explicitly teaches wherein said processor extracts measurement data from said image data pertaining to said physical attributes of said at least one of marine life (Fig. 2-3. Paragraph [0066]-KITAGAWA discloses in processing of calculating the length of fish, the information processing device 20 uses the captured image by the camera 40A and the captured image by the camera 40B that have been captured at the same time. Further in Paragraph [0083]-KITAGAWA discloses the calculation unit 32 includes a function of calculating, as a length of target fish, an interval L between the paired feature parts (the tip of head and the caudal fin) as represented in FIG. 11 using the position coordinates (spatial position coordinates) of the feature parts (the tip of head and the caudal fin) of target fish specified by the specification unit 31. The length L of fish calculated by the calculation unit 32 in this manner is registered in the storage 23, in a state of being associated with predetermined information such as, for example, an observation date and time.).
HLATKY, JR. in view of BABA and further in view of KITAGAWA fails to explicitly teach or wildlife.
However, UNGER explicitly teaches or wildlife (Fig. 1. Paragraph [0041]-UNGER discloses one or more illuminators 112 may be included as well. In general, an illuminator 112 will be configured to project light when an image is captured to illuminate the wildlife being photographed.).
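As a purely illustrative aside (not part of KITAGAWA's disclosure), the length calculation mapped above, the interval L between the paired feature parts (tip of head and caudal fin) once their spatial position coordinates have been triangulated from the two camera views, reduces to a Euclidean distance; the function name is an assumption for illustration only:

```python
import math

def fish_length(head_xyz, tail_xyz):
    """Interval L between the paired feature points (tip of head,
    caudal fin), given their triangulated 3D spatial position
    coordinates as (x, y, z) tuples in consistent units."""
    return math.dist(head_xyz, tail_xyz)
```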
Conclusion
Listed below are prior art references made of record, not relied upon, but considered pertinent to applicant's disclosure.
KITAGAWA (US 20190277624 A1) - In order to provide a technology with which it is possible to easily and accurately detect the length of a measured object on the basis of a captured image, this information processing device 70 is provided with a detection unit 71 and a calculation unit 72. The detection unit 71 detects feature locations from a captured image in which the measured object is photographed, the feature locations being locations on the measured object that form pairs, each feature location having a predetermined feature. The calculation unit 72 calculates the length between feature locations that form a pair based on the detection results from the detection unit 71.
BUTTERWORTH (US 20170330342 A1) - Embodiments of the present invention seek to provide an accurate way to measure the length of a fish. Some embodiments of the present invention propose utilizing a known length of a marker within a camera view or picture to determine the length of the fish also shown in the same camera view or picture. This determination could be performed in real-time or later remotely. Further, the measurement of the length of the fish is reliable because it is not dependent on a zoom value or angle of the camera and there is no required measurement device. Further, the technique is easy to use and can be performed with a camera phone or tablet.
WILLS (US 20130331146 A1) - The instant invention is a ribbon for use with sport fishing in conjunction with a GPS-enabled smart-phone. The ribbon tape is calculated in terms of estimated weight allowing an ocean weigh-in, versus a dock located weigh-in, to meet the fishing tournament requirements and/or a recreational angler's requirements, and the GPS-enabled smart-phone allows accurate documentation of the catch for competition or recreational purposes. The ribbon tape of the instant invention provides an estimate of fish weights based upon fish length, the fish weights based upon a statistical averaging of a particular fish species. The objective of the invention is to eliminate the destruction of fish caught only for the purpose of determining fish size and weight, and to eliminate the risk caused by tournament fishing boats racing back to weigh-in stations.
KROSSLI (US 20200267947 A1) - The invention provides an arrangement for measuring the biological mass of fish, the arrangement comprising a measurement unit and a processing unit operatively connected to or integrated in the measurement unit. The measurement unit comprises one of: a camera comprising an integrated autofocus function or an external distance measurement device providing data on distance between fish and camera, and a stereovision film camera; the measurement unit or the arrangement comprises sources of in substance monochromatic light, and the processing unit comprises: a pattern recognition functionality and function for creating an outline of said fish after a positive finding, the outline comprises length and at least one transverse dimension across the length of the fish being measured, or an area inside the outline and at least one of length and a transverse dimension across the length of the fish being measured, all other data are eliminated and the outline is used to calculate or find the biological mass of the fish. The invention also provides a method for measuring the biological mass of fish.
ELLIS (US 6377353 B1) - A system measures the three-dimensional linear, angular and volumetric characteristics of an animal or carcass, such as a beef animal. The system uses light spots from a structured light camera to measure multiple points on the animal. The system locates the vertical, horizontal and depth dimension for each point and uses this data to calculate the desired linear and volumetric measurements for conformation of the animal by combining measurements of some of the light spots projected on the animal. The system also provides rapid consecutive three-dimensional motion pictures of the animal.
LAU (US 20190163944 A1) - A composite information bearing device comprising an image pattern and a human readable data device. The human readable data device includes a set of human readable data symbols representing a first set of data. The first set of data includes a first data portion and a second data portion. The image pattern represents a second set of data and comprises a third data portion. One of the first or said second data portions is to form an identification code upon combination or concatenation with said third data portion, and the other one of said first or said second data portions not forming part of said identification code is to form a verification code, the verification code being related to said identification code by a scheme of operation.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to ETHAN N WOLFSON whose telephone number is (571)272-1898. The examiner can normally be reached Monday - Friday 8:00 am - 5:00 pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Chineyere Wills-Burns can be reached at (571) 272-9752. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/ETHAN N WOLFSON/Examiner, Art Unit 2673
/CHINEYERE WILLS-BURNS/Supervisory Patent Examiner, Art Unit 2673