DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Status of Claims
This action is in reply to the communication filed on 03/11/2025.
Claims 1-18 are currently pending and have been examined.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1-18 are rejected under 35 U.S.C. 101 because the claimed invention is directed to a judicial exception without significantly more.
Step 1:
Claims 1-9 recite a method, which is directed to a process.
Claims 10-18 recite a device, which is directed to a machine.
Therefore, each claim falls within one of the four statutory categories.
Step 2A, Prong 1 (Is a judicial exception recited?):
The independent claims 1 and 10 recite the abstract idea of managing inventory, transportation, and delivery of items (see specification [0001]). This idea is described by the steps of:
receiving a selection of an active operating mode selected from a generating mode and an updating mode;
obtaining an image depicting a storage area;
detecting, in the image, a label disposed on an item in the storage area;
determining (i) an identifier of the label, and (ii) a location of the label in the storage area;
determining whether the label is included in a map of the storage area; and
when the label is not included in the map:
(i) when the generating mode is active, inserting a record into the map, the inserted record containing the identifier of the label, and the location of the label, and
(ii) when the updating mode is active, generating a notification.
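For illustration only, the recited steps can be summarized in the following sketch; the function and data-structure names here (detect_label, a dict-based map, notify) are hypothetical stand-ins and do not appear in the application:

```python
# Illustrative sketch of the recited steps; all names are hypothetical
# stand-ins, not the applicant's implementation.

GENERATING, UPDATING = "generating", "updating"

def detect_label(image):
    # Stand-in for detecting a label in the image and determining
    # (i) its identifier and (ii) its location in the storage area.
    return image["label_id"], image["label_location"]

def process_image(active_mode, image, storage_map, notify):
    """Apply the recited steps to one image of the storage area."""
    identifier, location = detect_label(image)
    # Determine whether the label is included in the map of the storage area.
    if identifier not in storage_map:
        if active_mode == GENERATING:
            # Generating mode: insert a record containing the identifier
            # and the location of the label.
            storage_map[identifier] = {"id": identifier, "location": location}
        elif active_mode == UPDATING:
            # Updating mode: generate a notification.
            notify(f"unmapped label {identifier} at {location}")
    return storage_map
```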
These claims recite a certain method of organizing human activity: the limitations above are directed to managing personal behavior or relationships or interactions between people (following rules or instructions) and to fundamental economic principles or practices for managing the inventory, transportation, and delivery of items. The Examiner additionally finds the claims similar to examples the courts have identified as certain methods of organizing human activity:
i. using a marking affixed to the outside of a mail object to communicate information about the mail object, i.e., the sender, recipient, and contents of the mail object, Secured Mail Solutions LLC v. Universal Wilde, Inc., 873 F.3d 905, 911, 124 USPQ2d 1502, 1506 (Fed. Cir. 2017).
ii. filtering content, BASCOM Global Internet v. AT&T Mobility, LLC, 827 F.3d 1341, 1345-46, 119 USPQ2d 1236, 1239 (Fed. Cir. 2016) (finding that filtering content was an abstract idea under step 2A, but reversing an invalidity judgment of ineligibility due to an inadequate step 2B analysis).
The data collection, recognition, and storage concept described in the claim is similar to the data collection and management concepts that were held to be abstract ideas in Content Extraction, TLI Communications, and Electric Power Group. Although the claim enumerates the type of information (i.e., the images, and location data) that is acquired, stored and analyzed, the Federal Circuit has explained in Electric Power Group and Digitech that the mere selection and manipulation of particular information by itself does not make an abstract concept any less abstract. Further, the claim is not made any less abstract by the invocation of a programmed computer.
Step 2A, Prong 2 (Is the exception integrated into a practical application?):
This judicial exception is not integrated into a practical application, for the following reasons:
The claimed additional limitations are:
Claim 1: an output device.
Claim 10: a computing device, a camera, a controller, and an output device.
The additional limitations are directed to using a generic computer to process information and perform the abstract idea. Therefore, the limitations merely amount to adding the words “apply it” (or an equivalent) to the judicial exception, or mere instructions to implement an abstract idea on a computer, or merely uses a computer as a tool to perform an abstract idea, as discussed in MPEP 2106.05(f).
Step 2B (Does the claim recite additional elements that amount to significantly more than the judicial exception?):
The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception.
The Step 2B considerations overlap with those of Step 2A, Prong 2, and have already been substantially addressed in the Step 2A, Prong 2 analysis above. As discussed there, the additional limitations amount to adding the words “apply it” (or an equivalent) to the judicial exception, mere instructions to implement an abstract idea on a computer, or mere use of a computer as a tool to perform an abstract idea, as discussed in MPEP 2106.05(f).
In addition, the dependent claims recite:
Step 2A, Prong 1 (Is a judicial exception recited?):
Dependent claims 2-9 and 11-18 recite limitations that further narrow the abstract idea recited in independent claims 1 and 10 and are therefore directed to the same abstract idea.
Step 2A, Prong 2 and Step 2B:
The dependent claims 2-9 and 11-18 further narrow the abstract idea recited in the independent claims 1 and 10 and are therefore directed towards the same abstract idea.
The dependent claims recite the following additional limitations:
Claim 4: an optical character recognition model,
Claim 5: a memory,
Claim 9: the output device,
Claim 11: the computing device,
Claims 12, 15, 16, 17: the computing device, the controller,
Claim 13: the computing device, the controller, an optical character recognition model,
Claim 14: the computing device, the controller, a memory,
Claim 18: the computing device, the controller, the output device,
However, the examiner finds each of these additional elements to be directed to merely “applying” generic technology to perform the recited abstract idea of managing inventory, transportation, and delivery of items. The recitation of generic computer technology used as a tool to execute the steps that define the abstract idea does not provide integration at Step 2A, Prong 2, and does not provide significantly more at Step 2B.
Therefore, the limitations of claims 1-18, whether viewed individually or as an ordered combination, are directed to ineligible subject matter.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries set forth in Graham v. John Deere Co., 383 U.S. 1, 148 USPQ 459 (1966), that are applied for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 1-2, 6-11 and 15-18 are rejected under 35 U.S.C. 103 as being unpatentable over Johnson et al. (US 20170261992 A1, hereinafter “Johnson”) in view of Gil et al. (US 20210383320 A1, hereinafter “Gil”).
Regarding claims 1 and 10. Johnson discloses a method, comprising:
receiving a selection of an active operating mode selected from a generating mode and an updating mode; (Johnson, [0030]; “one of the robots 18 navigates the warehouse and builds a map 10 a, FIG. 4, … constructing or updating a map of an unknown environment”)
obtaining an image depicting a storage area; detecting, in the image, a label disposed on an item in the storage area (Johnson, [0034]; “In step 202, robot 18 using camera 26 captures image and in step 204 searches for fiducial markers within the captured images”); determining (i) an identifier of the label, and (ii) a location of the label in the storage area; (Johnson, [0021]; “robot 18 … includes … camera 26 to capture information representative of the robot's environment”. [0032]; “While constructing the map 10 a or thereafter, one or more robots 18 navigates through warehouse 10 using camera 26 to scan the environment to locate fiducial markers … When a fiducial marker, such as fiducial marker 30, FIGS. 3 and 4, is located by robot 18 using its camera 26, the location in the warehouse relative to origin 110 is determined”)
determining whether the label is included in a map of the storage area; (Johnson, [0034]; “In step 206, if a fiducial marker is found in the image (step 204) it is determined if the fiducial marker is already stored in fiducial table 300, FIG. 6, which is located in memory 34 of robot 18”) and
when the label is not included in the map: (i) when the generating mode is active, inserting a record into the map, the inserted record containing the identifier of the label, and the location of the label, (Johnson, [0034-0035]; “If it is not in memory, the pose is determined according to the process described above and in step 208, it is added to fiducial to pose lookup table 300 … In look-up table 300, which may be stored in the memory of each robot, there are included for each fiducial marker a fiducial identification, 1, 2, 3, etc, and a pose for the fiducial marker/bar code associated with each fiducial identification. The pose consists of the x,y,z coordinates in the warehouse along with the orientation or the quaternion (x,y,z, ω)”) and
Johnson substantially discloses the claimed invention but fails to explicitly disclose “when the updating mode is active, generating a notification via an output device”. However, Gil teaches:
(ii) when the updating mode is active, generating a notification via an output device. (Gil, [0070]; “the control system 100 may identify any discrepancies between the asset and the location by locating any mismatches between identifiers. For example, the control system 100 may determine that the identifier associated with package X should be located, picked, and/or placed at shelf Y, but the camera 116 captured it located in, picked, sorted and/or placed at shelf B. A notification indicating this may be responsively transmitted back to the device 114 such that the speaker 117 issues a prompt indicating the discrepancy and/or telling the user where the correct location is for the particular package”)
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date to modify Johnson to include when the updating mode is active, generating a notification via an output device, as taught by Gil, where this would be performed in order to provide to carrier personnel improved instructions and/or guidance for the automated handling of the packages within various environments (e.g., a delivery vehicle, a trailer or cargo area of a delivery vehicle). See Gil [0008].
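The fiducial-to-pose lookup table that Johnson describes at [0034]-[0035] can be pictured with the following minimal sketch; the class and function names are illustrative assumptions, not Johnson's code:

```python
# Minimal sketch of a fiducial-to-pose lookup table of the kind Johnson
# describes: each fiducial identification maps to a pose consisting of
# x, y, z warehouse coordinates plus an orientation quaternion (x, y, z, w).
# All names are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Pose:
    x: float
    y: float
    z: float
    quaternion: tuple  # orientation as (x, y, z, w)

def record_fiducial(lookup_table, fiducial_id, pose):
    """If the fiducial is not already stored, add its pose to the table."""
    if fiducial_id not in lookup_table:    # cf. Johnson step 206
        lookup_table[fiducial_id] = pose   # cf. Johnson step 208
    return lookup_table
```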
Regarding claims 2 and 11. The combination of Johnson in view of Gil discloses the method of claim 1, wherein
Johnson substantially discloses the claimed invention but fails to explicitly disclose “the storage area includes an interior of a delivery vehicle”. However, Gil teaches:
the storage area includes an interior of a delivery vehicle. (Gil, [0008]; “the automated handling of the packages within various environments (e.g., a delivery vehicle, a trailer or cargo area of a delivery vehicle, a warehouse environment whether relative to a sort location, a pick location, a conveyor belt, and/or any combination thereof)”)
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date to modify Johnson to include the storage area includes an interior of a delivery vehicle, as taught by Gil, where this would be performed in order to provide to carrier personnel improved instructions and/or guidance for the automated handling of the packages within various environments (e.g., a delivery vehicle, a trailer or cargo area of a delivery vehicle). See Gil [0008].
Regarding claims 6 and 15. The combination of Johnson in view of Gil discloses the method of claim 1, wherein determining whether the label is included in the map of the storage area includes: determining whether the map includes a record containing the identifier of the label. (Johnson, [0034]; “if a fiducial marker is found in the image (step 204) it is determined if the fiducial marker is already stored in fiducial table 300, FIG. 6, which is located in memory 34 of robot 18”)
Regarding claims 7 and 16. The combination of Johnson in view of Gil discloses the method of claim 1, further comprising:
Johnson substantially discloses the claimed invention but fails to explicitly disclose “when the label is included in the map, overwriting a previous location corresponding to the label with the location of the label”. However, Gil teaches:
when the label is included in the map, overwriting a previous location corresponding to the label with the location of the label. (Gil, [0105]; “The control system 100 can update the asset location database (i.e., the claimed “overwriting”) based on determining that an asset has been added, moved, or removed from the storage area”)
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date to modify Johnson to include when the label is included in the map, overwriting a previous location corresponding to the label with the location of the label, as taught by Gil, where this would be performed in order to provide to carrier personnel improved instructions and/or guidance for the automated handling of the packages within various environments (e.g., a delivery vehicle, a trailer or cargo area of a delivery vehicle). See Gil [0008].
Regarding claims 8 and 17. The combination of Johnson in view of Gil discloses the method of claim 1, wherein generating the notification includes:
Johnson substantially discloses the claimed invention but fails to explicitly disclose “generating, without updating the map, an indication that a misplaced item has been detected”. However, Gil teaches:
generating, without updating the map, an indication that a misplaced item has been detected. (Gil, [0070]; “the control system 100 may determine that the identifier associated with package X should be located, picked, and/or placed at shelf Y, but the camera 116 captured it located in, picked, sorted and/or placed at shelf B. A notification indicating this may be responsively transmitted back to the device 114 such that the speaker 117 issues a prompt indicating the discrepancy and/or telling the user where the correct location is for the particular package”)
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date to modify Johnson to include generating, without updating the map, an indication that a misplaced item has been detected, as taught by Gil, where this would be performed in order to provide to carrier personnel improved instructions and/or guidance for the automated handling of the packages within various environments (e.g., a delivery vehicle, a trailer or cargo area of a delivery vehicle). See Gil [0008].
Regarding claims 9 and 18. Claims 9 and 18 recite operations that are no more than a predictable variation or duplication of the operations recited in claims 1 and 10, albeit reciting retrieval of a so-called “next identifier”. Such features would have been an obvious product of ordinary skill in the art and common sense, not innovation. That is, the claimed subject matter is no more than a predictable combination of known elements according to their established purposes. See KSR Int’l Co. v. Teleflex, Inc., 550 U.S. 398, 418-421 (2007); see also MPEP § 2144.04 VI, B.
Claims 3-5 and 12-14 are rejected under 35 U.S.C. 103 as being unpatentable over Johnson in view of Gil further in view of Fu et al. (US 20200118063 A1, hereinafter “Fu”).
Regarding claims 3 and 12. The combination of Johnson in view of Gil discloses the method of claim 1, wherein detecting the label includes
The combination of Johnson in view of Gil substantially discloses the claimed invention but fails to explicitly disclose “executing a classifier to generate a position of the label in the image”. However, Fu teaches:
executing a classifier to generate a position of the label in the image. (Fu, [0026]; “The control application 128 also includes a classifier 208, configured to classify the output of the comparator 204 (that is, the mismatches mentioned above)”)
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date to modify the combination of Johnson in view of Gil to include executing a classifier to generate a position of the label in the image, as taught by Fu, where this would be performed in order to improve the accuracy with which information concerning the objects may be collected within the environment. See Fu [0002].
Regarding claims 4 and 13. The combination of Johnson in view of Gil further in view of Fu discloses the method of claim 3, wherein determining the identifier of the label includes
Johnson substantially discloses the claimed invention but fails to explicitly disclose “executing an optical character recognition model to extract the identifier from the position”. However, Gil teaches:
executing an optical character recognition model to extract the identifier from the position. (Gil, [0144]; “the camera 116 may utilize object recognition algorithms that identify whenever a person is clasping an object in a particular manner to determine properness”)
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date to modify Johnson to include executing an optical character recognition model to extract the identifier from the position, as taught by Gil, where this would be performed in order to provide to carrier personnel improved instructions and/or guidance for the automated handling of the packages within various environments (e.g., a delivery vehicle, a trailer or cargo area of a delivery vehicle). See Gil [0008].
Regarding claims 5 and 14. The combination of Johnson in view of Gil discloses the method of claim 1, wherein determining the location of the label in the storage area includes:
detecting, in the image, a reference object in the storage area; (Johnson, [0032]; “While constructing the map 10 a or thereafter, one or more robots 18 navigates through warehouse 10 using camera 26 to scan the environment to locate fiducial markers … Robots 18 use a known starting point or origin for reference, such as origin 110”) and
determining the location of the label based on the location of the reference object and a position of the label in the image relative to the reference object. (Johnson, [0032]; “When a fiducial marker, such as fiducial marker 30, FIGS. 3 and 4, is located by robot 18 using its camera 26, the location in the warehouse relative to origin 110 is determined”)
The combination of Johnson in view of Gil substantially discloses the claimed invention but fails to explicitly disclose “retrieving, from a memory, a location of the reference object in the storage area”. However, Fu teaches:
retrieving, from a memory, a location of the reference object in the storage area; (Fu, [0037]; “The reference data 710, which may also be referred to as a realogram, is retrieved from the repository 132. In addition, the reference data 710 includes depth measurements segmented to each of the bounding boxes 714 as described above in connection with the performance of block 305”)
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date to modify the combination of Johnson in view of Gil to include retrieving, from a memory, a location of the reference object in the storage area, as taught by Fu, where this would be performed in order to improve the accuracy with which information concerning the objects may be collected within the environment. See Fu [0002].
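For illustration only, the location determination recited in claims 5 and 14 can be sketched as follows, with a stored reference-object location retrieved from a memory and combined with the label's image-derived offset; the dict-based memory and 2-D coordinate convention are assumptions, not the claimed implementation:

```python
# Illustrative sketch only: determine a label's location from (i) a
# reference-object location retrieved from a memory and (ii) the label's
# position in the image relative to that reference object. The dict-based
# memory and 2-D coordinates are assumptions for illustration.

def label_location(memory, reference_id, relative_offset):
    """Return the label's storage-area location."""
    ref_x, ref_y = memory[reference_id]  # retrieve reference location from memory
    dx, dy = relative_offset             # label position relative to the reference
    return (ref_x + dx, ref_y + dy)
```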
Conclusion
1. Any inquiry concerning this communication or earlier communications from the examiner should be directed to AVIA SALMAN, whose telephone number is (313) 446-4901. The examiner can normally be reached Monday through Friday, 9:00 AM to 5:00 PM EST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, FAHD OBEID can be reached at (571) 270-3324. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/AVIA SALMAN/Primary Patent Examiner, Art Unit 3627