Prosecution Insights
Last updated: April 19, 2026
Application No. 18/281,338

SYSTEM AND METHOD FOR IDENTIFYING OR ACQUIRING DATA CORRESPONDING TO A HANDLED ITEM

Non-Final OA — §102, §103

Filed: Sep 11, 2023
Examiner: SIMPSON, DIONE N
Art Unit: 3628
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Aquabot Ltd.
OA Round: 1 (Non-Final)
Grant Probability: 34% (At Risk)
Expected OA Rounds: 1-2
Time to Grant: 3y 4m
Grant Probability With Interview: 68%

Examiner Intelligence

Career Allow Rate: 34% (81 granted / 242 resolved; -18.5% vs TC avg)
Interview Lift: +35.0% across resolved cases with interview
Avg Prosecution: 3y 4m typical timeline, with 60 applications currently pending
Total Applications: 302 across all art units

Statute-Specific Performance

§101: 40.9% (+0.9% vs TC avg)
§103: 33.0% (-7.0% vs TC avg)
§102: 9.8% (-30.2% vs TC avg)
§112: 15.2% (-24.8% vs TC avg)

Deltas are measured against the Tech Center average estimate • Based on career data from 242 resolved cases
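If the dashboard computes each "vs TC avg" delta as the examiner's statute-specific allow rate minus the Tech Center baseline, the baseline can be recovered by simple subtraction. A minimal sketch (the function name and the additive-delta assumption are mine, not documented by the tool):

```python
def tc_average(allow_rate_pct: float, delta_pct: float) -> float:
    """Recover the implied Tech Center baseline allow rate from a
    statute-specific allow rate and its displayed delta, assuming
    delta = rate - baseline, both in percentage points."""
    return allow_rate_pct - delta_pct

# e.g., the §102 row: 9.8% with a -30.2% delta implies a ~40% baseline
baseline_102 = tc_average(9.8, -30.2)
```

Under that assumption, all four statute rows here imply a baseline of roughly 40%, which is consistent with a single per-TC reference line.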

Office Action

Grounds of rejection: §102, §103
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Information Disclosure Statement

The information disclosure statements (IDS) submitted on 09/11/2023 and 09/30/2025 were filed before the mailing of this action. The submissions are in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statements are being considered by the examiner.

Status of the Claims

Claims 2, 14, and 21-24 are canceled. Claims 1, 3-13, and 15-20 are pending.

Claim Objections

Claims 1 and 3-12 are objected to because of the following informalities: claim 1 recites “plurality of sensors to scan all facts of the item in order to detect identifiable characteristics on .” Appropriate correction is required. Dependent claims 3-12 are also objected to due to their dependency on claim 1.

Claim Rejections - 35 USC § 102

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless –

(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.

Claims 1, 3-7, 12, 13, and 15-19 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Diankov (2020/0130961).

Claim 1: A system for handling an item, the system comprising: a robotic arm comprising a gripper configured to grip an item located at one or a plurality of storage spaces, transfer the item to a destination and release the item at the destination; (Diankov ¶0036: the robotic system 100 can include an unloading unit 102, a transfer unit 104 (e.g., a palletizing robot and/or a piece-picker robot), a transport unit 106, a loading unit 108, or a combination thereof in a warehouse or a distribution/shipping hub; tasks can be combined in sequence to perform an operation that achieves a goal, such as to unload objects from a truck or a van and store them in a warehouse or to unload objects from storage locations and prepare them for shipping; see also ¶0037; ¶0045: the robotic system 100 can include physical or structural members (e.g., robotic manipulator arms) that are connected at joints for motion (e.g., rotational and/or translational displacements); the structural members and the joints can form a kinetic chain configured to manipulate an end-effector (e.g., the gripper) configured to execute one or more tasks (e.g., gripping, spinning, welding, etc.); see also ¶0050: the robotic system 100 can include a robotic arm 302 (e.g., an instance of the transfer unit 104 of FIG. 1) that includes an end-effector 304 (e.g., a gripper); the robotic arm 302 can be configured to transfer the target object 112 between the start location 114 of FIG. 1 and the task location; the task location 116 for the robotic arm 302 can be a placement location (e.g., a starting/egress point) on a conveyor 306 (e.g., an instance of the transport unit 106 of FIG. 1); for example, the robotic arm 302 can be configured to pick the objects from the target stack 310 and place them on the conveyor 306 for transport to another destination/task)

a plurality of sensors to scan all fac[e]ts of the item in order to detect identifiable characteristics on any of the facets of the item that facilitate identifying and verifying content of the item; and (Diankov ¶0024: the robotic system can identify the unrecognized objects based on comparing sensor outputs (e.g., images and/or depth maps) to master data that includes predetermined information regarding physical traits of one or more known/expected objects; the robotic system can compare the sensor outputs or portions thereof to images of the known/expected objects to recognize the objects in the sensor output; ¶0026: the robotic system can use the auto-registration data to identify, process, and/or manipulate same type of objects; ¶0042: the registration data 254 can include a dimension, a shape (e.g., templates for potential poses and/or computer-generated models for recognizing the object in different poses), a color scheme, an image, identification information (e.g., bar codes, quick response (QR) codes, logos, etc., and/or expected locations thereof), an expected weight, other physical/visual characteristics, or a combination thereof for the objects expected to be manipulated by the robotic system; ¶0047: the sensors 216 can include one or more imaging devices 222 (e.g., visual and/or infrared cameras, 2D and/or 3D imaging cameras, distance measuring devices such as lidars or radars, etc.) configured to detect the surrounding environment; the robotic system 100 (via, e.g., the processors 202) can process the digital image and/or the point cloud to identify the target object; ¶0051: the second imaging sensor 314 can generate imaging data corresponding to one or more top and/or side views of the target object; see also ¶0059, ¶0076, ¶0079)

a processor to receive scan data from the plurality of sensors and to identify and verify the content of the item based on the detected identifiable characteristics. (Diankov ¶0047: the sensors 216 can include one or more imaging devices 222 (e.g., visual and/or infrared cameras, 2D and/or 3D imaging cameras, distance measuring devices such as lidars or radars, etc.) configured to detect the surrounding environment; the imaging devices 222 can generate representations of the detected environment, such as digital images and/or point clouds, that may be processed via machine/computer vision (e.g., for automatic inspection, robot guidance, or other robotic applications); as described in further detail below, the robotic system 100 (via, e.g., the processors 202) can process the digital image and/or the point cloud to identify the target object; ¶0060: the robotic system 100 can use the image data of object surfaces 416 to recognize or identify the objects that are within the target stack; the robotic system 100 can compare the image data or one or more portions therein to the master data 252 (e.g., the various registration data); the robotic system 100 can identify the known objects (e.g., recognized objects 412) within the target stack 310 when a portion of the top view data 420 matches one or more images of the object surfaces 416 in the registration data; ¶0086-¶0089)

Claim 13: Claim 13 is directed to a method. Claim 13 recites limitations that are parallel in nature to those addressed above for claim 1, which is directed towards a system.
Claim 13 is therefore rejected for the same reasons as set forth above for claim 1.

Claim 3: The system of claim 1, wherein the identifiable characteristics are selected from the group consisting of: a label, printed or otherwise inscribed information, text, logo, artwork, mark, shape, and visible characteristics. (Diankov ¶0057: the first crossing sensors 316 can trigger further sensors to obtain additional information (e.g., 2D/3D images of one or more vertical surfaces/edges and/or profile shapes) about the object that may not be detectable by the first imaging sensor 312 and/or the second imaging sensor; the object height and/or the additional information can be used to generate the registration data 254 of FIG. 2 for unrecognized objects; ¶0059: disclosing that the robotic system 100 can use image data (e.g., the top view data 420) from the first imaging sensor; the top view data 420 can include one or more visual images and/or one or more depth maps that depict or represent the actual top view; the robotic system 100 can analyze the top view data 420 to identify edges that may correspond to object boundaries; the robotic system 100 can identify edges and/or continuous surfaces represented in the image data based on differences in depth measurements and/or image traits (e.g., different colors, linear patterns, shadows, differences in clarity, etc.))

Claim 15: Claim 15 is directed to a method. Claim 15 recites limitations that are parallel in nature to those addressed above for claim 3, which is directed towards a system. Claim 15 is therefore rejected for the same reasons as set forth above for claim 3.

Claim 4: The system of claim 1, wherein sensors of said plurality of sensors are selected from the group of sensors consisting of: scanning sensor, imaging sensor, camera, barcode reader, proximity sensor, rangefinder, mapping sensor, lidar, point cloud sensor, and laser based sensor. (Diankov ¶0024: the robotic system can identify and register the unrecognized objects within a set of objects; comparing sensor outputs (e.g., images and/or depth maps) to master data that includes predetermined information regarding physical traits of one or more known/expected objects; ¶0047: disclosing that the sensors 216 can include one or more imaging devices 222 (e.g., visual and/or infrared cameras, 2D and/or 3D imaging cameras, distance measuring devices such as lidars or radars, etc.); ¶0052: further disclosing that the robotic system 100 can include one or more crossing sensors that detect crossing events based on continuity/disruption in transmitted and/or reflected signals (e.g., optical signals, laser, etc.))

Claim 16: Claim 16 is directed to a method. Claim 16 recites limitations that are parallel in nature to those addressed above for claim 4, which is directed towards a system. Claim 16 is therefore rejected for the same reasons as set forth above for claim 4.

Claim 5: The system of claim 1, wherein the robotic arm is configured to manipulate the item so as to allow said plurality of sensors to scan the one or a plurality of surfaces from different directions of views. (Diankov ¶0051: the robotic system 100 can use one or more of the sensors; the robotic system 100 can include a first imaging sensor 312 and/or a second imaging sensor; the second imaging sensor 314 can generate imaging data corresponding to one or more top and/or side views of the target object; the robotic system 100 can obtain data regarding the object from different points of view (e.g., side or profile views/images, shape measurements along a different dimension, etc.); see also ¶0059)

Claim 17: Claim 17 is directed to a method. Claim 17 recites limitations that are parallel in nature to those addressed above for claim 5, which is directed towards a system. Claim 17 is therefore rejected for the same reasons as set forth above for claim 5.
Claim 6: The system of claim 5, wherein the robotic arm is configured to flip the item. (Diankov ¶0090: the robotic system 100 can derive and/or implement motion plans to transfer the recognized objects 412 to the task location; the robotic system 100 can use the derived locations and current/projected locations for the robotic arm 302 and/or the end-effector 304 to further derive movements and/or corresponding actuator commands/settings; the robotic system 100 can derive locations, movements, and/or corresponding actuator commands/settings for the robotic arm 302 and/or the end-effector 304 corresponding to lifting, horizontally displacing, lowering, and/or rotating the recognized objects)

Claim 18: Claim 18 is directed to a method. Claim 18 recites limitations that are parallel in nature to those addressed above for claim 6, which is directed towards a system. Claim 18 is therefore rejected for the same reasons as set forth above for claim 6.

Claim 7: The system of claim 1, wherein the gripper is selected from the group consisting of: mechanical griper, clamp, suction cup, fixture, vacuum gripper, pneumatic gripper, hydraulic gripper, magnetic gripper, electric gripper, and electrostatic gripper. (Diankov ¶0066: disclosing the suction cups of the gripper; ¶0099: the robotic system 100 can further determine and process a grip measure (e.g., measurements of vacuum between the suction cups and the gripped surface))

Claim 19: Claim 19 is directed to a method. Claim 19 recites limitations that are parallel in nature to those addressed above for claim 7, which is directed towards a system. Claim 19 is therefore rejected for the same reasons as set forth above for claim 7.

Claim 12: The system of claim 1, wherein the robotic arm is configured to drop the item on a conveyor track at the destination.
(Diankov ¶0037: the unloading unit 102 (e.g., a devanning robot) can be configured to transfer the target object 112 from a location in a carrier (e.g., a truck) to a location on a conveyor belt; the transfer unit 104 can be configured to transfer the target object 112 from one location (e.g., the conveyor belt); ¶0050: the robotic arm 302 can be configured to pick the objects from the target stack 310 and place them on the conveyor 306 for transport to another destination/task; see also ¶0059)

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.

Claims 8, 10, 11, and 20 are rejected under 35 U.S.C. 103 as being unpatentable over Diankov (2020/0130961) in view of Robertson (2019/0261566).

Claim 8: The system of claim 1, further comprising one or a plurality of illumination sources to illuminate the item.

Diankov discloses light/laser based sensors, but does not explicitly disclose one or a plurality of illumination sources to illuminate the item. Robertson suggests or discloses this limitation/concept: (Robertson ¶0121: disclosing that controlled lighting can be provided by one or more illuminators attached to the end of the robot arm; ¶0133: disclosing obtaining multiple images under different lighting conditions; arranging a series of LED lights in a circle around the fruit and activating them one at a time, capturing one exposure per light; Fig. 9 and cameras (and possibly other sensors, such as (i) cameras sensitive to specific (and possibly non-visible) portions of the EM spectrum including IR, (ii) cameras and illuminators that use polarized light)).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Diankov to include one or a plurality of illumination sources to illuminate the item as taught by Robertson, since the claimed invention is merely a combination of old elements, and in the combination each element merely would have performed the same function as it did separately; one of ordinary skill in the art would have recognized that the results of the combination were predictable.

Claim 20: Claim 20 is directed to a method. Claim 20 recites limitations that are parallel in nature to those addressed above for claim 8, which is directed towards a system. Claim 20 is therefore rejected for the same reasons as set forth above for claim 8.

Claim 10: The system of claim 1, further provided with an enclosure to prevent or limit penetration of ambient light into a space within the enclosure.

Diankov discloses light/laser based sensors, but does not explicitly disclose an enclosure to prevent or limit penetration of ambient light into a space within the enclosure. Robertson suggests or discloses this limitation/concept: (Robertson ¶0136 and Fig. 9: disclosing a QC imaging chamber; a “chimney” is created to reduce the amount of ambient light entering the QC rig).

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Diankov to include an enclosure to prevent or limit penetration of ambient light into a space within the enclosure as taught by Robertson, since the claimed invention is merely a combination of old elements, and in the combination each element merely would have performed the same function as it did separately; one of ordinary skill in the art would have recognized that the results of the combination were predictable.

Claim 11: The system of claim 10, wherein a wall of the enclosure is made of opaque or shading material.
Diankov discloses a conveyor, but does not explicitly disclose that a wall of the enclosure is made of opaque or shading material. Robertson suggests or discloses this limitation/concept: (Robertson ¶0136 and Fig. 9: disclosing a QC imaging chamber; a “chimney” is created to reduce the amount of ambient light entering the QC rig; the rig blocks unwanted light from the sides (thus opaque)).

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Diankov to include a wall of the enclosure made of opaque or shading material as taught by Robertson. One of ordinary skill in the art before the effective filing date of the claimed invention would have been motivated to modify Diankov in order to provide a view of every part of the surface of the item (see ¶0136 of Robertson).

Claim 9 is rejected under 35 U.S.C. 103 as being unpatentable over Diankov (2020/0130961) in view of Robertson (2019/0261566), further in view of Konolige (2017/0261314).

Claim 9: The system of claim 8, wherein said one or a plurality of illumination sources is selected from the group consisting of: red or infra-red-light source, 2700 kelvin lighting, 700nm-635nm red spectrum light, 760nm-1mm red and infra-red wavelength spectrum light, yellow spectrum lamp, 3000 kelvin light, and 590 nm to 560 nm wavelength spectrum light.

Diankov discloses light/laser based sensors, but does not explicitly disclose that one or a plurality of illumination sources is selected from the group consisting of: red or infra-red-light source, 2700 kelvin lighting, 700nm-635nm red spectrum light, 760nm-1mm red and infra-red wavelength spectrum light, yellow spectrum lamp, 3000 kelvin light, and 590 nm to 560 nm wavelength spectrum light.
Konolige suggests or discloses this limitation/concept: (Konolige ¶0007: disclosing that the system includes a first optical sensor, a second optical sensor, and a light source; each optical sensor includes a first plurality of photodetectors configured to capture visible light interspersed with a second plurality of photodetectors configured to capture infrared light within a particular infrared band; the light source is configured to project infrared light of a wavelength within the particular infrared band onto an environment; ¶0091 and Fig. 4: disclosing that the optical sensors generate a charge when exposed to an incident beam of light whose wavelength is within the visible spectrum (e.g., 380 nm to 750 nm) and an incident beam of light whose wavelength is within the infrared spectrum (e.g., 750 nm to 3000 nm)).

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Diankov in view of Robertson to include the claimed illumination sources as taught by Konolige, since the claimed invention is merely a combination of old elements, and in the combination each element merely would have performed the same function as it did separately; one of ordinary skill in the art would have recognized that the results of the combination were predictable.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to DIONE N SIMPSON whose telephone number is (571) 272-5513. The examiner can normally be reached M-F, 7:30 a.m.-4:30 p.m.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Resha Desai, can be reached at 571-270-7792. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/DIONE N. SIMPSON/
Primary Examiner, Art Unit 3628

Prosecution Timeline

Sep 11, 2023
Application Filed
Dec 01, 2025
Non-Final Rejection — §102, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12596987 — Connected Logistics Receptacle Apparatus, Systems, and Methods with Proactive Unlocking Functionality Related to a Dispatched Logistics Operation by a Mobile Logistics Asset Having an Associated Mobile Transceiver
Granted Apr 07, 2026 · 2y 5m to grant

Patent 12579484 — INTELLIGENTLY CUSTOMIZING A CANCELLATION NOTICE FOR CANCELLATION OF A TRANSPORTATION REQUEST BASED ON TRANSPORTATION FEATURES
Granted Mar 17, 2026 · 2y 5m to grant

Patent 12561692 — UPDATING ACCOUNT INFORMATION USING VIRTUAL IDENTIFICATION
Granted Feb 24, 2026 · 2y 5m to grant

Patent 12391138 — ELECTRIC VEHICLE, AND CHARGING AND DISCHARGING FACILITY, AND SYSTEM
Granted Aug 19, 2025 · 2y 5m to grant

Patent 12387163 — Logistical Management System
Granted Aug 12, 2025 · 2y 5m to grant
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 34%
With Interview: 68% (+35.0%)
Median Time to Grant: 3y 4m
PTA Risk: Low
Based on 242 resolved cases by this examiner. Grant probability derived from career allow rate.
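The projection arithmetic appears to be the career allow rate (81 / 242 ≈ 34%) plus the interview lift added in percentage points (≈ 68% with interview). A minimal Python sketch of that reading; the function names and the additive, capped-at-100% assumptions are mine, not documented by the tool:

```python
def allow_rate(granted: int, resolved: int) -> float:
    """Career allow rate as a fraction of resolved cases."""
    return granted / resolved

def with_interview(base: float, lift_pts: float) -> float:
    """Projected grant probability after an examiner interview,
    assuming the lift is additive in percentage points and
    capped at 100%."""
    return min(base + lift_pts / 100.0, 1.0)

base = allow_rate(81, 242)            # ~0.335, displayed as ~34%
boosted = with_interview(base, 35.0)  # ~0.685, displayed as ~68%
```

The small mismatch between 34% + 35.0% and the displayed 68% is consistent with the dashboard rounding the underlying 33.5% allow rate before display.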
