Prosecution Insights
Last updated: April 19, 2026
Application No. 18/630,901

METHOD FOR CONTROLLING A HANDLING SYSTEM AND HANDLING SYSTEM

Non-Final OA: §101, §103, §112
Filed
Apr 09, 2024
Examiner
MOLNAR, SIDNEY LEIGH
Art Unit
3656
Tech Center
3600 — Transportation & Electronic Commerce
Assignee
J. Schmalz GmbH
OA Round
1 (Non-Final)
Grant Probability: 54% (Moderate)
Expected OA Rounds: 1-2
Time to Grant: 2y 4m
Grant Probability With Interview: 99%

Examiner Intelligence

Career Allow Rate: 54% (grants 54% of resolved cases; 7 granted / 13 resolved; +1.8% vs TC avg)
Interview Lift: +85.7% (strong; among resolved cases with interview)
Avg Prosecution: 2y 4m (typical timeline); 31 currently pending
Total Applications: 44 (career history, across all art units)

Statute-Specific Performance

§101: 8.7% (-31.3% vs TC avg)
§103: 42.2% (+2.2% vs TC avg)
§102: 22.3% (-17.7% vs TC avg)
§112: 26.1% (-13.9% vs TC avg)
Tech Center averages are estimates • Based on career data from 13 resolved cases
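The headline figures above (a 54% career allow rate from 7 of 13 resolved cases, and an +85.7% interview lift) are simple ratios. As a minimal sketch of how such dashboard metrics are typically computed (the function names are illustrative, and the with/without-interview allowance rates below are assumed values, not the vendor's actual data):

```python
# Hypothetical sketch of the dashboard's headline metrics.
# Function names and the interview-rate inputs are illustrative assumptions.

def allow_rate(granted: int, resolved: int) -> float:
    """Career allowance rate as a percentage of resolved cases."""
    return 100.0 * granted / resolved

def interview_lift(rate_with: float, rate_without: float) -> float:
    """Relative change in allowance rate when an interview was held."""
    return 100.0 * (rate_with - rate_without) / rate_without

print(round(allow_rate(7, 13), 1))            # 53.8 -> displayed as 54%
print(round(interview_lift(0.65, 0.35), 1))   # 85.7 with these assumed rates
```

The assumed rates (0.65 with interview, 0.35 without) are chosen only to show that a lift of +85.7% corresponds to a relative, not absolute, improvement in allowance rate.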

Office Action

§101 §103 §112
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Specification

Applicant is reminded of the proper language and format for an abstract of the disclosure. The abstract should be in narrative form and generally limited to a single paragraph on a separate sheet within the range of 50 to 150 words in length. The abstract should describe the disclosure sufficiently to assist readers in deciding whether there is a need for consulting the full patent text for details. The language should be clear and concise and should not repeat information given in the title. It should avoid using phrases which can be implied, such as, “The disclosure concerns,” “The disclosure defined by this invention,” “The disclosure describes,” etc. In addition, the form and legal phraseology often used in patent claims, such as “means” and “said,” should be avoided.

Applicant’s Abstract is inclusive of legal phraseology, including both “means” and “said”. As such, the Abstract is objected to. Appropriate correction is required.

Claim Objections

Claims 1, 6, 13, 17, and 19 are objected to because of the following informalities:

Claim 1 recites “…receiving image data that represent an image of at least one portion of the item to be gripped, which image is captured by means of the detection device…” in lines 14-16. Since an image is already introduced, it is recommended to amend the claim such that it reads “…which the image is captured…”. Such an amendment would introduce the image with proper grammatical context, signifying that the image referred to in the claim limitation regarding the image data is the same image which is captured by the detection device. Claim 19 is objected to as having a similar limitation to that of claim 1.

Claim 6 recites “…are evaluated depending on gripping point selection criteria…” in lines 3-4.
Gripping point selection criteria have already been defined in claim 4. As such, Examiner recommends instead writing “…are evaluated depending on the gripping point selection criteria…” in order to exemplify that such gripping point selection criteria are those selection criteria which are referred to in claim 4.

Claim 13 recites “…a. gripping point candidate cannot be approached by…” in line 6. The gripping point candidate has already been defined in claims 9-12, from which claim 13 depends. As such, in order to accurately introduce the gripping point candidate in claim 13, it is recommended to write “a. the gripping point candidate cannot be approached by…”. This recommendation is also in line with how the gripping point candidate is introduced in line 8 of the claim.

Claim 17 recites “…wherein the end effector is a suction gripping apparatus, or an elastomer suction gripper or a vacuum gripper” in lines 2-3. Examiner recommends combining the list into a single conjunction such that the claim reads “…wherein the end effector is a suction gripping apparatus, an elastomer suction gripper, or a vacuum gripper”. Such amendment simplifies the list, making the limitation flow more easily while maintaining clarity that any such end effector type is a viable embodiment of the invention.

Appropriate correction is required.

Claim Interpretation

The following is a quotation of 35 U.S.C. 112(f):

(f) Element in Claim for a Combination. – An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.

The following is a quotation of pre-AIA 35 U.S.C.
112, sixth paragraph:

An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.

The claims in this application are given their broadest reasonable interpretation using the plain meaning of the claim language in light of the specification as it would be understood by one of ordinary skill in the art. The broadest reasonable interpretation of a claim element (also commonly referred to as a claim limitation) is limited by the description in the specification when 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is invoked.

As explained in MPEP § 2181, subsection I, claim limitations that meet the following three-prong test will be interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph:
(A) the claim limitation uses the term “means” or “step” or a term used as a substitute for “means” that is a generic placeholder (also called a nonce term or a non-structural term having no specific structural meaning) for performing the claimed function;
(B) the term “means” or “step” or the generic placeholder is modified by functional language, typically, but not always, linked by the transition word “for” (e.g., “means for”) or another linking word or phrase, such as “configured to” or “so that”; and
(C) the term “means” or “step” or the generic placeholder is not modified by sufficient structure, material, or acts for performing the claimed function.

Use of the word “means” (or “step”) in a claim with functional language creates a rebuttable presumption that the claim limitation is to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C.
112, sixth paragraph, is rebutted when the claim limitation recites sufficient structure, material, or acts to entirely perform the recited function. Absence of the word “means” (or “step”) in a claim creates a rebuttable presumption that the claim limitation is not to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is not interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites function without reciting sufficient structure, material, or acts to entirely perform the recited function.

Claim limitations in this application that use the word “means” (or “step”) are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action. Conversely, claim limitations in this application that do not use the word “means” (or “step”) are not being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action.

This application includes one or more claim limitations that do not use the word “means,” but are nonetheless being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, because the claim limitation(s) uses a generic placeholder that is coupled with functional language without reciting sufficient structure to perform the recited function, and the generic placeholder is not preceded by a structural modifier. Such claim limitations are: “at least one detection unit which is designed to” in claims 1, 16, and 19; and “a monitoring device which is designed to” in claim 14.

Regarding the at least one detection unit, Page 3 of Applicant’s specification recites, “The at least one detection unit is preferably designed as a camera, in particular a 3D camera, for example as a CCD camera.
It is also conceivable for the at least one detection unit to be a laser scanner, ultrasonic sensor, or radar sensor. It is also conceivable for the detection device to comprise a plurality of detection units of different types.” Thus, such a detection unit will be considered as a camera(s), a laser scanner(s), an ultrasonic sensor(s), or a radar sensor(s), and any other such functional equivalent which captures an image of an item to be gripped, when reviewing the prior art.

Regarding the monitoring device, Pages 19-20 of Applicant’s specification recite, “The monitoring device can, for example, comprise one or more cameras which are designed to capture the image of the gripping of the item at the target gripping point, in particular to capture whether the item has been reliably gripped and deposited again. In an embodiment of the end effector as a suction gripping apparatus, the monitoring device can, for example, comprise a vacuum sensor which is designed to monitor a negative pressure prevailing in the suction gripping apparatus. It is also conceivable for the monitoring device to comprise a weighing device which is designed to weigh a source container, in particular before and after the gripping of an item. It is also conceivable for the items to have an RFID tag. The monitoring device can then comprise an RFID detector.” Thus, the monitoring device will be considered as any such device inclusive of cameras, a vacuum sensor, a weighing device, an RFID detector, or other functional equivalent which determines whether or not an item has been successfully gripped, when reviewing the prior art.

Because these claim limitations are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, they are being interpreted to cover the corresponding structure described in the specification as performing the claimed function, and equivalents thereof.

If applicant does not intend to have these limitations interpreted under 35 U.S.C.
112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, applicant may: (1) amend the claim limitations to avoid them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph (e.g., by reciting sufficient structure to perform the claimed function); or (2) present a sufficient showing that the claim limitations recite sufficient structure to perform the claimed function so as to avoid them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph.

This application includes one or more claim limitations that use the word “means” or “step” but are nonetheless not being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, because the claim limitation(s) recite(s) sufficient structure, materials, or acts to entirely perform the recited function. Such claim limitations are: “means of the detection device” in claims 1 and 19; and “means of the monitoring device” in claim 14. Examiner did not consider such devices as functions and thus will not interpret these uses of “means” as themselves invoking 112(f) interpretations. However, Examiner notes that such devices are related to, or directly invoke, 112(f) claim interpretations as being a generic unit or device which is designed to perform a desired detection or monitoring function (see above).

Because these claim limitations are not being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, they are not being interpreted to cover only the corresponding structure, material, or acts described in the specification as performing the claimed function, and equivalents thereof.

If applicant intends to have this/these limitation(s) interpreted under 35 U.S.C.
112, sixth paragraph, applicant may: (1) amend the claim limitation(s) to remove the structure, materials, or acts that perform the claimed function; or (2) present a sufficient showing that the claim limitation(s) does/do not recite sufficient structure, materials, or acts to perform the claimed function.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claims 2-6, 8-13, 14-15, and 18 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.

Claim 2 recites the limitation “…wherein target selection data are determined which contain information as to which of the two or more gripping determination algorithms has determined the gripping point candidate selected as the target gripping point in a control cycle, and/or wherein selection frequency data are determined which, for each gripping point determination algorithm, represent how often in a specified time interval comprising a plurality of control cycles this gripping point determination algorithm has determined a gripping point candidate later selected as the target gripping point” in lines 2-13. First, by using the term “and/or”, it is understood that either value can be exclusively determined.
However, by referring to the gripping point determination algorithm as “this gripping point determination algorithm” when regarding the selection frequency, it is unclear which gripping point determination algorithm is “this” gripping point determination algorithm. “This gripping point determination algorithm” could be that gripping point determination algorithm determined as the target selection data, or, alternatively, it could be the gripping point determination algorithm which is the subject of the selection frequency data more generally. Provided that the target selection data and selection frequency data are provided as non-exclusive alternatives (i.e., and/or), Examiner best understands the claim to mean the latter, in which “this gripping point determination algorithm” is a general algorithm which is the subject of the given selection frequency data. Examiner will thus interpret the limitation to instead read “…wherein target selection data are determined which contain information as to which of the two or more gripping determination algorithms has determined the gripping point candidate selected as the target gripping point in a control cycle, and/or wherein selection frequency data are determined which, for each gripping point determination algorithm, represent how often in a specified interval of time comprising a plurality of control cycles a specific gripping point determination algorithm has determined a gripping point candidate later selected as the target gripping point”. Claims 3-6 are rejected as being dependent on claim 2.

Claim 3 recites the limitation, “…wherein the target selection data and/or the selection frequency data are transmitted to an external computer or an external data network, or to a computer or to a data network of a provider providing the particular gripping point determination algorithm” in lines 2-6.
First, it is unclear whether an external computer should be considered as a computer, and whether an external data network should be considered as a data network of a provider. What makes this limitation confusing is the placement of a comma which seems to group “an external computer or an external data network” as a separate grouping from “a computer or a data network of a provider”. It is unclear whether only one of the four processors should be considered, each of the four processors should be considered, or any combination of the four processors should be considered when evaluating the limitation. In addition, further confusion is introduced with the “and/or” which separates the target selection data and the selection frequency data. It is unclear if there is a dependency with the following limitations regarding the different processors which receive data, in which one might require a specific type of data or both types of data. Examiner best understands each case to be any one of or any combination of the provided processors and corresponding data. As such, Examiner will read the limitation as, “…wherein at least one of the target selection data or the selection frequency data are transmitted to at least one of an external computer, an external data network, a computer, or to a data network of a provider…”.

Additionally, with regard to the above limitation of claim 3, it is unclear which gripping point algorithm is the “particular” gripping point determination algorithm. Examiner assumes the particular algorithm refers to that which determines the target gripping point candidate. As such, Examiner will interpret this limitation instead to read “…providing the gripping point determination algorithm which selected the target gripping point”. Claim 4 is rejected as having a similar limitation regarding the particular gripping point determination algorithm. Claims 5-6 are also rejected as being dependent from claims 3 and 4.
Claim 4 recites “…one of the following criteria…” in line 3 before reciting “…and/or…” in line 18. It is unclear, if the claim requires strictly one of the criteria, how such criteria may be linked by the “and” conjunction. Examiner thus will interpret the claim to read “…one or more of the following criteria…” such that the “and/or” conjunction makes sense in the context of the claim. This interpretation additionally mimics that of claim 9, which similarly regards criteria, instead for the algorithm selection criteria, and additionally is most clear regarding claim 5, which may require a weighted selection criteria. Claims 5 and 6 are rejected as being dependent on claim 4.

Claim 4 further recites “…that gripping point candidate is selected as the target gripping point which has the highest confidence value…” in lines 6-7. Such language is confusing because it is unclear which gripping point candidate is specifically selected as the target gripping point, and whether such gripping point candidate is that of the “particular” gripping point determination algorithm in lines 4-5 or a general gripping point candidate which has been selected as the target gripping point. Examiner best interprets the claim to read “…wherein the gripping point candidate which is selected as the target gripping point has the highest confidence value…”. Such interpretation generally links the gripping point candidate to that which is generally selected as the target gripping point. Claims 5 and 6 are rejected as being dependent on claim 4.

Claim 4 further recites “…a position or coordinates, and/or orientation of the gripping point candidate…” in lines 8-9. This limitation is confusing as it is unclear which of a position, coordinates, and orientation are to be considered for the gripping point candidate.
It appears as though a position or coordinates are considered in conjunction, and then whichever of the two is considered from the pair may be considered either in conjunction with or exclusively from orientation. However, provided that there is additionally an “and/or” linking the first pair conjoined by “or” to the orientation, it is unclear whether or not all three of a position, coordinates, and orientation may be considered in this evaluation. As such, Examiner best interprets the limitation to instead read “…at least one of a position, coordinates, or orientation of the gripping point candidate…” such that any combination of the three criteria may be considered. Claims 5 and 6 are rejected as being dependent on claim 4.

Claim 4 further recites “…a property or type of the item to be gripped, in its geometry, surface quality, and/or material properties…”. It is unclear which geometry, surface quality, or material properties are being referred to, as “its” does not adequately point to the item, the property, the type, or another unknown entity. It is additionally unclear what is meant by “in its geometry, surface quality, and/or material properties”, as Examiner cannot ascertain what is required to be in a geometry, surface quality, or material property. Additionally, it is unclear whether the grouping linked by the “and/or” conjunction includes any of the three qualities, inclusive of a combination of strictly two qualities, or if such conjunction is meant to determine only one of or all of the three qualities more strictly. As such, Examiner best interprets the limitation to instead read, “…a property or type of the item to be gripped, wherein the property or the type of the item is based on at least one of the item’s geometry, surface quality, or material properties…”. Claims 5 and 6 are rejected as being dependent on claim 4.

Claim 9 is additionally rejected as having a similar limitation of claim 4 as rejected above.
Claims 10-13 are rejected as being dependent on claim 9.

Claim 4 further recites “…a property or type of the employed end effector, a geometry and/or arrangement of a gripping location of the end effector” in lines 19-21. It is unclear whether a property or type of end effector is to be considered separately or in conjunction with a geometry and/or arrangement of a gripping location of the end effector. By separating a property or type generally regarding the end effector from a geometry and/or arrangement which corresponds to the specific gripping location of the end effector, it is assumed that each are separate qualities to be considered in evaluating the criteria. However, one might also be led to believe that such property or type might be exclusively considered as separate entities to each of a geometry and an arrangement. Examiner best interprets that such property or type are furthered by qualities of geometry and arrangement, and thus will read the limitation as “…a property or type of the employed end effector, wherein the property or type of end effector is determined based on a geometry and/or arrangement of a gripping location of the end effector.” Claims 5 and 6 are rejected as being dependent on claim 4.

Claim 6 recites “…wherein a gripping point candidate is selected as the target gripping point depending on a position of the gripping point candidate…” in lines 4-6. It is unclear whether the position being referred to is the same position of the gripping point candidate as was recited in claim 4, or if the position being referred to is a position of the gripping point candidate in the ranking list which is introduced in claim 6. Examiner best interprets this limitation to be the position of the gripping point candidate in the ranked list and thus will read the limitation as “…wherein a gripping point candidate is selected as the target gripping point depending on a position of the gripping point candidate in the ranking list…”.
Claim 8 further recites “…selecting one of the two or more gripping point determination algorithms as the target evaluation algorithm…” in lines 5-6. There is insufficient antecedent basis for this limitation of the claim. No such target evaluation algorithm has been defined, and as such it is unclear which target evaluation algorithm is being referred to. As such, Examiner will read the claim to generally refer to “…selecting one of the two or more gripping point determination algorithms as a target evaluation algorithm…” such that any target evaluation algorithm will suffice. Claims 9-13 are rejected as being dependent on claim 8.

Claim 9 recites the limitation “…a particular evaluation speed of the gripping point determination algorithms…” in lines 5-6. It is unclear what is meant by “particular” in this claim limitation. For instance, a particular evaluation speed could be that of one iteration of a control cycle for a respective algorithm, or a particular evaluation speed could be that of the common evaluation speed based on the type of algorithm which is run. Based on Applicant’s specification, it is best understood that such a particular evaluation speed is a calculation duration, and Examiner therefore will read the limitation as “a calculation duration of the gripping point determination algorithms”. Claims 10-13 are rejected as being dependent on claim 9.

Claim 14 recites “…the method comprising the reception of gripping success data…” in lines 4-5. There is insufficient antecedent basis in this limitation. No such gripping success data has been defined, and as such it is unclear which receipt of such data is being referred to. Provided additionally that this is a method step, it is best understood that by “the reception” applicant simply means an act of receiving the data. As such, Examiner best interprets the limitation to read “…the method comprising receiving gripping success data…”.
Claim 15 recites “…wherein the handling system comprises a plurality of end effectors which can optionally be coupled to the at least one robot, and/or wherein the handling system comprises a plurality of robots, on each of which at least one end effector for gripping an item is arranged, wherein the selection of a gripping point candidate as the target point and/or the selection of a gripping point determination algorithm as the target evaluation algorithm depends on which end effector is coupled to the at least one robot, and/or which of the optional plurality of robots is to grip the item” in lines 2-11.

First, the multiple uses of “and/or” in these limitations make it unclear which aspects are grouped together and linked in the conjunction. For example, each “and/or” may be interpreted as determining individual limitations to be considered, such that the plurality of end effectors, the plurality of robots which select a gripping point candidate, the selection of a gripping point determination algorithm depending on the end effector, and finally an optional plurality of robots gripping an item are each to be considered separately. Such an interpretation would not make much sense technically. An alternative interpretation might also be that such handling system with the plurality of robots determines the selection of a candidate or an algorithm, but such selection would additionally depend on the end effectors of the initial limitation, which is separated from the body of the claim by a comma followed by “and/or”. Examiner best understands that Applicant wished to provide a selection process which is unique to either the case of a plurality of end effectors coupled to the robot or, alternatively, the case of the plurality of robots which is coupled to at least one end effector.

Further, Applicant refers to the optional coupling of a plurality of end effectors to the at least one robot and additionally the optional plurality of robots.
In either case, as evaluated individually, Examiner cannot ascertain how either claimed feature is optional. In the case of the plurality of end effectors, it is unclear if the optional attachment of the end effector would determine there is a robot that does not have an end effector, or if there is merely the case in which only one of the plurality of end effectors is attached to the robot. Examiner best understands that a robot is coupled to at least one of a plurality of end effectors, and as such the coupling of the end effector is not optional. In the case of the plurality of robots, it is unclear if the optional plurality of robots would determine no such robots exist within a given embodiment, or if there is merely a case in which only one of the plurality of robots exists in the embodiment. Examiner best understands that at least one robot must be present in the embodiment, and as such the plurality of robots is not optional.

Additionally, there is insufficient antecedent basis for the selection of a gripping point determination algorithm as the target evaluation algorithm. Claim 1 does not define such a selection of a target evaluation algorithm. Instead, claim 8 defines the selection process for a target evaluation algorithm. As such, Examiner best understands the claim to depend from claim 8, which recites this feature.
As a result of the multiple issues stated above, Examiner will read the claim as: “The computer-implemented method of claim 8, wherein the handling system comprises at least one of: a plurality of end effectors which can be coupled to the at least one robot, wherein either the selection of a gripping point candidate as the target gripping point or the selection of a gripping point determination algorithm as the target evaluation algorithm depends on which of the plurality of end effectors is coupled to the at least one robot, or a plurality of robots on each of which at least one end effector for gripping an item is arranged, wherein either the selection of a gripping point candidate as the target gripping point or the selection of a gripping point determination algorithm as the target evaluation algorithm depends on which of the plurality of robots is to grip the item.”

Claim 18 recites “…comprising commands which, when the method is executed by a computer, cause the computer to execute the steps of the computer-implemented method according to claim 1” in lines 1-4. It is unclear whether the method as generally recited is the same method as the computer-implemented method of claim 1 or if such general method is that of the commands of the computer program product. Examiner best understands that the method which is generally referred to is that of the commands of the computer program product and as such will interpret the claim to read “…comprising commands which, when the commands are executed by a computer, cause the computer to execute the computer-implemented method according to claim 1.”

Examiner notes that the claims have been addressed below, in view of the prior art record, as best understood by the Examiner in light of the 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, rejections provided herein.
NOTE: In light of the number of above informalities and rejected limitations, Examiner encourages applicant to additionally review the claims upon amendment to determine that all such intended interpretations of the claims are captured by the language of the claims, as will be determined by the prior art rejections recited below.

Claim Rejections - 35 USC § 101

Claim 18 is rejected under 35 U.S.C. § 101 because the claimed invention is directed to non-statutory subject matter. The claims do not fall within at least one of the four categories of patent eligible subject matter because the broadest reasonable interpretation of the “computer program product comprising commands” encompasses software per se. See MPEP § 2106.03, subsection I (“Non-limiting examples of claims that are not directed to any of the statutory categories include: ... products that do not have a physical or tangible form, such as information (often referred to as “data per se”) or a computer program per se (often referred to as “software per se”) when claimed as a product without any structural recitations”). A computer program product comprising commands does not itself have any computer or processing ability, or any memory, and therefore is directed towards software per se.

Page 21 of the specification recites “A computer program is stored on the non-volatile memory device and comprises commands which, when executed by the data processing system of the control device, cause the data processing system to execute the method described above.” Inclusion of such a non-transitory computer-readable storage medium is thus recommended to overcome this § 101 rejection.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C.
103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made. The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows: 1. Determining the scope and contents of the prior art. 2. Ascertaining the differences between the prior art and the claims at issue. 3. Resolving the level of ordinary skill in the pertinent art. 4. Considering objective evidence present in the application indicating obviousness or nonobviousness. NOTE: Wherein the claims have been addressed below regarding 103 rejections, any significant interpretations by Examiner with regard to 112(b) clarity issues required to make the rejection have been injected into the claim, and the injected interpretation is highlighted in bold italics. NOTE (2): Wherein the claims have been addressed below regarding 103 rejections, any alternatives which have not been considered by the prior art of record and/or any such teachings which are not explicitly referenced by the prior art have been struck through. Claims 1-13 and 15-19 are rejected under 35 U.S.C. 103 as being unpatentable over Ku et al. (US 2023/0256601 A1; hereinafter “Ku ‘601”) in view of Hickman et al. (US Patent No. 8,965,104; hereinafter “Hickman”). 
Regarding claim 1, Ku ‘601 teaches a computer-implemented method for controlling a handling system (“The method can be performed using a system of one or more computers in one or more locations configured to control a robot having grasping capabilities (an example of which is shown in FIG. 2) including one or more: robots 220, sensors, computing systems 230, sensors 240, and/or any other suitable components” [0021]. Thus, the method is computer-implemented and is configured to control a grasping robot, i.e., handling system.), the computer-implemented method comprising: - at least one robot on which an end effector for gripping an item is arranged (“The robot 220 functions to manipulate an object. The robot can include one or more: end effectors 222, robotic arms 224, and/or any other suitable components” [0022]. Thus, there is a robot with an end effector which manipulates, i.e., grips, an object, i.e., item.); - a detection device comprising at least one detection unit which is designed to capture an image of an item to be gripped (“The sensors 240 function to sample measurements of a physical scene. The sensors 240 can include: visual sensors (e.g., monocular cameras, stereo cameras, projected light systems, TOF systems, etc.), acoustic sensors, actuation feedback systems, and/or any other suitable sensors” [0024]. Thus, there are visual sensors which sample measurements of a physical scene, i.e., detection units which are designed to capture an image of the item(s) to be gripped. Such visual sensors which are listed satisfy the 112f claim interpretation of detection unit above, in which such sensors are inclusive of cameras.); and - a control device for controlling the handling system, wherein the control device comprises a data processing system and a non-volatile memory device (“The computing system 230 functions to perform one or more steps of the method, but can additionally and/or alternatively provide any other suitable functionality. 
The computing system 230 can be local to the robot, remote, and/or otherwise located” [0025]. “Embodiments of the subject matter described in this specification can be implemented as one or more computer programs, i.e., one or more modules of computer program instructions encoded on a tangible non-transitory storage medium for execution by, or to control the operation of, data processing apparatus” [0084]. Thus, the computing system which controls the handling system according to the methods of the disclosure comprises computer programs stored on a non-transitory storage medium, i.e., non-volatile memory device, which are further executed by a data processing apparatus, i.e., data processing system.), wherein the method comprising performing one or more control cycles (“All or portions of the method can be performed once, iteratively, repeatedly (e.g., for different objects, for different physical scenes, for different time frames, for different sensors), periodically, and/or otherwise performed” [0029]. Thus, provided that the method is performed once, iteratively, periodically, or repeatedly, the method performs one or more control cycles.), each control cycle comprising: a) receiving image data that represent an image of at least one portion of the item to be gripped, which image is captured by means of the detection device (“S100 functions to determine a measurement of a physical scene having a container 410 that contains one or more objects 420 to be grasped; example shown in FIG. 4. The measurement can include one measurement, multiple measurements, and/or any other suitable number of measurements. The measurement can be captured by a sensor, retrieved from a database, and/or otherwise determined. The measurement can be an image, depth information, point clouds, video, and/or any other suitable measurement” [0031]. 
Thus, the method determines a measurement of the container which contains objects to be grasped, and such a received measurement is an image which is captured by a sensor, i.e., the detection device.), b) determining a target gripping point for the end effector on the item, comprising analyzing the image data (“Selecting a grasp S40 functions to determine a grasp proposal from the set based on a final score associated with each grasp proposal” [0067]. Thus, there is a selected grasp proposal, i.e., target gripping point, which is a result of determining a set of grasp proposals (see Fig. 3, S300) via analyzing the image data.), and c) generating control signals which cause the at least one robot to grip the item at the target gripping point by means of the end effector (“Executing the grasp trajectory S50 functions to move the robotic arm and/or end effector to grasp an object in the physical scene based on the calculated grasp trajectory associated to the selected grasp proposal” [0069]. Thus, there is a step in the method which moves the end effector to grasp the object based on the selected grasp proposal, i.e., generates control signals causing the robot to grip the item at the target gripping point by means of the end effector.), the determination of the target gripping point comprises: b1) analyzing the image data by (“Determining a set of grasp proposals across the workspace S300 functions to determine grasp proposals within the workspace. Each grasp proposal can be associated with a virtual projection of the end effector onto the physical scene (e.g., a “window”), associated with an end effector pose (e.g., location and orientation; x, y, z position and α, β, γ orientation), and/or associated with any other suitable end effector attribute. Grasp proposals can be: predetermined, dynamically determined, randomly determined, and/or otherwise determined” [0033]. 
Thus, a set of gripping point candidates is determined using any of a variety of algorithms to project grasp proposals into the workspace. The workspace is determined in S100 and S200 based on a captured image (see [0031-0032]).); and b2) selecting a gripping point candidate from the set Me of determined gripping point candidates as the target gripping point depending on one or more specified gripping point selection criteria (“Selecting a grasp S40 functions to determine a grasp proposal from the set based on a final score associated with each grasp proposal… In a first variant, S40 can include selecting a grasp proposal based on the preliminary score. In a second variant, S40 can include selecting a grasp proposal based on one or more heuristic scores. In a third variant, S40 can include selecting a grasp proposal based on a combination of the preliminary and heuristic scores. In a fourth variant, S40 can include selecting a grasp proposal based on the grasp score or a combination with preliminary score and/or heuristic score” [0067]. A target gripping point is determined in S40 depending on a preliminary score, one or more heuristic scores, or any combination of such scores.). However, Ku ‘601 does not explicitly teach …analyzing the image data by two or more mutually independent gripping point determination algorithms… Hickman, in the same field of endeavor, teaches …analyzing the image data by two or more mutually independent (“As described above, to determine an appropriate image processing algorithm and corresponding parameter set for the robot 301, the cloud processing engine 302 shown in example 300 applies "n" different image processing algorithms 305, 306, 307 to the image included in the image data 304 received from the robot 301. Example 300 also shows each algorithm of the plurality of algorithms 305, 306, 307 being applied to the image multiple times. 
In operation, each application of the image processing algorithm to the image is executed with a different set of image processing parameters (which may include some similar or same parameters), and each application of a particular algorithm configured with a corresponding parameter set yields a different image processing result” (C14, L48-60). Thus, there are two or more mutually independent algorithms which analyze the image data.)… Hickman does not explicitly apply the algorithms to gripping point determinations, only general image processing. However, given that Ku ‘601 teaches an analysis of gripping point candidates with each iteration of image analysis, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have modified the gripping point determinations of Ku ‘601 to include image analysis with a plurality of algorithms as taught by Hickman with a reasonable expectation of success. One of ordinary skill in the art would have been motivated to make this modification because higher quality image processing results would lead to higher quality grasp proposals due to higher quality attribute detection in the respective image of the object to be gripped (Hickman, C2, L63-C3, L14). Such a modification would additionally be a combination of known methods which yield predictable results (see MPEP 2143.I(A)). 
Regarding claim 2, Ku ‘601 as modified by Hickman (references made to Hickman) teaches the computer-implemented method according to claim 1, wherein target selection data are determined which contain information as to which of the two or more gripping point determination algorithms has determined the gripping point candidate selected as the target gripping point in a control cycle (“After determining a particular image processing algorithm and corresponding parameter set for execution by the robot's 301 machine vision system, the cloud processing engine 302 (alone or in combination with other cloud computing system components) may send the determined image processing algorithm and parameter set to the robot 301 via a response 317” (C12,L47-53). Thus, the selected image processing algorithm and parameter set are considered as the target selection data containing information as to which of the two or more algorithms has determined the selection in a control cycle based on a quality score.), and/or Regarding claim 3, Ku ‘601 as modified by Hickman (references made to Hickman) teaches the computer-implemented method according to claim 2, wherein at least one of the target selection data or the selection frequency data are transmitted to at least one of an external computer, an external data network, a computer, or a data network of a provider providing the gripping point determination algorithm which selected the target gripping point (“For example, in some embodiments, after the cloud processing engine 302 has selected a particular algorithm and corresponding parameter set based on the above-described multiple algorithm analysis process, the cloud processing engine 302 may send the selected algorithm and parameter set (or at least an indication of the selected algorithm and parameter set) to the machine vision knowledge base 303 along with the environmental data, task data, and/or object data (as previously described) that may have been received from the robot 301 
along with the image data 304” (C18,L11-20). Thus, the target selection data is transmitted to an external knowledge base, i.e., data network.). Regarding claim 4, Ku ‘601 as modified by Hickman (references made to Ku ‘601) teaches the computer-implemented method according to claim 3, wherein the at least one gripping point selection criterion comprises one or more of the following criteria: - at least one of a position, coordinates, or orientation of the gripping point candidate in a coordinate system of the handling system (“This process can determine a second score representative of how favorable and/or unfavorable a grasp proposal is based on the associated waypoints, grasp parameters (e.g., orientation, location, etc.) and/or other heuristics” [0059]. Thus, there is a heuristic measurement for grasp scoring which is based on the orientation and location of the gripping point candidate, i.e., grasp proposal.); - a probability of success when gripping the item at the gripping point candidate, wherein the gripping point candidate is selected as the target gripping point which has the highest probability of success (“The heuristic score can be indicative of trajectory success, efficiency, speed, and/or any other suitable metric” [0059]. Thus, there is a heuristic based on success. Additionally, “In variants, the grasp can be selected based on: a probability of grasp success, a grasp execution speed, and/or otherwise selected. The probability of grasp success can be determined based on the scene features appearing within each grasp window, waypoints for the grasp (e.g., calculated using safety margins, etc.), and/or otherwise determined” [0018]. 
Thus, there is a passage which describes selection based on probability of success.); - a property or type of the item to be gripped, wherein the property or type of the item is based on at least one of the item’s geometry, surface quality, or material properties (“In a fourth variant, S400 can include determining a preliminary score based on the material of the object at the grasp proposal. For example, S400 can include determining the material of the object (e.g., using RGB imagery), and assigning a more favorable score to grasp proposals with object material more suitable for grasping (e.g., material suitable for strong suction seal), such as nonporous surfaces or flat surfaces” [0043]. Thus, there is a scoring criterion which depends on a material property and/or geometry of the item to be gripped.); - energy consumption to be expected when gripping the item at the gripping point candidate (“The heuristic score can be indicative of trajectory success, efficiency, speed, and/or any other suitable metric” [0059]. Thus, there is a heuristic based on efficiency and speed, i.e., energy consumption to be expected.); and/or Regarding claim 5, Ku ‘601 as modified by Hickman (references made to Ku ‘601) teaches the computer-implemented method according to claim 4, wherein selecting the gripping point candidate as the target gripping point comprises: - receiving gripping point selection criteria data which represent a user-specified selection of one or more of the gripping point selection criteria, and selecting the gripping point candidate as the target gripping point depending on the selected gripping point selection criterion or selected gripping point selection criteria (“Selecting a grasp S40 functions to determine a grasp proposal from the set based on a final score associated with each grasp proposal. 
The final score can be a combined score (e.g., scaled, weighted) based on the preliminary score and the one or more heuristic scores, based only on the preliminary score, based only on one heuristic score, based only on multiple heuristic scores, based only on the grasp score, and/or based on any other suitable score or combination thereof” [0067]. Thus, the selection occurs based on a specified scoring criteria which can be based on a preliminary score, one or more heuristic scores, or the grasp score.); and/or - receiving gripping point selection criteria weighting data which represents a user-specified weighting of the gripping point selection criteria, and selecting the gripping point candidate as the target gripping point depending on the weighted gripping point selection criteria (“Selecting a grasp S40 functions to determine a grasp proposal from the set based on a final score associated with each grasp proposal. The final score can be a combined score (e.g., scaled, weighted) based on the preliminary score and the one or more heuristic scores, based only on the preliminary score, based only on one heuristic score, based only on multiple heuristic scores, based only on the grasp score, and/or based on any other suitable score or combination thereof” [0067]. Thus, the selection occurs based on a specified scoring criteria which is weighted.). Ku ‘601 does not explicitly state that such final score criteria are user-selected or received. However, given that such scoring criteria are given as alternative methods to be considered, it would have been obvious to one of ordinary skill in the art that the system would require a user-selection for the criteria based on design incentives and the goal of operation (see MPEP 2143.I(F)). 
Regarding claim 6, Ku ‘601 as modified by Hickman (references made to Ku ‘601) teaches the computer-implemented method according to claim 5, wherein the determined gripping point candidates are evaluated depending on gripping point selection criteria and sorted in a ranking list (“The system can for example rank and filter the predetermined grasps according to their scores” [0079]. Thus, the grasps are ranked according to their scores, i.e., gripping point selection criteria.), wherein a gripping point candidate is selected as the target gripping point depending on a position of the gripping point candidate in the ranking list, wherein the uppermost gripping point candidate in the ranking list is selected as the target gripping point (“The system can then select a single predetermined grasp having the highest score” [0079]. Thus, the target gripping point is selected based on a position of the candidate in the ranking list wherein the candidate with the highest score, i.e., uppermost candidate in the list, is selected.). Regarding claim 7, Ku ‘601 as modified by Hickman (references made to Hickman) teaches the computer-implemented method according to claim 1, wherein the determination of the target gripping point also comprising: b0) selecting the two or more gripping point determination algorithms from a set Ma of available gripping point determination algorithms (“Likewise, the cloud computing system 302 could query the machine vision knowledge base 303 to select a set of candidate algorithms and/or parameter sets for use in a multiple algorithm analysis described herein” (C17,L47-50). 
Thus, the machine vision knowledge base, i.e., set Ma of available algorithms, is queried to select two or more algorithms to be used in the multiple algorithm analysis.), wherein the two or more gripping point determination algorithms are selected from the set Ma depending on at least one specified algorithm selection criterion (“Alternatively, when the cloud processing engine 302 receives environmental data, object data, and/or task data associated with a particular image from the robot 301, the cloud processing engine 302 may search or query the machine vision knowledge base 303 to identify one or more candidate algorithms and parameter sets that are correlated with the same or similar environmental data, object data, and/or task data received from the robot 301 in the image data 304” (C19,L37-45). Thus, the algorithms are selected from the knowledge base, i.e., set Ma, based on environmental data, object data, and/or task data which will serve as the algorithm selection criterion.). Regarding claim 8, Ku ‘601 as modified by Hickman (references made to Hickman) teaches the computer-implemented method according to claim 7, wherein selecting the gripping point candidate as the target gripping point comprises: b2.1) selecting one of the two or more gripping point determination algorithms as a target evaluation algorithm depending on at least one specified algorithm selection criterion (“After generating the plurality of quality scores 310, 313, and 316, the cloud processing engine 302 determines which quality score is the highest. After determining the highest quality score, the cloud processing engine 302 selects the image processing algorithm and the parameter set that was used to generate the image processing result having the highest score, and then the cloud processing engine 302 sends an indication of the selected algorithm and parameter set to the robot 301 via response 317. 
In situations where multiple image processing results have the highest score, the cloud processing engine 302 may be configured to select one of the multiple highest scoring results based in part on environmental data, task data, and/or object data (as previously described) received from the robot 301” (C16, L23-36). Thus, the target evaluation algorithm with the highest quality score depends on the algorithm selection criteria inclusive of environmental data, task data, and object data.)… Ku ‘601 as modified does not explicitly teach …b2.2) selecting one of the gripping point candidates determined by the target evaluation algorithm as the target gripping point depending on the at least one gripping point selection criterion, or selecting the gripping point candidate determined by the target evaluation algorithm as the target gripping point. However, provided that the modification of Ku ‘601 in view of Hickman combines the gripping point determinations of Ku ‘601 with the algorithm determinations of Hickman such that each algorithm iteration provides a gripping point candidate which is reflected by a quality score (as taught by both Ku ‘601 and Hickman), it would be implied that the selected target algorithm would additionally select a target gripping point which has been determined by the selected target algorithm. Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, that Ku ‘601 as modified by Hickman additionally teaches “selecting the gripping point candidate determined by the target evaluation algorithm as the target gripping point” with a reasonable expectation of success. 
Regarding claim 9, Ku ‘601 as modified by Hickman (references made to Hickman) teaches the computer-implemented method according to claim 8, wherein the at least one specified algorithm selection criterion comprises one or more of the following selection criteria: - a property or type of the item to be gripped, wherein the property or type of the item is based on at least one of the item’s geometry, surface quality, or material properties (“For example, if a robot needs to locate and retrieve a particular object based in part on its color (e.g., a red pen, a blue pen, a yellow highlighter, etc.), then it may be desirable for the robot to use an image processing algorithm configured to discern fine differences between small objects with a set of corresponding image processing parameters designed to detect the desired color. Similarly, if the robot needs to locate and retrieve a particular object based in part an image on the surface of the object (e.g., a coffee cup with a particular design or logo), then it may be desirable for the robot to use a different image processing algorithm configured with a set of corresponding image processing parameters to help the robot's machine vision system identify the particular design or logo on the cup. Additionally, if the robot needs to read a bar code, QR code or other type of code on the surface of an object, then it may be desirable for the robot to use yet a different image processing algorithm configured with a set of corresponding image processing parameters to help the robot's machine vision system pick out the details of the code from other information on the surface object” (C11,L25-44). 
Thus, the object data determined as the selection criteria for determining a target algorithm may be inclusive of a property or type with a given surface quality as is exemplified above.); Regarding claim 10, … the computer-implemented method according to claim 9, wherein the selection of the gripping point determination algorithm as the target evaluation algorithm comprises: - receiving algorithm selection criteria data which represent a user-specified selection of one or more of the algorithm selection criteria, and selecting the gripping point determination algorithm as the target evaluation algorithm depending on the selected algorithm selection criterion or selected algorithm selection criteria (“Many different types of client devices may be configured to communicate with components of the cloud computing system 102 for the purpose of accessing data and executing applications provided by the cloud computing system 102. For example, a computer 112, a mobile device 114, a host 116, and a robot client 118 are shown as examples of the types of client devices that may be configured to communicate with the cloud computing system 102” (C5, L60-67). “Additionally, any of the client devices may also include a user-interface (UI) configured to allow a user to interact with the client device. For example, the robot client 118 may include various buttons and/or a touchscreen interface configured to receive commands from a human or provide output information to a human. As another example, the robot client 118 may also include a microphone configured to receive voice commands from a human. Furthermore, the robot client 118 may also include one or more interfaces that allow various types of user-interface devices to be connected to the robot client 118. 
For example, the mobile device 114, the computer 112, and/or the host 116 may be configured to run a user-interface for sending and receiving information to/from the robot client 118 or otherwise configuring and controlling the robot client 118” (C6, L49-63). Thus, client devices, inclusive of user interfaces which communicate with the robot through the cloud platform, are used to issue specific task commands, such as the object retrieval determinations described in the previous claims. Therefore, the algorithm selection criteria which determine the quality scores are based on a user-selected task and object data, and thus, the user-selection aids the determination of the target algorithm as previously described.); and/or Regarding claim 11, Ku ‘601 in view of Hickman (references cited directly) teaches the computer-implemented method according to claim 10, wherein the selection of the target evaluation algorithm comprises: b2.1.1) selecting one of the two or more gripping point determination algorithms as a test algorithm, wherein the test algorithm is selected depending on at least one of the specified algorithm selection criteria (“In some embodiments, rather than selecting the image processing result having the highest quality score, the cloud processing engine 302 may instead select an image processing result having a quality score that meets or exceeds a minimum quality score threshold” (Hickman, C16, L36-40). Thus, the algorithm yielding an image processing result which meets a minimum quality score threshold will be considered as the test algorithm.); b2.1.2) specifying a gripping point rejection criterion or a plurality of gripping point rejection criteria (“Optionally adjusting each grasp proposal S500 functions to adjust each grasp proposal until a predetermined condition is satisfied” (Ku ‘601, [0046]). “In a third variant, S500 can include removing grasp proposals of the set that fail the condition from consideration for subsequent steps” (Ku ‘601, [0049]). 
Thus, the predetermined conditions may be considered as gripping point rejection criteria wherein a gripping point is rejected from further consideration when the condition is not met.); and b2.1.3) checking whether the at least one gripping point candidate determined by the test algorithm meets the specified gripping point rejection criterion or meets one of the specified gripping point rejection criteria, wherein, when the at least one gripping point candidate does not meet a gripping point rejection criteria, the test algorithm is selected as the target evaluation algorithm (“There are multiple ways that the robot 301 can determine that its machine vision system is starting to degrade. For example, any set of one or more of the following conditions could be sufficient to trigger the process of obtaining and sending new image data (perhaps along with additional data) to the cloud processing system 302 for selecting a new parameter set or a new algorithm and parameter set: (i) the quality score of an image processing result falls below a threshold quality score…” (Hickman, C17, L23-31). Thus, when the quality score threshold is not met, i.e., a predetermined condition is not satisfied, a new target evaluation algorithm will be considered. Otherwise, the robot may continue to use the selected algorithm, and thus will determine the test algorithm as a target algorithm when the quality score does not fall beneath the threshold.). Therefore, it would have been obvious to one of ordinary skill in the art to have modified the rejection criteria of Ku ‘601 to be inclusive of quality score criteria as determined by Hickman to reflect a test algorithm selection protocol with a reasonable expectation of success. One of ordinary skill in the art would have been motivated to combine such teachings because early detection of degradation of a given image processing algorithm would mitigate failures in the gripping determination system. 
Regarding claim 12, Ku ‘601 as modified by Hickman (references made to Hickman) teaches the computer-implemented method according to claim 11, wherein when the at least one gripping point candidate meets the specified gripping point rejection criterion or one of the plurality of specified gripping point rejection criteria, steps b2.1.1) to b2.1.3) are repeated, wherein in step b2.1.1) a different gripping point determination algorithm is selected as a test algorithm (C17, L5-50 describes that when degradation in the quality result of a given algorithm occurs, a new algorithm is selected as the next algorithm, i.e., a test algorithm, and the process for degradation analysis may be repeated.). Regarding claim 13, Ku ‘601 as modified by Hickman (references made to Ku ‘601) teaches the computer-implemented method according to claim 12, wherein the specified gripping point rejection criterion is one of the following criteria, or wherein the plurality of specified gripping point rejection criteria comprise one or more of the following criteria: a. gripping point candidate cannot be approached by the at least one end effector; and/or b. the end effector approaching the gripping point candidate would lead to a collision of the end effector with another item with a given probability (“Optionally adjusting each grasp proposal S500 functions to adjust each grasp proposal until a predetermined condition is satisfied. The condition can be: a grasp vector does not collide with a predetermined workspace feature (e.g., lip of a box), a grasp vector is within an interval distance of a predetermined workspace feature, and/or any other suitable condition” [0046]. Thus, grasps which are determined to be obstructed, i.e., unapproachable, and/or which would lead to a collision are adjusted, i.e., preliminarily rejected.). 
Regarding claim 15, Ku ‘601 as modified by Hickman (references made to Ku ‘601) teaches the computer-implemented method of claim 8, wherein the handling system comprises at least one of: a plurality of end effectors which can be coupled to the at least one robot (“The robot 220 functions to manipulate an object. The robot can include one or more: end effectors 222, robotic arms 224, and/or any other suitable components. The end effector 222 can be: a suction cup, a gripper, and/or any other suitable end effector” [0022]. Thus, the robot may be coupled to a plurality of end effectors including a suction cup, gripper, or other such end effectors.), wherein either the selection of a gripping point candidate as the target gripping point or the selection of a gripping point determination algorithm as the target evaluation algorithm depends on which of the plurality of end effectors is coupled to the at least one robot (“The robot 220 and/or end effector 222 can be associated with an active surface (e.g., grasping region, contact region, etc.). The active surface can be associated with a projection of the end effector’s active surface (e.g., “a grasp window”) onto a plane (e.g., x-y plane, vertical plane, etc.). The grasp window (e.g., “window”) for each end effector, each grasp pose (e.g., the template grasp pose, etc.) can be known and/or predetermined for one or more end effector orientations (e.g., determined from the set of grasp proposals), but can alternatively be dynamically determined and/or otherwise determined. In a first example where a template grasp orientation is used for grasp proposal generation, the grasp window for the end effector can be determined once (e.g., based on the template grasp orientation) and reused for multiple grasps. In a second example, a different grasp window can be calculated for each grasp proposal (e.g., predetermined or determined based on scene information). However, the robot can be otherwise configured” [0023]. 
Thus, in each example, the selection of a gripping point candidate is determined based on which of the plurality of end effectors is coupled to the robot, as each end effector has a specific grasp window which may otherwise be manipulated according to the type of end effector.), or

Regarding claim 16, Ku ‘601 as modified by Hickman (references made to Ku ‘601) teaches a handling system, comprising: - at least one robot on which an end effector for gripping an item is arranged (“The robot 220 functions to manipulate an object. The robot can include one or more: end effectors 222, robotic arms 224, and/or any other suitable components” [0022]. Thus, there is an end effector arranged on a robot which is for manipulating objects, i.e., gripping an item.); - a detection device comprising at least one detection unit, or a camera, which is designed to capture an image of an item to be gripped (“The method can be performed using a system of one or more computers in one or more locations configured to control a robot having grasping capabilities (an example of which is shown in FIG. 2) including one or more: robots 220, sensors, computing systems 230, sensors 240, and/or any other suitable components” [0021]. “The measurement can be an image, depth information, point clouds, video, and/or any other suitable measurement. For example, S100 can include: moving a robot arm in front of a constrained volume containing objects, capturing an image (e.g., and/or depth information) with a camera, wherein the camera is mounted in front of the gripper” [0031].
Thus, there is a detection device having at least one detection unit, or camera, which is designed to capture an image of an item to be gripped, per the disclosed method and S100.); and - a control device for controlling the handling system, wherein the control device comprises a data processing system and a non-volatile memory device, wherein a computer program is stored on the non-volatile memory device which comprises commands which, when executed by the data processing system, cause the data processing system to execute the method according to claim 1 (“The computing system 230 functions to perform one or more steps of the method, but can additionally and/or alternatively provide any other suitable functionality. The computing system 230 can be local to the robot, remote, and/or otherwise located” [0025]. “Embodiments of the subject matter described in this specification can be implemented as one or more computer programs, i.e., one or more modules of computer program instructions encoded on a tangible non-transitory storage medium for execution by, or to control the operation of, data processing apparatus” [0084]. Thus, there is a control device with a data processing apparatus and a non-volatile memory which executes the disclosed methods.).

Regarding claim 17, Ku ‘601 as modified by Hickman (references made to Ku ‘601) teaches the handling system according to claim 16, wherein the end effector is a suction gripping apparatus, an elastomer suction gripper, or a vacuum gripper (“The end effector 222 can be: a suction cup” [0022]. Thus, the end effector is a suction gripping apparatus.).
Regarding claim 18, Ku ‘601 as modified by Hickman (references made to Ku ‘601) teaches a computer program product comprising commands which, when the commands are executed by a computer, cause the computer to execute the steps of the computer-implemented method claim 1 (“Embodiments of the subject matter described in this specification can be implemented as one or more computer programs, i.e., one or more modules of computer program instructions encoded on a tangible non-transitory storage medium for execution by, or to control the operation of, data processing apparatus” [0084]. Thus, there is a computer program product stored on a non-transitory medium which performs the disclosed methods.).

Regarding claim 19, Ku ‘601 teaches a handling system (“The method can be performed using a system of one or more computers in one or more locations configured to control a robot having grasping capabilities (an example of which is shown in FIG. 2) including one or more: robots 220, sensors, computing systems 230, sensors 240, and/or any other suitable components” [0021]. Thus, the disclosure is directed to a grasping robot, i.e., handling system.) comprising: - at least one robot on which an end effector for gripping an item is arranged (“The robot 220 functions to manipulate an object. The robot can include one or more: end effectors 222, robotic arms 224, and/or any other suitable components” [0022]. Thus, there is a robot with an end effector which manipulates, i.e., grips, an object, i.e., item.); - a detection device comprising at least one detection unit which is designed to capture an image of an item to be gripped (“The sensors 240 function to sample measurements of a physical scene. The sensors 240 can include: visual sensors (e.g., monocular cameras, stereo cameras, projected light systems, TOF systems, etc.), acoustic sensors, actuation feedback systems, and/or any other suitable sensors” [0024].
Thus, there are visual sensors which sample measurements of a physical scene, i.e., detection units which are designed to capture an image of the item(s) to be gripped. Such visual sensors which are listed satisfy the 112f claim interpretation of detection unit above, in which such sensors are inclusive of cameras.); and - a control device for controlling the handling system, wherein the control device comprises a data processing system and a non-volatile memory device (“The computing system 230 functions to perform one or more steps of the method, but can additionally and/or alternatively provide any other suitable functionality. The computing system 230 can be local to the robot, remote, and/or otherwise located” [0025]. “Embodiments of the subject matter described in this specification can be implemented as one or more computer programs, i.e., one or more modules of computer program instructions encoded on a tangible non-transitory storage medium for execution by, or to control the operation of, data processing apparatus” [0084]. Thus, the computing system which controls the handling system according to the methods of the disclosure comprises computer programs stored on a non-transitory storage medium, i.e., non-volatile memory device, which are further executed by a data processing apparatus, i.e., data processing system.), wherein a computer program is stored on the non-volatile memory device which comprises commands which, when executed by the data processing system, cause the data processing system to execute one or more control cycles (“Embodiments of the subject matter described in this specification can be implemented as one or more computer programs, i.e., one or more modules of computer program instructions encoded on a tangible non-transitory storage medium for execution by, or to control the operation of, data processing apparatus” [0084]. Thus, there is a computer program product stored on a non-transitory medium which performs the disclosed methods. 
“All or portions of the method can be performed once, iteratively, repeatedly (e.g., for different objects, for different physical scenes, for different time frames, for different sensors), periodically, and/or otherwise performed” [0029]. Thus, provided that the method is performed once, iteratively, periodically, or repeatedly, the method performs one or more control cycles.), wherein each control cycle comprising: a) receiving image data that represent an image of at least one portion of the item to be gripped, which image is captured by means of the detection device (“S100 functions to determine a measurement of a physical scene having a container 410 that contains one or more objects 420 to be grasped; example shown in FIG. 4. The measurement can include one measurement, multiple measurements, and/or any other suitable number of measurements. The measurement can be captured by a sensor, retrieved from a database, and/or otherwise determined. The measurement can be an image, depth information, point clouds, video, and/or any other suitable measurement” [0031]. Thus, the method determines a measurement of the container which contains objects to be grasped, and such a received measurement is an image which is captured by a sensor, i.e., the detection device.), b) determining a target gripping point for the end effector on the item, comprising analyzing the image data (“Selecting a grasp S40 functions to determine a grasp proposal from the set based on a final score associated with each grasp proposal” [0067]. Thus, there is a selected grasp proposal, i.e., target gripping point, which is a result of determining a set of grasp proposals (see Fig. 
3, S300) via analyzing the image data.), and c) generating control signals which cause the at least one robot to grip the item at the target gripping point by means of the end effector (“Executing the grasp trajectory S50 functions to move the robotic arm and/or end effector to grasp an object in the physical scene based on the calculated grasp trajectory associated to the selected grasp proposal” [0069]. Thus, there is a step in the method which moves the end effector to grasp the object based on the selected grasp proposal, i.e., generates control signals causing the robot to grip the item at the target gripping point by means of the end effector.), the determination of the target gripping point comprises: b1) analyzing the image data by (“Determining a set of grasp proposals across the workspace S300 functions to determine grasp proposals within the workspace. Each grasp proposal can be associated with a virtual projection of the end effector onto the physical scene (e.g., a “window”), associated with an end effector pose (e.g., location and orientation; x, y, z position and α, β, γ orientation), and/or associated with any other suitable end effector attribute. Grasp proposals can be: predetermined, dynamically determined, randomly determined, and/or otherwise determined” [0033]. Thus, a set of gripping point candidates is determined using any of a variety of algorithms to project grasp proposals into the workspace. The workspace is determined in S100 and S200 based on a captured image (see [0031-0032]).); and b2) selecting a gripping point candidate from the set Me of determined gripping point candidates as the target gripping point depending on one or more specified gripping point selection criteria (“Selecting a grasp S40 functions to determine a grasp proposal from the set based on a final score associated with each grasp proposal… In a first variant, S40 can include selecting a grasp proposal based on the preliminary score.
In a second variant, S40 can include selecting a grasp proposal based on one or more heuristic scores. In a third variant, S40 can include selecting a grasp proposal based on a combination of the preliminary and heuristic scores. In a fourth variant, S40 can include selecting a grasp proposal based on the grasp score or a combination with preliminary score and/or heuristic score” [0067]. A target gripping point is determined in S40 depending on a preliminary score, one or more heuristic scores, or any combination of such scores.). However, Ku ‘601 does not explicitly teach …analyzing the image data by two or more mutually independent gripping point determination algorithms… Hickman, in the same field of endeavor, teaches …analyzing the image data by two or more mutually independent (“As described above, to determine an appropriate image processing algorithm and corresponding parameter set for the robot 301, the cloud processing engine 302 shown in example 300 applies "n" different image processing algorithms 305, 306, 307 to the image included in the image data 304 received from the robot 301. Example 300 also shows each algorithm of the plurality of algorithms 305, 306, 307 being applied to the image multiple times. In operation, each application of the image processing algorithm to the image is executed with a different set of image processing parameters (which may include some similar or same parameters), and each application of a particular algorithm configured with a corresponding parameter set yields a different image processing result” (C14,L48-60). Thus, there are two or more mutually independent algorithms which analyze the image data.)… Hickman does not explicitly apply the algorithms to gripping point determinations, only general image processing.
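For illustration only, Hickman's application of several mutually independent algorithms to the same image, combined with Ku ‘601's score-based selection in S40, can be sketched as follows. The score fields, weights, and function names are hypothetical, not drawn from either reference.

```python
def combined_score(proposal, weights=(0.5, 0.5)):
    """Fourth-variant-style figure of merit: a weighted combination of a
    preliminary score and a heuristic score (weights are hypothetical)."""
    w_pre, w_heur = weights
    return w_pre * proposal["preliminary"] + w_heur * proposal["heuristic"]


def select_grasp(algorithms, image):
    """Run mutually independent proposal algorithms on the same image data
    and keep the proposal with the highest combined score."""
    proposals = [alg(image) for alg in algorithms]
    return max(proposals, key=combined_score)
```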
However, given that Ku ‘601 teaches an analysis of gripping point candidates with each iteration of image analysis, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have modified the gripping point determinations of Ku ‘601 to include image analysis with a plurality of algorithms as taught by Hickman with a reasonable expectation of success. One of ordinary skill in the art would have been motivated to make this modification because higher quality image processing results would lead to a higher quality grasp proposal due to higher quality attribute detections in the respective image of the object to be gripped (Hickman, (C2,L63-C3,L14)). Such a modification would additionally be a combination of known methods which yield predictable results (see MPEP 2143.I(A)).

Claim 14 is rejected under 35 U.S.C. 103 as being unpatentable over Ku ‘601 in view of Hickman and further in view of Ku et al. (US 2022/0016767 A1; hereinafter “Ku ‘767”).

Regarding claim 14, Ku ‘601 as modified by Hickman teaches the computer-implemented method according to claim 1. However, Ku ‘601 as modified does not teach …wherein the handling system comprises a monitoring device which is designed to monitor the gripping of the item at the target gripping point, the method comprising receiving gripping success data generated by means of the monitoring device, which represent a gripping success when gripping the item at the target gripping point, wherein a probability of success is determined from the gripping success data, which represents a gripping success to be expected when gripping an item at a gripping point candidate determined by the gripping point determination algorithm which has determined the gripping point candidate selected as the target gripping point, wherein the probability of success in a subsequent control cycle forms a gripping point selection criterion and/or an algorithm selection criterion.
Ku ‘767, pertinent to the problem at hand, teaches … wherein the handling system comprises a monitoring device which is designed to monitor the gripping of the item at the target gripping point (“Actuation feedback sensors of the actuation feedback system preferably function to enable control of the robot arm (and/or joints therein) and/or the end effector, but can additionally or alternatively be used to determine the outcome (e.g., success or failure) of a grasp attempt” [0032]. Thus, actuation feedback sensors monitor the gripping of an item, specifically the gripping outcome of a grasp attempt at a target gripping point.), the method comprising receiving gripping success data generated by means of the monitoring device, which represent a gripping success when gripping the item at the target gripping point, wherein a probability of success is determined from the gripping success data, which represents a gripping success to be expected when gripping an item at a gripping point candidate determined by the gripping point determination algorithm which has determined the gripping point candidate selected as the target gripping point (“The predetermined grasp probability score can be determined based on the number of grasp attempt successes and the number of grasp attempt failures (e.g., success divided by total grasp outcomes, failure divided by total grasp outcomes, etc.)” [0040]. Thus, there is a number of grasp attempt successes, i.e., gripping success data, which represent a gripping success when gripping the item at the target gripping point. Subsequently, the grasp probability score, i.e., probability of success, is determined by such attempt successes and determines the likelihood of success to be expected when gripping an item at a gripping point candidate which is selected as the target gripping point (see Fig. 
3 and associated descriptions for more details).), wherein the probability of success in a subsequent control cycle forms a gripping point selection criterion and/or an algorithm selection criterion (“In this variant, the grasp probability score for each subregion is predetermined, wherein the grasp locations for each subregion (determined in S400) can be weighted, retained, and/or removed to calculate the graspability score based on the subregion's grasp probability. The subregion's grasp probability score is preferably determined based on empirical grasping data (e.g., historical success/failure of grasps within the given subregion; success rate for the given subregion; etc.) but can be otherwise determined” [0088]. “Selecting a candidate grasp location S500 can function to select a grasp location that is most likely to result in a grasp success. The candidate grasp location is preferably the grasp location corresponding to the highest graspability score” [0108]. Thus, the probability of success determines a graspability score which is the selection criterion for selecting a candidate grasp location.).

Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have modified the gripping point selection criteria of Ku ‘601 to include the probabilities of success as taught by Ku ‘767 with a reasonable expectation of success. One of ordinary skill in the art would have been motivated to make such a modification because by basing the selection criteria on empirical success data, the grasping methods can function to increase the accuracy of grasping an object or additionally increase the efficiency or speed at which the object may be grasped (Ku ‘767, [0015]).

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
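For illustration only, the empirical success-probability computation quoted above from Ku ‘767 (successes divided by total grasp outcomes, then used as a selection criterion) can be sketched as follows. The record-field and function names are hypothetical, not drawn from the reference.

```python
def grasp_probability(successes, failures):
    """Ku '767-style score: grasp attempt successes divided by total
    grasp outcomes; 0.0 when no outcomes have been recorded yet."""
    total = successes + failures
    return successes / total if total else 0.0


def select_candidate(candidates, history):
    """Prefer the candidate whose determining algorithm has the best
    empirical success rate in prior control cycles (hypothetical fields)."""
    return max(
        candidates,
        key=lambda c: grasp_probability(*history[c["algorithm"]]),
    )
```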
Other references which were considered by Examiner in evaluating the current claims but were not relied upon are included in the attached “Notice of References Cited” form (i.e., PTO-892).

Any inquiry concerning this communication or earlier communications from the examiner should be directed to SIDNEY L MOLNAR whose telephone number is (571)272-2276. The examiner can normally be reached 8 A.M. to 3 P.M. EST Monday-Friday.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Jonathan (Wade) Miles can be reached at (571) 270-7777. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/S.L.M./Examiner, Art Unit 3656
/WADE MILES/Supervisory Patent Examiner, Art Unit 3656

Prosecution Timeline

Apr 09, 2024
Application Filed
Feb 02, 2026
Non-Final Rejection — §101, §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12600039
ROBOT, CONVEYING SYSTEM, AND ROBOT-CONTROLLING METHOD
2y 5m to grant Granted Apr 14, 2026
Patent 12533807
ROBOTIC APPARATUS AND CONTROL METHOD THEREOF
2y 5m to grant Granted Jan 27, 2026
Patent 12479098
SURGICAL ROBOTIC SYSTEM WITH ACCESS PORT STORAGE
2y 5m to grant Granted Nov 25, 2025
Patent 12384048
TRANSFER APPARATUS
2y 5m to grant Granted Aug 12, 2025
Patent 12376922
TOOL HEAD POSTURE ADJUSTMENT METHOD, APPARATUS AND READABLE STORAGE MEDIUM
2y 5m to grant Granted Aug 05, 2025
Based on 5 most recent grants.


Prosecution Projections

1-2
Expected OA Rounds
54%
Grant Probability
99%
With Interview (+85.7%)
2y 4m
Median Time to Grant
Low
PTA Risk
Based on 13 resolved cases by this examiner. Grant probability derived from career allow rate.
