Prosecution Insights
Last updated: April 19, 2026
Application No. 18/713,338

MOBILE BODY AND POSITIONING CONTROL METHOD

Final Rejection under §102, §103, and §112

Filed: May 24, 2024
Examiner: GAMMON, MATTHEW CHRISTOPHER
Art Unit: 3657
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Sony Group Corporation
OA Round: 2 (Final)

Predictions:
Grant Probability: 65% (Moderate)
Expected OA Rounds: 3-4
Expected Time to Grant: 2y 9m
Grant Probability with Interview: 88%

Examiner Intelligence

Career Allow Rate: 65% (grants 65% of resolved cases; 66 granted / 102 resolved; +12.7% vs TC avg)
Interview Lift: +23.4% (strong lift, comparing resolved cases with vs. without interview)
Typical Timeline: 2y 9m average prosecution; 32 applications currently pending
Career History: 134 total applications across all art units
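The headline figures above can be cross-checked from the report's own counts. The sketch below is illustrative arithmetic only; the rounding conventions and the exact with-interview rate are assumptions, and the displayed +23.4% lift suggests the underlying rates carry more precision than the rounded figures shown.

```python
# Cross-check the examiner metrics from the raw counts reported above.
# Assumption: "interview lift" is the with-interview allowance rate minus
# the career allow rate (both in percentage points).

granted = 66       # granted cases, from the report
resolved = 102     # resolved cases, from the report

career_allow_rate = granted / resolved * 100        # about 64.7%, shown as 65%
with_interview_rate = 88.0                          # reported allowance with interview
interview_lift = with_interview_rate - career_allow_rate  # about +23.3 points

print(f"career allow rate: {career_allow_rate:.1f}%")   # 64.7%
print(f"interview lift:    {interview_lift:+.1f} pts")  # +23.3 pts
```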

Statute-Specific Performance

§101: 7.4% (-32.6% vs TC avg)
§103: 32.4% (-7.6% vs TC avg)
§102: 26.8% (-13.2% vs TC avg)
§112: 31.1% (-8.9% vs TC avg)

Black line in the original chart = Tech Center average estimate. Based on career data from 102 resolved cases.
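Reading each delta as a simple difference between the examiner's rate and the Tech Center average (an assumption about how the report computes them), the implied TC average is the same for every statute:

```python
# Recover the implied Tech Center average from each statute's reported
# rate and its delta "vs TC avg". Figures are the report's own; treating
# the delta as (examiner rate - TC average) is an assumption.

reported = {  # statute: (examiner rate %, delta vs TC avg in points)
    "101": (7.4, -32.6),
    "103": (32.4, -7.6),
    "102": (26.8, -13.2),
    "112": (31.1, -8.9),
}

implied_tc_avg = {s: round(rate - delta, 1) for s, (rate, delta) in reported.items()}
print(implied_tc_avg)  # every statute implies a TC average of 40.0
```

That all four deltas point to the same 40% baseline is consistent with the chart's black line being a single Tech Center average estimate rather than a per-statute figure.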

Office Action

Rejections under §102, §103, and §112
DETAILED ACTION

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Remarks

Claim Objections: The objections to the claims provided in the Office Action dated 09/02/2025 are withdrawn in light of Applicant’s amendments.

Specification: The objections to the specification provided in the Office Action dated 09/02/2025 are withdrawn in light of Applicant’s amendments.

Claim Interpretation - 35 USC § 112(f): The interpretations under 35 USC § 112(f) provided in the Office Action dated 09/02/2025 are withdrawn in light of Applicant’s amendments and comments.

Claim Rejections - 35 USC § 112(b): The rejections of the claims provided in the Office Action dated 09/02/2025 are withdrawn in light of Applicant’s amendments, except for those partially or generally repeated in the updated set of rejections below. For example, while it is clear to the Examiner that corrective actions were taken in Claim 4, the claim remains replete with unclear language and the rejections are generally maintained.

Claim Rejections - 35 USC § 102: Applicant’s arguments have been considered but are moot because the new ground of rejection provided below under 35 USC § 103 does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the arguments. While Applicant states that “Bingham, at best, describes switching between the short-range time-of-flight sensor and the long-range time-of-flight sensor based on whether the grip is open or closed” on Pages 19 and 20 of Applicant’s Remarks filed 12/02/2025, what Bingham actually discloses in [0044] is that switching may occur based on “the gripper state”. The nature of “gripper state” is not explicitly disclosed. It is suggested that at minimum “gripper state” is inclusive of at least the states “open” and “closed”, but is not limited thereto.
The phrase “gripper state” holds broad meaning to one of ordinary skill in the art. Additionally, while Bingham clearly illustrates and discloses that the fields of view of the sensors overlap from zero to some distance (see, for example, [0140] and Figure 11), Bingham explicitly states, in the sentence preceding the discussion of the gripper being “open” or “closed”, that the sensors may be “used” in specific ranges, with an example of 0 – 10 cm and 10 – 100 cm. Furthermore, Bingham clearly indicates, and it is common knowledge, that one sensor modality, or even a given sensor configured for a particular range, can have greater fidelity than another at particular ranges and distances ([0040] and [0141] relate). It therefore appears that Bingham merely fails to disclose with explicit clarity the implied combination of using the sensor having the most fidelity for a given range based on a current measured value, where the distance of 10 cm is an example threshold at which the gripper grasps an object. Consequently, a rejection could be provided based simply on an obvious combination of Bingham alone.

Furthermore, and related to the updated rejection that is actually provided below, it is common knowledge that grasping tasks involve opening and closing the gripper, and that said opening/closing is performed with respect to a target object once the object is sufficiently close. This form of operation may be observed in the basic actions taken when a human grasps an object; the object is only grasped if the hand is closed while the object is within reach. Consequently, under this basic mode of operation the open and closed gripper states occur under the same distance threshold, rendering Bingham’s operation, under such an explicit teaching, as performing the claimed limitations. This form of operation is explicitly disclosed in clear terms in Hinkle (US 10471591 B1), upon which the updated rejection under 35 U.S.C. 103 provided below relies.
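The implied combination described above, using whichever sensor has the most fidelity at the current measured range, with 10 cm as the example hand-off threshold, can be sketched as follows. This is an illustrative sketch only: neither Bingham nor Hinkle publishes code, the range values are Bingham's examples from [0044], and the function and sensor names are hypothetical.

```python
# Illustrative sketch of range-based sensor selection. The 0-10 cm and
# 10-200 cm ranges are Bingham's example values ([0044]); names are
# hypothetical, not drawn from either reference.

SHORT_RANGE_MAX_CM = 10.0   # example hand-off threshold (also the example grasp distance)
LONG_RANGE_MAX_CM = 200.0

def select_sensor(measured_distance_cm: float) -> str:
    """Use the sensor with the most fidelity at the current measured range."""
    if measured_distance_cm <= SHORT_RANGE_MAX_CM:
        return "short_range_tof"   # highest fidelity close in
    if measured_distance_cm <= LONG_RANGE_MAX_CM:
        return "long_range_tof"    # covers the far field
    return "out_of_range"
```

On the Examiner's reading, the grasp action coincides with the short-range regime: the gripper closes only once the measured distance falls at or below the 10 cm hand-off point.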
Additionally, cascading levels of sensor use and control from macro to micro control are also clearly demonstrated in Smith (US 20100256814 A1); see, for example, the flowchart in Figure 5. Similarly, Kawamae (US 20240027617 A1) demonstrates switching between appropriate ranging detection based on the distance to an object. In general, the concept appears to amount to simply “use the right tool for the job” in the context of proximity sensors, wherein Bingham already discloses most of the requisite context without reliance even on the knowledge of one of ordinary skill in the art.

Claim Rejections - 35 USC § 103: Applicant’s arguments appear to rely on the arguments presented with respect to 35 USC § 102, which are already addressed above.

Claim Objections

Claims 6 and 15 are objected to because of the following informalities: Claim 6 recites the limitation “based on decrease”; it should read “based on a decrease”. Claim 15 recites the limitation “the first sensor is same”; it should read “the first sensor is the same”. Appropriate correction is required.

Claim Interpretation

General Notes: The terms “first”, “second”, and “third” carry no inherent limitation and serve only as a basis for further limitation, for example, easy reference to different items. Claims 1 and 18 exhibit unusual phrasing in the recitation of “the distance from the part of the mobile body to the object that is one of equal to or less than the first threshold”. Examiner notes that it appears to require the determining step/action in the claim, but is not technically dependent from it.

Claim Rejections - 35 USC § 112(a)

The following is a quotation of the first paragraph of 35 U.S.C.
112(a):

(a) IN GENERAL.—The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor or joint inventor of carrying out the invention.

The following is a quotation of the first paragraph of pre-AIA 35 U.S.C. 112:

The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor of carrying out his invention.

Claims 5 and 8 are rejected under 35 U.S.C. 112(a) or 35 U.S.C. 112 (pre-AIA), first paragraph, as failing to comply with the written description requirement. The claims contain subject matter which was not described in the specification in such a way as to reasonably convey to one skilled in the relevant art that the inventor or a joint inventor, or for applications subject to pre-AIA 35 U.S.C. 112, the inventor(s), at the time the application was filed, had possession of the claimed invention.

Regarding Claims 5 and 8, the claims recite the verb “deactivate”, the noun “deactivation”, and similar terms. The disclosure appears to use only the word “stop”, without any recitation as to the particular nature of “stop”, and never appears to recite the verb “deactivate” or similar. It is believed that “deactivate” and “stop” are not completely equivalent terms; while they may have overlapping meaning, they likewise have non-overlapping scope, which is consequently unsupported.
Therefore, the claims contain subject matter which was not described in the specification in such a way as to reasonably convey to one skilled in the relevant art that the inventor or a joint inventor, or for pre-AIA applications the inventor(s), at the time the application was filed, had possession of the claimed invention.

Claim Rejections - 35 USC § 112(b)

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claims 4 and 13 – 14 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention.

Regarding Claim 4, the claim recites various unclear limitations. Examiner furthermore notes that it is unclear whether there is support for the claimed subject matter. However, inasmuch as even the exact nature of the claim scope is unclear, and in view of the interpretation taken in the interest of compact prosecution, no rejections are presently provided under 35 U.S.C. 112(a).

First, the claim recites the limitation: “the first threshold is a distance from the object to a width that is in a horizontal direction of an irradiation region of the first sensor, the horizontal direction is perpendicular to the part of the mobile body” (emphasis added). This phrasing is especially vague. There is no particular meaning to “a horizontal direction of an irradiation region” on its own.
The limitation that follows, attempting to actually define it, is similarly unclear. The structure of the mobile body is not claimed in any manner, let alone the geometric relationship of the sensor, any fields of view, irradiation regions, etc., to each other or even individually. As one example, for a spherical “part of the mobile body” there is no direction which does not happen to be “perpendicular” to the part in some manner. As another example, for a sharp corner, any number of directions might be considered perpendicular. In general, as the part is presumed to exist in three dimensions, perpendicularity is highly arbitrary and/or trivial. Furthermore, what appears to actually be disclosed is that the “width” is in a plane perpendicular to the sensor field of view axis. Additionally, while there may not be any clear correlation between the sensor and the mobile part, such a direction would not generally be considered perpendicular to the part of the mobile body, but would instead be generally parallel to it. See the illustration in Figure 6, or even Figure 5, cited as support but not appearing actually related to this limitation/feature, wherein the line D2, or the line of the object closest to the part, is generally parallel to the surface of the part on which the sensors are mounted co-planarly. Furthermore, this “distance” claiming appears to potentially have support only if the sensors and part have the specific geometric relationship illustrated in Figure 6 that is not presently claimed (coplanar, flat surface, etc.).

Next, the claim recites the limitation “a size” with respect to areas. Areas already have an inherent size; therefore the inclusion of “size” is either redundant or indicates some further measurement related to the areas, rather than a direct comparison of the areas as would be understood from the disclosure. If “size” simply refers to the area already recited, it is entirely unnecessary.
Otherwise, the limitation appears particularly arbitrary as presently constructed, as no standard for measuring size is provided other than the area itself, and the measure may thus be freely and arbitrarily chosen: for example, only a portion of a length, distance, or area, or a length, distance, or area that includes the recited area.

Next, the claim recites the limitation: “the first area associated with the plurality of second sensors corresponds to an area between visual field of views of two second sensors of the plurality of second sensors”. The nature of “between” is not clearly claimed in any manner. Furthermore, what appears disclosed, and what alone makes sense, is a line having a width spanning between the two fields of view of the sensors, wherein the line is perpendicular to both sensors’ field of view axes or similar. There is no clear illustration or explanation of what an area between two sensors having non-overlapping fields of view would be. Furthermore, the claim makes no clear claimed relationship between “at the width” and the areas and size such that it establishes a clear geometric relationship between the items, particularly in light of the above discussion of “a size”, etc. Furthermore, even within the more specific claiming provided with respect to “the first area” and “the second area”, the term “area” appears inconsistent between the first and second sensor as disclosed, and additionally said second sensor “area” appears highly arbitrary, such that between the two measurement areas the terms appear to be relative terms. In the case of the first sensor, disclosed as a depth camera, only the overlapping area between an irradiation region of a light emitting element and the field of view/visual field of the depth camera would appear to be a literal area that can actually be measured, whereas the “area” or even “width” of the second sensor encompasses area, region, width, etc.,
wherein no measurement can or does take place. Therefore, and alternatively, if an “area” is inclusive of unmeasured area, the term or label appears entirely arbitrary, as there is no objective standard under the phrasing by which one might objectively determine one area equivalent to another. Furthermore, what is considered the measurement area is also unclear, as the sensors measure in a volume and thus any number of planes may be considered with respect to area.

In the interest of compact prosecution, and as the intended scope of the limitations is not entirely clear, particularly in light of the limited supporting disclosure, Examiner has interpreted the claim after “distance” as a whole to not exist, such that it simply reads “wherein the first threshold is set to a distance at which a size of a measurement area of the plurality of second sensors is equal to or larger than a size of a measurement area of the first sensor”.

Examiner notes, in the interest of compact prosecution, that the feature that appears to be attempted to be claimed, if claimed in a manner which is clear, narrow, and supported by the originally filed disclosure, appears to hold potential merit as allowable subject matter. This is not an indication of allowable subject matter, as further search and consideration of any actual claim language is required. Some of what actually appears supported, and may sufficiently bound the claims to a non-arbitrary state, are features of:
- the threshold occurs with respect to a comparison of widths of sensor measurement fields;
- one width corresponds to the actual measured field of view;
- the other width corresponds to a width from one sensor field of view to another;
- all sensors have fields of view along a sensor field of view axis, the field of view axis being perpendicular to the sensor sensing plane;
- all sensor axes are parallel;
- all sensor measurement planes (where they take measurements, e.g., transmitting and receiving data) are generally coplanar;
- each width is within a plane perpendicular to the sensors’ field of view/measurement axis;
- if defining a threshold distance with respect to the mobile body part rather than the plane of the sensors, the distance is measured from a portion of the mobile body generally coplanar with all sensors.

Regarding Claim 13, the claim recites the limitations: “the adjustment of each of the posture and the position of the hand by the control method is in each of a case where: mobile body is controlled based on the first distance data, and the mobile body is controlled based on the second distance data.” It is unclear what these limitations mean. In particular, the phrasing “in each of a case where” is grammatically unusual and unclear. For example, it is not clear whether it recites contingent limitations (MPEP 2111.04(II) relates), refers to the existing “cases” which appear to already occur in Claim 12, or means something else. The best understanding is that the limitations mean: the mobile body is controlled based on the first distance data or the second distance data. However, this does not appear to further narrow the claim from Claim 12, just as the limitations preceding it do not appear to further narrow the claim from Claim 12 (as a CPU adjusting a posture and position of something as claimed in Claim 12 is a “control method”). Consequently, Claim 13 is not presently understood to narrow Claim 12 in any meaningful way, as the only available understanding of the claim does not recite any further narrowing limitations and instead recites the same limitations in a different but not patentably distinct way.

Regarding Claim 14, the claim depends from Claim 13 and inherits the deficiencies of Claim 13. Thus, Claim 14 is rejected under the same logic as Claim 13.

Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C.
102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless –

(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.

Claims 1, 4 – 5, 7 – 13, and 15 – 18 are rejected under 35 U.S.C. 103 as being unpatentable over Bingham et al. (US 20190176348 A1) further in view of Hinkle (US 10471591 B1).

Regarding Claim 1, Bingham teaches: A mobile body (See at least [0109] “FIG. 7 illustrates a sensing device for a robotic gripper, in accordance with example embodiments”) comprising: a first sensor configured to measure three-dimensional information associated with an object (See at least [0109] “The PCB 700 may include sensors including … a long-range time-of-flight sensor 720, and an infrared microcamera 730 arranged on a front side of PCB 700”. Examiner notes that the nature of “three-dimensional information” is particularly broad. Measuring 3D information is generally performed when measurements are taken of a 3D environment. The claim does not specify that the data of the sensor is of a particular nature.
Additionally, the particular nature of the association with the object is not claimed); a second sensor configured to measure a distance from a part of the mobile body to the object (See at least [0109] “The PCB 700 may include sensors including a short-range time-of-flight sensor 710 …”), wherein a first minimum measurement distance of the first sensor is greater than a second minimum measurement distance of the second sensor, and a first visual field of view of the first sensor overlaps with a second visual field of view of the second sensor (See first at least the distinctions of “short” and “long” range in [0109] above. See also at least [0113] “In some examples, the camera 730 may be configured to detect objects within a range that extends past the range of the short-range time-of-flight sensor, but does not extend as far as the range of the long-range time-of-flight sensor”. Bingham clearly contemplates sensors having different ranges and modalities to optimize the grasping system; see at least [0038], [0047], and [0141]); and a central processing unit (CPU) configured to: acquire first distance data from the first sensor, wherein the first distance data is associated with the distance from the part of the mobile body to the object (See at least “distance to an object” of [0041] or “reflectance of a detected object” of [0042], Figures 12 and 13, etc. In summary, all proximity sensors measure distance information to the object); determine, based on the first distance data, that the distance from the part of the mobile body to the object is one of equal to or less than a first threshold (The narrowness of this limitation depends on other limitations. Individually, Bingham discloses this limitation, as sensors have ranges, etc., the nature of “determine” is not claimed with particularity, and at present “threshold” might be any triggering condition.
See [0007] below, or in particular see the combination with Hinkle below), wherein … a measurement range of the first sensor overlaps with a measurement range of the second sensor (See at least [0044] “Each time-of-flight sensor may have a different sensing range. For instance, a first short-range time-of-flight sensor may be used to detect objects in a range of 0-10 centimeters, and a second long-range time-of-flight sensor may be used to detect objects in a range of 10-200 centimeters. … In some examples, a control system may explicitly switch between using data from the short-range time-of-flight sensor or the long-range time-of-flight sensor depending on the gripper state”, [0113] “As a specific example, the camera 730 may generate 60×60 grayscale images with a range of about 60 centimeters from the palm of the gripper. In some examples, the camera 730 may be configured to detect objects within a range that extends past the range of the short-range time-of-flight sensor, but does not extend as far as the range of the long-range time-of-flight sensor”, [0122] “Each non-contact sensor on PCB 700 may generate sensor data for a different specific region in the general direction between digits 804, 806”, and Figure 11) in the specific range (This limitation effectively does not narrow the claim in any manner, as the range is not limited in any manner here or later; for example, the range may be the range of any and all values); switch from the first sensor to the second sensor based on the distance from the part of the mobile body to the object that is one of equal to or less than the first threshold (Examiner notes that the nature of “switch” is not claimed.
See at least [0007] “The method additionally includes controlling the robotic gripper based on the time-of-flight distance data and the grayscale image data” and [0044] “In some examples, a control system may explicitly switch between using data from the short-range time-of-flight sensor or the long-range time-of-flight sensor depending on the gripper state”).

Bingham does not explicitly teach, but in combination with Hinkle teaches: … the first threshold is a threshold distance (See at least Column 9, Lines 1 – 3, “When the palm of the gripper moves to within the first threshold distance, the gripper may close around the object to grasp the object”) within a specific range (This limitation effectively does not narrow the claim in any manner, as the range is not limited in any manner here or later; for example, the range may be the range of any and all values), and …

It would have been obvious to one of ordinary skill in the art prior to the effective filing date of the claimed invention to close the gripper at a threshold distance, as taught by Hinkle, in the system of Bingham with a reasonable expectation of success. Having the manipulator grasp the object when it is within reach would remove grasp attempts that have no chance of success. Furthermore, and with respect to the combination of art, Bingham already discloses sensors having ideal ranges (discussion related to Figures 12 and 13, and [0141]), only using particular sensors for particular ranges ([0044]), switching based on gripper state ([0044], which implies that gripper states include open and closed, and that use ranges and gripper states are related), and that in-hand and out-of-hand sensor data quality/accuracy varies (e.g., [0115] and [0161]). Consequently, the modification appears to simply connect these pieces of information with the explicit teachings of Hinkle regarding distance-based control states.
Additionally, Hinkle and Bingham appear directed towards shared subject matter, and thus combinations of features appear to be particularly obvious. See, for example, Figure 6 and Column 9 of Hinkle in comparison to Figure 8 and [0110] – [0112] of Bingham.

Regarding Claim 4, Bingham teaches the recited limitations. See at least [0044], [0113], and Figure 11 again, in light of the 112(b) rejections made above.

Regarding Claim 5, Bingham teaches: The mobile body according to claim 1, wherein the CPU is further configured to deactivate the first sensor and activate the second sensor, and the deactivation of the first sensor and the activation of the second sensor are based on the distance from the part of the mobile body to the object that is one of equal to or less than the first threshold (This already appears disclosed by Bingham, particularly in combination with Hinkle as described above. Examiner furthermore notes that the nature of “stops”, “activates”, and “deactivates” is not claimed with any particularity and is open to broad interpretation under a computer-implemented system. “Using” a sensor within specific ranges, as indicated already, is considered as clearly reading on this limitation, as the data outside the range is not being used or “active” and the sensor is thus “stopped”, “deactivated”, or similarly removed from the control loop).

Regarding Claim 7, Bingham teaches: The mobile body according to claim 2, wherein the CPU is further configured to switch from the second sensor to the first sensor based on the second distance data (This is already disclosed by Bingham, in particular in combination with Hinkle, above. See again [0044]. The sensor used, and thus the data used for control, are based on the distance threshold), and the second distance data indicates that the distance from the part of the mobile body to the object is one of equal to or more than a second threshold (See at least [0044] and [0113] again.
Examiner notes that the first and second thresholds are not claimed as being different).

Regarding Claim 8, Bingham teaches: The mobile body according to claim 7, wherein the CPU is further configured to deactivate the second sensor and activate the first sensor, and the deactivation of the second sensor and the activation of the first sensor are based on the distance from the part of the mobile body to the object that is one of equal to or more than the second threshold (This already appears disclosed by Bingham, particularly in combination with Hinkle as described above. Examiner furthermore notes that the nature of “stops”, “activates”, and “deactivates” is not claimed with any particularity and is open to broad interpretation under a computer-implemented system. “Using” a sensor within specific ranges, as indicated already, is considered as clearly reading on this limitation, as the data outside the range is not being used or “active” and the sensor is thus “stopped”, “deactivated”, or similarly removed from the control loop).

Regarding Claim 9, Bingham teaches: The mobile body according to claim 1, further comprising a hand configured to grip the object, wherein the first sensor and the second sensor are in the hand (See at least [0044] “In further examples, the palm of a gripper may be equipped with multiple time-of-flight sensors, with or without an infrared camera” and Figure 8), and the CPU is further configured to control the hand based on a position of the object with respect to the hand (See at least [0007] “The method additionally includes controlling the robotic gripper based on the time-of-flight distance data and the grayscale image data” and Figure 9).

Regarding Claim 10, Bingham teaches: The mobile body according to claim 9, wherein the hand includes: a palm; and at least two fingers connected to the palm, and the first sensor and the second sensor are on the palm (See at least Figure 8).
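The switching behavior the rejections read onto Claims 5, 7, and 8, deactivating the first sensor at or below a first threshold and reactivating it at or above a second threshold, can be sketched as below. This is a hypothetical illustration, not the application's or Bingham's implementation; distinct threshold values are used only to make the hysteresis visible, since, as noted, the claims do not require the two thresholds to differ.

```python
# Hypothetical sketch of threshold-based sensor switching with hysteresis.
# Names and values are illustrative only; neither the application nor the
# cited references publishes code.

class SensorSwitcher:
    def __init__(self, first_threshold: float, second_threshold: float):
        self.first_threshold = first_threshold    # switch to second sensor at/below this
        self.second_threshold = second_threshold  # switch back to first sensor at/above this
        self.active = "first"                     # the "activated" sensor, per Claims 5 and 8

    def update(self, distance: float) -> str:
        """Select the active sensor from the latest measured distance."""
        if self.active == "first" and distance <= self.first_threshold:
            self.active = "second"   # first sensor "deactivated" (Claim 5)
        elif self.active == "second" and distance >= self.second_threshold:
            self.active = "first"    # second sensor "deactivated" (Claim 8)
        return self.active
```

If the two thresholds are set equal, as the claims permit, the behavior collapses to the single hand-off point the rejection attributes to Bingham's gripper-state switching.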
Regarding Claim 11, Bingham teaches: The mobile body according to claim 10, further comprising a plurality of second sensors, wherein the plurality of second sensors includes the second sensor (The claims do not provide the nature of the second sensors that are not the second sensor. See at least Figure 8 with short-range time-of-flight sensor 710 and long-range time-of-flight sensor 720, which are both ToF sensors configured to measure distance), wherein the plurality of second sensors is at a plurality of locations within a vicinity of a periphery of the first sensor (See Figures 7 and 8).

Regarding Claim 12, Bingham teaches: The mobile body according to claim 9, further comprising a plurality of second sensors, wherein the plurality of second sensors includes the second sensor (The claims do not provide the nature of the second sensors that are not the second sensor. See at least Figure 8 with short-range time-of-flight sensor 710 and long-range time-of-flight sensor 720, which are both ToF sensors configured to measure distance), and the CPU is further configured to: adjust each of a posture and a position of the hand with respect to the object based on one of the first distance data or the second distance data; and control the hand to approach the object based on the adjustment of each of the posture and position of the hand (See at least [0133] “At block 906, method 900 may further include controlling the gripper based on the time-of-flight distance data and the grayscale image data. More specifically, data from the time-of-flight sensor and the infrared camera may be fused together, possibly in addition to data from other sensors, in order to generate control instructions for the gripper. The data fusion may involve heuristics-based and/or machine learning models.
The control instructions may relate to a first temporal phase before grasping an object (e.g., identifying an object to grasp, approaching the object, determining an appropriate stopping distance, and/or visual servoing). The control instructions may also relate to a second temporal phase after an attempted grasp (e.g., confirming grasp success, evaluating quality of the grasp, and/or determining properties of the object). The control instructions may also relate to a third temporal phase after a successful grasp (e.g., slip detection while moving a grasped object)”).

Regarding Claim 13, the claim does not presently appear to further narrow from Claim 12 and is thus rejected under the same logic as above. The 112(b) rejection above relates. To reiterate, the adjustment of Claim 12 is inherently a control method, and is already recited as being based on the first or second distance data.

Regarding Claim 15, Bingham teaches: The mobile body according to claim 1, further comprising a plurality of second sensors, wherein the plurality of second sensors includes the second sensor (The claims do not provide the nature of the second sensors that are not the second sensor. See at least Figure 8 with short-range time-of-flight sensor 710 and long-range time-of-flight sensor 720, which are both ToF sensors configured to measure distance), and a direction of measurement of the distance by the first sensor is same as a direction of measurement of the distance by each of the plurality of second sensors (See at least [0005] “The robotic gripping device further includes a time-of-flight sensor arranged on the palm such that the time-of-flight sensor is configured to generate time-of-flight distance data in a direction between the plurality of digits.
The robotic gripping device additionally includes an infrared camera, comprising an infrared illumination source, where the infrared camera is arranged on the palm such that the infrared camera is configured to generate grayscale image data in the direction between the plurality of digits” and Figure 8). Regarding Claim 16, Bingham teaches: The mobile body according to claim 15, wherein the plurality of second sensors is at a plurality of locations within a vicinity of a periphery of the first sensor (See Figures 7 and 8). Regarding Claim 17, Bingham teaches: The mobile body according to claim 1, wherein the first sensor is a depth camera (See at least [0103] “FIG. 6 is a table that includes types of gripper sensors for different manipulation classes, in accordance with example embodiments”, [0108] “Within examples, an existing underactuated gripper, such as described herein with respect to FIGS. 4 and 5, may be augmented with selected sensors from Table 6 to allow for immediate improvement on the detection of grasp success and later benefits of rich sensor data for machine learning algorithms”, Figure 6 (at least “Camera (DVS)” and “3D ToF Camera”) and related [0106] “In some examples, sensors used to detect grasp shape may include … dynamic vision sensor (DVS) cameras; … and three-dimensional (3D) ToF cameras), and the second sensor is a distance sensor (See variously recited “time-of-flight sensor”). Regarding Claim 18, the claims are directed to effectively the same subject matter as Claim 1 with respect to the application of prior art. The claims are therefore rejected under the same logic as Claim 1 above. Applicant’s Remarks filed 12/02/2025 also argue the claims together. Claim Rejections - 35 USC § 103 The following is a quotation of 35 U.S.C. 
103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made. The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows: 1. Determining the scope and contents of the prior art. 2. Ascertaining the differences between the prior art and the claims at issue. 3. Resolving the level of ordinary skill in the pertinent art. 4. Considering objective evidence present in the application indicating obviousness or nonobviousness. This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention. Claim 6 is rejected under 35 U.S.C. 103 as being unpatentable over Bingham et al. further in view of Hinkle and Gorman et al. (US 20200189631 A1). 
Regarding Claim 6, the combination of Bingham and Hinkle teaches: The mobile body according to claim 1. Bingham does not teach, but in combination with Gorman teaches: wherein the CPU is further configured to increase a sampling rate of the second sensor based on a decrease in the distance from the part of the mobile body to the object (see at least [0061]: “The local processor may also be configured to increase a rate of receiving the distance data (e.g., through increased sample rate by the distance sensor 108 itself, through increased sample rate of the distance sensor 108 data stream, etc.) of the distance between the first rail car 102a and the second rail car 102b as the distance between the first rail car 102a and the second rail car 102b decreases”).

It would have been obvious to one of ordinary skill in the art prior to the effective filing date of the claimed invention to increase the sampling rate of any distance-measuring sensor as disclosed by Gorman in the system of Bingham, or of Bingham in combination with Hinkle, with a reasonable expectation of success. It is well understood and routine to increase the sampling rate of sensors detecting proximity to objects in an environment so as to allow for more precise detection of the object while conserving energy, bandwidth, etc. for those times where said precision is most needed. See at least [0061] of Gorman: “The local processor may also be configured to increase a rate of communicating the distance data to a remote processor as the distance between the first rail car 102a and the second rail car 102b decreases. In this manner, as the rail cars 102a, 102b move closer together, the increased rate of receiving data/communicating data allows for a more precise detection of the coupling process, and it reserves the highest energy/bandwidth of operation for when the coupling distance is closest to completion. It will be appreciated that many configurations are possible”.

Claim 14 is rejected under 35 U.S.C. 103 as being unpatentable over Bingham et al. further in view of Hinkle and Couture et al. (US 20210215811 A1).

Regarding Claim 14, the combination of Bingham and Hinkle teaches: The mobile body according to claim 13. Bingham does not teach, but in combination with Couture teaches: wherein the CPU is further configured to execute a calibration of the first sensor and a calibration of the second sensor in parallel (see at least [0121]: “FIG. 7A is an illustration of a user interface on a device, of user interface units 212, according to an exemplary embodiment. According to at least one non-limiting exemplary embodiment, the user interface of FIG. 7A provides a user a system to run calibration tests on each individual sensor and/or a plurality of sensors 122, 124, 126, 128, as shown in FIG. 1B, either independently or concurrently, as well as additional sensors (not shown) simultaneously. That is, either a single sensor may be calibrated at a time or a plurality of sensors may be calibrated all at once at the same time”).

It would have been obvious to one of ordinary skill in the art prior to the effective filing date of the claimed invention to simultaneously (or in “parallel”) calibrate any sensors of the system of Bingham, or of Bingham in combination with Hinkle, as disclosed by Couture, with a reasonable expectation of success. Simultaneous calibration would shorten the amount of time required for calibration as well as ensure that the same calibration conditions exist for each sensor. Such simultaneous calibration is well known and routine, particularly where the nature of the calibration is not claimed in a manner that precludes such activity.

Conclusion

Applicant’s amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
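As an illustration of the distance-dependent sampling-rate rationale applied in the Claim 6 rejection (Gorman [0061]: sample faster as the measured distance decreases), the mechanism might be sketched as follows. This is a minimal sketch only; the function name, thresholds, and rates are hypothetical and do not come from any reference of record.

```python
# Illustrative sketch only: a sampling rate that rises as the measured
# distance to the object shrinks, in the spirit of Gorman [0061].
# All names and numeric values are hypothetical.

def sampling_rate_hz(distance_m: float,
                     min_rate: float = 10.0,
                     max_rate: float = 100.0,
                     near_m: float = 0.05,
                     far_m: float = 1.0) -> float:
    """Return a sensor sampling rate in Hz for a measured distance:
    min_rate at or beyond far_m, max_rate at or inside near_m, and a
    linear ramp in between, so the rate increases as distance decreases."""
    if distance_m >= far_m:
        return min_rate
    if distance_m <= near_m:
        return max_rate
    # Fraction of the way from "far" to "near" (0.0 at far_m, 1.0 at near_m).
    t = (far_m - distance_m) / (far_m - near_m)
    return min_rate + t * (max_rate - min_rate)
```

Under such a scheme a controller would poll a distance sensor sparsely at range and densely near contact, matching the energy/bandwidth rationale the rejection cites.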
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. With respect to the concept of switching sensors, sensor range modalities, and similar based on distance to a target, see Smith (US 20100256814 A1) and Kawamae (US 20240027617 A1). See also:

Arora et al. (US 4718023 A), which discloses sensors situated in a circle around an emitter/transmitter.
Kim et al. (US 20090285664 A1), which discloses a hand having a plurality of distance sensors used for grasping.
Amacker et al. (US 20190091875 A1), which discloses grippers having at least one finger having several short-range sensors and at least one long-range sensor.
DeStories et al. (US 20230146712 A1), which discloses a robotic measurement device having proximity sensors surrounding an end effector and another measuring sensor on what can be considered a “palm”.
Kim et al. (US 20230191617 A1), which discloses a gripper having distance sensors in the fingers of the gripper for controlling the grasp of an object, and which is illustrated with overlapping fields of view.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to MATTHEW C GAMMON, whose telephone number is (571) 272-4919. The examiner can normally be reached M - F 10:00 - 6:00.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, ADAM MOTT, can be reached at (571) 270-5376. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/MATTHEW C GAMMON/
Examiner, Art Unit 3657

/ADAM R MOTT/
Supervisory Patent Examiner, Art Unit 3657
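The concurrent-calibration concept underlying the Claim 14 rejection (Couture [0121]: "a plurality of sensors may be calibrated all at once at the same time") can be sketched minimally as below. The sensor names and the per-sensor calibration stub are hypothetical illustrations, not disclosures from the record.

```python
# Illustrative sketch only: calibrating several sensors concurrently,
# as Couture [0121] describes. Names and the calibration stub are
# hypothetical.
from concurrent.futures import ThreadPoolExecutor

def calibrate(sensor_name: str) -> tuple:
    """Stand-in for one sensor's calibration routine; a real routine
    would sample the sensor against a known reference target and
    return the measured offset."""
    return sensor_name, 0.0  # placeholder offset

def calibrate_all(sensor_names: list) -> dict:
    """Run every per-sensor calibration in parallel, so all sensors are
    calibrated under the same conditions, and collect offsets by name."""
    with ThreadPoolExecutor(max_workers=len(sensor_names) or 1) as pool:
        return dict(pool.map(calibrate, sensor_names))
```

Running the calibrations in one executor rather than sequentially is what yields the shorter total calibration time and identical calibration conditions that the rejection's rationale relies on.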

Prosecution Timeline

May 24, 2024
Application Filed
Aug 27, 2025
Non-Final Rejection — §102, §103, §112
Dec 02, 2025
Response Filed
Feb 17, 2026
Final Rejection — §102, §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12594673
Method of Calibrating Manipulator, Control System and Robot System
2y 5m to grant Granted Apr 07, 2026
Patent 12588646
MILKING SYSTEM COMPRISING A MILKING ROBOT
2y 5m to grant Granted Mar 31, 2026
Patent 12583110
ROBOT CONTROL SYSTEM
2y 5m to grant Granted Mar 24, 2026
Patent 12576523
CONTROLLING ROBOTS USING MULTI-MODAL LANGUAGE MODELS
2y 5m to grant Granted Mar 17, 2026
Patent 12544926
OBJECT INTERFERENCE CHECK METHOD
2y 5m to grant Granted Feb 10, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

3-4
Expected OA Rounds
65%
Grant Probability
88%
With Interview (+23.4%)
2y 9m
Median Time to Grant
Moderate
PTA Risk
Based on 102 resolved cases by this examiner. Grant probability derived from career allow rate.
