DETAILED ACTION
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Request for Continued Examination
A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 2/13/2026 has been entered.
Response to Arguments
Applicant's arguments filed 2/13/2026 have been fully considered but they are not persuasive. Applicant asserts that Harada fails to teach the newly amended features of claims 1, 12, and 14. As will be elaborated in the rejection below, Examiner notes that [0040] of Applicant's specification indicates that "… the sensor control unit 103, and the synchronous processing unit 104 may be … for example, a general-purpose computer … or may be logically implemented by a cloud computing system." Under the broadest reasonable interpretation of the claim in light of 35 U.S.C. 112(f) (see non-final rejection dated 6/4/2025), this encompasses implementation of the claimed functionality in the form of computer logic (the synchronous processing unit and sensor control unit are simply functionality implemented on a computing device). Harada's measurement unit includes the functionality of each of these claimed elements, and each is connected logically so that they can communicate with each other (i.e., the receipt of the drive signal triggers the sensor control functionality).
Claim Rejections - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claims 1-2, 5-10, 12, and 14 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention. Claims 1, 12, and 14 recite the limitation "wherein the robot control unit is connected to the robot and the synchronous control unit" (emphasis added). There is insufficient antecedent basis for this limitation in the claim. Examiner proceeds on the assumption that the synchronous control unit refers to the synchronous processing unit.
Claim Rejections - 35 USC § 102
The text of those sections of Title 35, U.S. Code not included in this action can be found in a prior Office action.
Claims 1-2, 5-6, 9, 12, and 14 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Harada (JP 2016203273 A).
Claim 1
Harada teaches
A measurement system configured to
perform a measurement using a sensor installed in a robot, and
(Harada - [0011] A robot 1 includes a robot hand 2 having a motor 2a, a robot control unit 4 and a state monitoring unit 5 …
[0014] … As will be described in detail later, the motor drive signal incorporates a synchronization signal that synchronizes the timing at which the sensor 10 starts sensing with the timing at which the robot hand 2 starts moving.)
EXAMINER NOTE: See Fig. 1. The sensor 10 is installed on the robot arm.
comprising a synchronous processing unit configured to
acquire a robot drive signal transmitted from a robot control unit configured to control the robot and generate a trigger signal from the synchronous processing unit to a sensor control unit,
wherein the trigger signal to the sensor control unit is for triggering the sensor to start a measurement operation when the robot drive signal includes a command signal for the robot to start driving,
(Harada - [0029] In step S2, a motor drive signal is transmitted from the motor control unit 4b to the measurement unit 5a and the motor 2a. This motor drive signal incorporates a synchronization signal.
As a result, upon receiving the synchronization command, the motor control unit 4b can cause the measurement unit 5a to acquire a measurement value based on the synchronization command, and can also send a synchronization signal to the judgment unit 5b to start comparison with the normal range.)
EXAMINER NOTE: The motor control unit (robot control unit) sends the drive signal to the robot and the measurement unit (synchronous processing unit). As a result, the measurement unit begins to obtain measurements (synchronous processing unit generates a trigger signal to the sensor control unit for triggering the sensor to start a measurement operation).
CLAIM INTERPRETATION NOTE ON ABOVE: Examiner notes that [0040] of Applicant's specification indicates that "… the sensor control unit 103, and the synchronous processing unit 104 may be … for example, a general-purpose computer … or may be logically implemented by a cloud computing system." Under the broadest reasonable interpretation of the claim in light of 35 U.S.C. 112(f) (see non-final rejection dated 6/4/2025), this encompasses implementation of the claimed functionality in the form of computer logic (the synchronous processing unit and sensor control unit are simply functionality implemented on a computing device). Harada's measurement unit includes the functionality of each of these claimed elements, and each is connected logically so that they can communicate with each other (i.e., the receipt of the drive signal triggers the sensor control functionality).
wherein
the robot control unit is connected to the robot and the synchronous control unit,
EXAMINER NOTE: See Harada, Fig. 1 (annotated, translated below). The motor control unit (robot control unit) is connected to the measurement unit (synchronous processing unit)
[Image: media_image1.png — annotated, translated Fig. 1 of Harada]
The synchronous processing unit is further connected to the sensor control unit,
EXAMINER NOTE: See above comments regarding Harada [0029]. The synchronous processing unit functionality is logically connected to the sensor control unit functionality.
and, the robot drive signal is input to both the robot and the synchronous processing unit.
(Harada - [0014] The motor drive signal sent from the servo controller unit 4b1 to the servo driver unit 4b2 is also sent to a measuring unit 5a, which will be described later. As will be described in detail later, the motor drive signal incorporates a synchronization signal that synchronizes the timing at which the sensor 10 starts sensing with the timing at which the robot hand 2 starts moving)
Claim 2
Harada teaches the limitations of claim 1 as outlined above. Harada further teaches
wherein the command signal for the robot to start driving includes a command pulse signal or an encoder pulse signal for driving a motor installed in the robot.
(Harada - [0030] Referring to FIG. 8B, the synchronization signal is incorporated into the motor drive signal … Specifically, the synchronization signal is incorporated into the motor drive signal during the transition from a robot stopped state to a robot moving state. Such a motor drive signal is transmitted from the servo controller unit 4b1 to the measurement unit 5a and also to the servo driver unit 4b2. Upon receiving the synchronization signal, the measurement unit 5a starts sensing, and the servo driver unit 4b2 instructs the motor 2a on a motor drive current value.)
EXAMINER NOTE: See Translated Fig. 8B below. The synchronization signal is a pulse.
[Image: media_image2.png — translated Fig. 8B of Harada]
Claim 5
Harada teaches the limitations of claim 1 as outlined above. Harada further teaches
a measurement result generation unit configured to
generate measurement result information by acquiring
measurement data that the sensor has acquired based on the trigger signal
and operation data of the robot that has operated based on the robot drive signal.
(Harada - [0015] … A sensor 10 for acquiring measurement values used to determine the operating state of the robot hand 2 is electrically connected to the measurement unit 5a. … By determining whether the pressure value acquired by this sensor 10 is being output at the appropriate timing in response to the robot's commands, it is possible to determine whether the robot hand 2 is operating appropriately.
[0004] Incidentally, in order to determine whether or not various tasks performed by a robot have been successful, it is sometimes necessary to monitor whether or not measured values relating to position and force, which change from moment to moment as the robot moves, are within a predetermined normal range.
[0034] In step S4, the judgment unit 5b compares the measurement value acquired by the measurement unit 5a with a normal range stored in advance to judge whether the robot operation is successful or not.)
EXAMINER NOTE: The commands, drive signals, and "normal range" (values associated with robot movement) correspond to robot operation data. The pressure values correspond to measurement data.
Claim 6
Harada teaches the limitations of claim 5 as outlined above. Harada further teaches
wherein each of the measurement data and the operation data includes time information,
(Harada - [0015] … A sensor 10 for acquiring measurement values used to determine the operating state of the robot hand 2 is electrically connected to the measurement unit 5a. … By determining whether the pressure value acquired by this sensor 10 is being output at the appropriate timing in response to the robot's commands, it is possible to determine whether the robot hand 2 is operating appropriately.)
and the measurement result generation unit generates,
based on information about a time of a start of a measurement included in the measurement data and information about a time of a start of an operation included in the operation data,
(Harada - [0026] In the robot 1 of this embodiment, in order to avoid such a delay in starting the operation of the robot hand 2, a signal for starting sensing is transmitted together with the motor drive signal.
[0028] First, in step S1, a robot operation command is sent from the sequence control unit 4a to the motor control unit 4b.
Here, the robot operation command includes, as a synchronization command, a position command that requests the motor 2a to operate at a speed equal to or faster than the speed at which the motor 2a can be started. …
… Specifically, referring to FIG. 8A, the synchronization command is included in the robot operation command as a signal of frequency f1….
… A signal having such characteristics is recognized by the servo controller unit 4b1 as a signal to start an operation, but on the other hand, it is not recognized as a normal position/speed command value for operating the robot hand 2.
[0029] In step S2, a motor drive signal is transmitted from the motor control unit 4b to the measurement unit 5a and the motor 2a.
This motor drive signal incorporates a synchronization signal.
As a result, upon receiving the synchronization command, the motor control unit 4b can cause the measurement unit 5a to acquire a measurement value based on the synchronization command, and can also send a synchronization signal to the judgment unit 5b to start comparison with the normal range.
[0030] … Specifically, the synchronization signal is incorporated into the motor drive signal during the transition from a robot stopped state to a robot moving state. Such a motor drive signal is transmitted from the servo controller unit 4b1 to the measurement unit 5a and also to the servo driver unit 4b2. Upon receiving the synchronization signal, the measurement unit 5a starts sensing, and the servo driver unit 4b2 instructs the motor 2a on a motor drive current value.
[0035] In this way, the robot 1 of this embodiment can synchronize the timing at which the sensor 10 that monitors the operation of the robot hand 2 starts sensing with the timing at which the robot hand 2 starts operating)
EXAMINER NOTE: The synchronization signals coordinate the timing of the robot operation start time and the measurement start time in order to begin the comparison to determine whether the robot operation is successful. See also Translated Fig. 8 below, which is referenced in the above citations.
[Image: media_image3.png — translated Fig. 8 of Harada]
the measurement result information in which the measurement data and the operation data are associated with each other.
[0034] In step S4, the judgment unit 5b compares the measurement value acquired by the measurement unit 5a with a normal range stored in advance to judge whether the robot operation is successful or not.
EXAMINER NOTE: The measurement is associated with robot operation.
Claim 9
Harada teaches the limitations of claim 1 as outlined above. Harada further teaches
a drive signal generation unit configured to
generate the robot drive signal based on an operation command for the robot,
and transmit the generated robot drive signal to the robot.
(Harada - [0028] First, in step S1, a robot operation command is sent from the sequence control unit 4a to the motor control unit 4b.
Here, the robot operation command includes, as a synchronization command, a position command that requests the motor 2a to operate …
[0029] In step S2, a motor drive signal is transmitted from the motor control unit 4b to the measurement unit 5a and the motor 2a.
[0030] … Specifically, the synchronization signal is incorporated into the motor drive signal during the transition from a robot stopped state to a robot moving state. Such a motor drive signal is transmitted from the servo controller unit 4b1 to the measurement unit 5a and also to the servo driver unit 4b2. Upon receiving the synchronization signal, the measurement unit 5a starts sensing, and the servo driver unit 4b2 instructs the motor 2a on a motor drive current value.)
Claim 12
Harada teaches
A measurement method for performing a measurement using a sensor installed in a robot,
(Harada - [0011] A robot 1 includes a robot hand 2 having a motor 2a, a robot control unit 4 and a state monitoring unit 5 …
[0014] … As will be described in detail later, the motor drive signal incorporates a synchronization signal that synchronizes the timing at which the sensor 10 starts sensing with the timing at which the robot hand 2 starts moving.)
EXAMINER NOTE: See Fig. 1. The sensor 10 is installed on the robot arm.
Comprising: generating, by a robot control unit configured to control the robot, a robot drive signal;
(Harada - [0013] … The servo controller 4b1 receives a robot operation command sent by the sequence control unit 4a and sends a motor drive signal to the servo driver 4b2. The servo driver unit 4b2 receives the motor drive signal and drives the motor 2a based on the motor drive signal.)
transmitting the robot drive signal to both the robot and a synchronous processing unit;
acquiring, by the synchronous processing unit, the robot drive signal;
(Harada - [0014] The motor drive signal sent from the servo controller unit 4b1 to the servo driver unit 4b2 is also sent to a measuring unit 5a, which will be described later. As will be described in detail later, the motor drive signal incorporates a synchronization signal that synchronizes the timing at which the sensor 10 starts sensing with the timing at which the robot hand 2 starts moving)
and generating, by the synchronous processing unit, a trigger signal from the synchronous processing unit to a sensor control unit, wherein the trigger signal to the sensor control unit is for triggering the sensor to start a measurement operation when the robot drive signal includes a command signal for the robot to start driving
(Harada - [0029] In step S2, a motor drive signal is transmitted from the motor control unit 4b to the measurement unit 5a and the motor 2a. This motor drive signal incorporates a synchronization signal.
As a result, upon receiving the synchronization command, the motor control unit 4b can cause the measurement unit 5a to acquire a measurement value based on the synchronization command, and can also send a synchronization signal to the judgment unit 5b to start comparison with the normal range.)
EXAMINER NOTE: The motor control unit (robot control unit) sends the drive signal to the robot and the measurement unit (synchronous processing unit). As a result, the measurement unit begins to obtain measurements (synchronous processing unit generates a trigger signal to the sensor control unit for triggering the sensor to start a measurement operation).
CLAIM INTERPRETATION NOTE ON ABOVE: Examiner notes that [0040] of Applicant's specification indicates that "… the sensor control unit 103, and the synchronous processing unit 104 may be … for example, a general-purpose computer … or may be logically implemented by a cloud computing system." Under the broadest reasonable interpretation of the claim in light of 35 U.S.C. 112(f) (see non-final rejection dated 6/4/2025), this encompasses implementation of the claimed functionality in the form of computer logic (the synchronous processing unit and sensor control unit are simply functionality implemented on a computing device). Harada's measurement unit includes the functionality of each of these claimed elements, and each is connected logically so that they can communicate with each other (i.e., the receipt of the drive signal triggers the sensor control functionality).
wherein the robot control unit is connected to the robot and the synchronous control unit
EXAMINER NOTE: See Harada, Fig. 1 (annotated, translated below). The motor control unit (robot control unit) is connected to the measurement unit (synchronous processing unit)
[Image: media_image4.png — annotated, translated Fig. 1 of Harada]
wherein the synchronous processing unit is further connected to the sensor control unit
EXAMINER NOTE: See above comments regarding Harada [0029]. The synchronous processing unit functionality is logically connected to the sensor control unit functionality.
Claim 14
Harada teaches
A non-transitory computer readable medium storing a program for causing a computer to perform a method
(Harada - [0011] The state monitoring unit 5 corresponds to a second control device.
The robot control unit 4 and the state monitoring unit 5 are both computers. Referring to FIG. 2, the hardware configuration of the robot control unit 4 is shown in outline.
As shown in FIG. 2, the robot control unit 4 includes a CPU (Central Processing Unit) 41, a ROM (Read Only Memory) 42, a RAM (Random Access Memory) 43, and a storage unit (here, a HDD (Hard Disk Drive)) 44. The robot control unit 4 also includes an input/output interface 45, a drive 46 for a portable storage medium, and the like.
These components of the robot control unit 4 are connected to a bus 48. In the robot control unit 4, the CPU 41 executes a program stored in the ROM 42 or the HDD 44, or a program read by the portable storage medium drive 46 from the portable storage medium 47, thereby realizing the functions of the sequence control unit 4a and the motor control unit 4b. On the other hand, the state monitoring unit 5 is also a computer, and the functions of a measuring unit 5a and a determining unit 5b, which will be described later, are realized by executing a program. Here, the programs include robot control programs.)
for performing a measurement using a sensor installed in a robot,
(Harada - [0011] A robot 1 includes a robot hand 2 having a motor 2a, a robot control unit 4 and a state monitoring unit 5 …
[0014] … As will be described in detail later, the motor drive signal incorporates a synchronization signal that synchronizes the timing at which the sensor 10 starts sensing with the timing at which the robot hand 2 starts moving.)
EXAMINER NOTE: See Fig. 1. The sensor 10 is installed on the robot arm.
the program being configured to cause the computer, as the measurement method, to perform:
A process of, by a robot control unit configured to control the robot, generating a robot drive signal;
(Harada - [0013] … The servo controller 4b1 receives a robot operation command sent by the sequence control unit 4a and sends a motor drive signal to the servo driver 4b2. The servo driver unit 4b2 receives the motor drive signal and drives the motor 2a based on the motor drive signal.)
A process of transmitting the robot drive signal to both the robot and a synchronous processing unit; a process of, by the synchronous processing unit, acquiring the robot drive signal;
(Harada - [0014] The motor drive signal sent from the servo controller unit 4b1 to the servo driver unit 4b2 is also sent to a measuring unit 5a, which will be described later. As will be described in detail later, the motor drive signal incorporates a synchronization signal that synchronizes the timing at which the sensor 10 starts sensing with the timing at which the robot hand 2 starts moving)
and a process of, by the synchronous processing unit, generating a trigger signal from the synchronous processing unit to a sensor control unit, wherein the trigger signal to the sensor control unit is for triggering the sensor to start a measurement operation when the robot drive signal includes a command signal for the robot to start driving.
(Harada - [0029] In step S2, a motor drive signal is transmitted from the motor control unit 4b to the measurement unit 5a and the motor 2a. This motor drive signal incorporates a synchronization signal.
As a result, upon receiving the synchronization command, the motor control unit 4b can cause the measurement unit 5a to acquire a measurement value based on the synchronization command, and can also send a synchronization signal to the judgment unit 5b to start comparison with the normal range.)
EXAMINER NOTE: The motor control unit (robot control unit) sends the drive signal to the robot and the measurement unit (synchronous processing unit). As a result, the measurement unit begins to obtain measurements (synchronous processing unit generates a trigger signal to the sensor control unit for triggering the sensor to start a measurement operation).
CLAIM INTERPRETATION NOTE ON ABOVE: Examiner notes that [0040] of Applicant's specification indicates that "… the sensor control unit 103, and the synchronous processing unit 104 may be … for example, a general-purpose computer … or may be logically implemented by a cloud computing system." Under the broadest reasonable interpretation of the claim in light of 35 U.S.C. 112(f) (see non-final rejection dated 6/4/2025), this encompasses implementation of the claimed functionality in the form of computer logic (the synchronous processing unit and sensor control unit are simply functionality implemented on a computing device). Harada's measurement unit includes the functionality of each of these claimed elements, and each is connected logically so that they can communicate with each other (i.e., the receipt of the drive signal triggers the sensor control functionality).
wherein the robot control unit is connected to the robot and the synchronous control unit,
EXAMINER NOTE: See Harada, Fig. 1 (annotated, translated below). The motor control unit (robot control unit) is connected to the measurement unit (synchronous processing unit)
[Image: media_image4.png — annotated, translated Fig. 1 of Harada]
wherein the synchronous processing unit is further connected to the sensor control unit
EXAMINER NOTE: See above comments regarding Harada [0029]. The synchronous processing unit functionality is logically connected to the sensor control unit functionality.
Claim Rejections - 35 USC § 103
The text of those sections of Title 35, U.S. Code not included in this action can be found in a prior Office action.
Claim 7 is rejected under 35 U.S.C. 103 as being unpatentable over Harada as applied to claim 1 above, and further in view of Mohtasham ("Haptic Object Parameter Estimation during Within-Hand-Manipulation with a Simple Robot Gripper," 2020).
Claim 7
Harada teaches the limitations of claim 5 as outlined above. Harada's system utilizes a pressure sensor to obtain measurements, but does not mention obtaining a position or shape of an object to be measured. Thus, Harada does not explicitly teach the following limitations in combination:
wherein the measurement result information includes information about a position and a shape of an object to be measured, measured by the sensor.
However, Mohtasham teaches a robotic gripper equipped with a pressure sensor which is used to determine the shape and location of an object.
(Mohtasham - [p. 141, col 1, para. 1] In this paper we explore the potential of using the simple platform of the VF hand as a tool for performing tactile data acquisition through EPs, via the WIHM task of rolling an object between the fingers, to estimate the shape and pose of grasped objects …
[p.141, col 2, Tactile Sensing Robot Finger] … sensor finger was designed to mount for two TakkStrip 2 barometric pressure sensor arrays along its contact surface (Figure 3). The TakkStrip sensors are manufactured by … embedding an atmospheric pressure sensor in urethane in order to create a robust and low-cost tactile sensor.)
Mohtasham indicates that the pressure profile detected by the sensor changes depending on the shape of the object and how it is grasped.
(Mohtasham - [p. 143, col 2 thru p.144, col 1, Shape Algorithm] The object's shape is determined by observing patterns in the sensor data that indicate surface and edge contacts. For each channel, positive values greater than 200 indicate that the sensor channel is activated, meaning that an object is applying significant pressure on that channel. Negative values indicate that a channel has formed a vacuum at that instance due to pressure being applied to an adjacent channel, causing the urethane to lift from the vent hole, in a ‘see-saw' effect.
The shape algorithm functions by stepping through each channel and analyzing the data over the complete 20-second window and recording the number of observed cuboid and cylinder features into the 'score variables', s_cu and s_cy, respectively. Once this is complete the relative values of the scores are used to determine the object's shape …
The algorithm first finds the index of the maximum value for each sensor channel over the entire time period. For each channel the algorithm then searches for ranges of consecutive (in time), activated values localized around this maximum. If the number of activated pixels falls within a range determined by calibration, then the cylinder score increases by one. This feature is indicative of the constant contact of an object surface as it is rolled.
For a cuboid, the algorithm looks for the overlap of the maximum values across adjacent channels, due to the cuboid's flat edge applying pressure to multiple sensors at a given instance of time. If multiple channels overlap within a small time threshold, then the algorithm increases the cuboid count by one. ….
[p. 144, col 1, Location Algorithm] For a selected point in time, the location algorithm estimates the position of the center of the object, …
For a selected time instance, the algorithm first identifies which sensor channels are activated based on pressure readings greater than 200, which indicates significant pressure applied to the channel. For a cylinder, only a single channel should be activated at a given time instance because the radius makes a single contact with the channel as it rolls over time. For a cuboid, multiple channels should be activated if the flat surface is making contact with the sensor at the selected time instance. Otherwise, the object is rotating on its edge and a negative pressure value is observed.)
Recall from above that Harada checks for abnormalities in the robot operation by evaluating a sensed range
(Harada - [0014] … Here, sensing refers to a series of processes in which the measurement unit 5a acquires a measurement value from the sensor 10, and the judgment unit 5b compares the measurement value with a normal range to judge whether the operation of the robot hand 2 has been successful.)
According to Mohtasham, this "normal range" is likely to vary based on the object shape and position. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to modify Harada's system with Mohtasham's suggestion to sense object shape and position in order to increase perception of the haptic properties of an object such that different objects may be identified and handled appropriately.
(Mohtasham - [p. 140, col 2, ln 5-19] In addition to repositioning an object to achieve alternative grasps or orientations, WIHM also increases perception of the haptic properties of an object. … This further extends tactile sensing beyond only the small portion of an object that happens to be underneath a fingertip during grasping. Performing manipulation of an object to glean more haptic information about it is commonly known as an Exploratory Procedure (EP) [12]. … However, the majority of robotic approaches to EPs utilize gross arm motions and support surfaces [13], [15] rather than WIHM. Such arm motions have additional workspace requirements and can often involve setting down and regrasping an object, which can break up functional tasks. During WIHM, a robot arm may stay fixed as the necessary data is extracted. This may even occur as the object is being carried between locations. …)
Claim 8 is rejected under 35 U.S.C. 103 as being unpatentable over Harada as applied to claim 1 above, and further in view of Jindo (EP 3482890 A1).
Claim 8
Harada teaches the limitations of claim 1 as outlined above. Harada further teaches
wherein the synchronous processing unit includes a sensor control unit configured to
generate a sensor drive signal for driving the sensor as the trigger signal,
(Harada - [0028] … Specifically, referring to FIG. 8A, the synchronization command is included in the robot operation command as a signal of frequency f1….
[0030] … Upon receiving the synchronization signal, the measurement unit 5a starts sensing, and the servo driver unit 4b2 instructs the motor 2a on a motor drive current value.)
While examiner believes Harada implicitly includes a means for outputting a sensor drive signal to the sensor due to the fact that the sensor is selectively turned on under specific conditions, Harada does not explicitly disclose such a means. For thoroughness, Examiner relies on Jindo to teach
acquire the sensor drive signal, and output the acquired sensor drive signal to the sensor.
(Jindo - [0022] Imaging control section 63 controls overall imaging device 60. Imaging control section 63 outputs a control signal to lighting section 61 to control the light
emitted from lighting section 61, and outputs a control signal to imaging section 62 to perform image capturing.)
While Jindo's disclosure refers to an imaging sensor, Jindo's teachings primarily relate to the synchronization of robot motion with corresponding sensor data, and therefore teaches a very similar invention. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to incorporate a means for outputting a sensor drive signal to the sensor in order to drive the sensor. Such a modification would be well within the capabilities of one skilled in the art, and would yield predictable results (driving the sensor).
Claim 10 is rejected under 35 U.S.C. 103 as being unpatentable over Harada in view of Kearns (US 20120182392 A1) and Uchida (US 20180262656 A1).
Claim 10
Harada teaches the limitations of claim 1 as outlined above. Uchida alternatively teaches the following elements of claim 1:
generate a trigger signal … to a sensor control unit,
wherein the trigger signal to the sensor control unit is for triggering the sensor to start a measurement operation
(Uchida - [0022] The processing device 1 can include a measurement device 700 for measuring the object 500, and a robot system 800 for processing the object 500 based on the measurement result of the measurement device 700. The measurement device 700 can include a sensor 100,
[0035] The sensor 100 can include … a controller 30 for controlling … the image capturing device 20. The controller 30 outputs a measurement time signal as timing information indicating … an image capturing period for causing the image capturing device 20 to capture the object 500.
[0040] The controller 30 can be configured to, for example, start … image capturing by the image capturing device 20 in response to a measurement trigger provided from the robot controlling device 400.)
Uchida demonstrates that it is common in the art to provide a trigger signal to a controller integrated into a sensor (sensor control unit), where the trigger signal triggers the sensor to start measuring. This differs from Harada in that the sensor control unit is explicitly located within the sensor as hardware, rather than being implemented logically as discussed in the rejection of claim 1. Figure 2 of Uchida illustrates the structure of the sensor 100.
Harada alone may not explicitly teach the limitations of claim 10 in the claimed combination. However, Uchida teaches
wherein the synchronous processing unit: generates a sensor drive signal as the trigger signal, the sensor drive signal being a periodic pulse signal;
(Uchida - [0043] The operation of the processing device 1 according to the first embodiment will be described below with reference to a timing chart shown in FIG. 3. The measurement trigger (3-a) output from the robot controlling device 400 is received by the controller 30 after a lapse of a transmission delay time τ1 generated in an interface circuit and communication path forming the interface between the robot controlling device 400 and the controller 30. In response to reception of the measurement trigger (3-a), the controller 30 starts control for measurement.)
EXAMINER NOTE: See Fig. 3. The measurement trigger (3-a) is a square wave (periodic pulse signal).
Harada and Uchida may not explicitly teach the remaining limitations of claim 10 in the claimed combination. However, Kearns teaches
and changes a period of the pulse signal of the sensor drive signal when it has acquired the robot drive signal indicating that at least one of a position and a speed of the sensor satisfies a predetermined condition.
(Kearns - [0145] The repeating 1212 operation can be performed at a relatively slow rate (e.g., slow frame rate) for relatively high resolution, an intermediate rate, or a high rate with a relatively low resolution. The frequency of the repeating 1212 operation may be adjustable by the robot controller 500. In some implementations, the controller 500 may raise or lower the frequency of the repeating 1212 operation upon receiving an event trigger. For example, a sensed item in the scene may trigger an event that causes an increased frequency of the repeating 1212 operation to sense an possibly eminent object 12 (e.g., doorway, threshold, or cliff) in the scene 10 … A relatively greater acquisition rate of image depth data can allow for relatively more reliable feature tracking within the scene.)
EXAMINER NOTE: The sensor alters the sampling frequency based on sensing a possibly imminent object. Because the sensor is located on the robot, this event is triggered based on the position of the sensor relative to the object being detected.
Uchida demonstrates that utilizing a controller integrated into a sensor and using a pulsed signal from the controller to drive the sensor is common and well-known in the art. The implementation of this configuration would be within the capabilities of one of ordinary skill in the art, and would yield predictable results (driving the sensor). Therefore, driving the sensor via a periodic pulse would have been obvious to one of ordinary skill in the art before the effective filing date of the invention. Additionally, it would have been obvious to further modify the combination of Harada and Uchida with Kearns' suggestion to change the period of the pulse in order to alter resolution when needed to avoid possibly imminent objects.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to JAMES MILLER WATTS whose telephone number is (703)756-1249. The examiner can normally be reached 7:30-5:30 M-TH.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Adam Mott can be reached at 571-270-5376. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/JAMES MILLER WATTS III/Examiner, Art Unit 3657
/ADAM R MOTT/Supervisory Patent Examiner, Art Unit 3657