DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Preliminary Amendment
The Preliminary Amendment filed on 06/18/2024 has been entered.
Response to Preliminary Remarks
Applicant's Preliminary Remarks filed 06/18/2024 concerning the Preliminary Amendment have been considered and those amendments have been entered.
Drawings
The drawings are objected to as failing to comply with 37 CFR 1.84(p)(5) because they do not include the following reference sign(s) mentioned in the description:
paragraph [0036] refers to 412c however 412c is not in the drawings;
paragraph [0157] refers to STEP 121 however STEP 121 is not in the drawings;
paragraph [0061] refers to STEP 122 however STEP 122 is not in the drawings; paragraph [0064] refers to STEP 123 however STEP 123 is not in the drawings; and
paragraphs [0067], [0071], and [0079] also refer to STEP 121, STEP 122, and STEP 123.
Corrected drawing sheets in compliance with 37 CFR 1.121(d) are required in reply to the Office action to avoid abandonment of the application. Any amended replacement drawing sheet should include all of the figures appearing on the immediate prior version of the sheet, even if only one figure is being amended. Each drawing sheet submitted after the filing date of an application must be labeled in the top margin as either “Replacement Sheet” or “New Sheet” pursuant to 37 CFR 1.121(d). If the changes are not accepted by the examiner, the applicant will be notified and informed of any required corrective action in the next Office action. The objection to the drawings will not be held in abeyance.
CLAIM INTERPRETATION
The following is a quotation of 35 U.S.C. 112(f):
(f) Element in Claim for a Combination. – An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.
The following is a quotation of pre-AIA 35 U.S.C. 112, sixth paragraph:
An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.
Claims 1-7 have been interpreted to not invoke 35 U.S.C. 112(f) (pre-AIA 35 U.S.C. 112, sixth paragraph) claim interpretation. Regarding method claim 7, the claimed “step of” does not invoke 35 U.S.C. 112(f) (pre-AIA 35 U.S.C. 112, sixth paragraph) claim interpretation; refer to MPEP 2181 I.A., which states “If the claim element uses the phrase ‘step for,’ then Section 112, Para. 6 is presumed to apply…. On the other hand, the term ‘step’ alone and the phrase ‘steps of’ tend to show that Section 112, Para. 6 does not govern that limitation.”
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claims 1-7 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Wu et al., JP 2021046719 A, hereinafter Wu.
Claim 1:
1. A remote operation support system that causes a synthetic image to be displayed on a remote output interface of a remote operation device that allows remote operation of work machine (Wu: FIGs. 1-8; FIGs. 7(A) and 7(B) illustrate a synthetic image displayed on display device D1 in the remote control room RC that allows control of the excavator 100.), the synthetic image being an image in which an index image corresponding to an index member positioned in an operator's room of the work machine is superimposed on a captured image captured by an imaging device which is mounted on the work machine and which is capable of capturing an image around the work machine (Wu: FIGs. 1-8; FIGs. 7(A) and 7(B) illustrate a synthetic image in which portions of the operator’s cabin are superimposed on the real view from the camera, where the portions of the operator’s cabin, such as the left operation lever G3L, the right operation lever G3R, the left traveling lever G4L, and the right traveling lever G4R positioned in the operator's room of the excavator 100, correspond to the claimed index image; note the following passages of the English Translation: “The operator OP who sees the image shown in FIG. 7A can obtain a sense of realism as if he/she is operating the excavator 100 in the cabin 10.”, refer to paragraph [0091], and “The operator OP who sees the image shown in FIG. 7B can get a sense of reality as if he/she is operating the excavator 100 in the immediate vicinity of the excavation point.”, refer to paragraph [0101].).
Refer to the paragraphs of the English Translation describing FIGs. 6-7(B), which correspond to paragraphs [0059] to [0108] of the original document and to approximately pages 15-29 of the English Translation. They are reproduced below for convenience, with paragraph numbers added that the Examiner has correlated to the original document's paragraph numbers.
[0059]
Next, a configuration example of the remote control system of the excavator 100 will be described with reference to FIG. 6. FIG. 6 is a functional block diagram showing a configuration example of the remote control system SYS.
[0060]
The remote control system SYS mainly includes a controller 30, a solenoid valve unit 45, an image pickup device C1, and a communication device T1 mounted on the excavator 100, and an operation sensor 29, a remote controller 40, an indoor image pickup device C2, a display device D1, and a communication device T2 installed in the remote control room RC.
[0061]
The controller 30 has an image generation unit 31, an
excavator state identification unit 32, and an actuator drive unit
33 as functional elements.
[0062]
The image generation unit 31 is configured to generate a
surrounding image including an image displayed on the display
device D1. The surrounding image is an image showing the
surrounding state of the excavator 100 that the operator could
see if there was an operator in the cabin 10. In the present
embodiment, the surrounding image is generated based on the
image captured by the image pickup apparatus C1.
Specifically, the image generation unit 31 generates a first virtual viewpoint image as a surrounding image based on the images captured by each of the rear camera C1B, the front camera C1F, the left camera C1L, and the right camera C1R. The first virtual viewpoint, which is the virtual viewpoint of the first virtual viewpoint image, is the virtual operator viewpoint E1' corresponding to the position of the operator's eyes when the operator is seated in the driver's seat in the cabin 10 (see FIG. 4). However, the virtual operator viewpoint E1' may be outside the cabin 10.
[0063]
In the present embodiment, the coordinates of the virtual operator viewpoint E1', which is the first virtual viewpoint, are derived based on the operator viewpoint E1, that is, the position of the eyes of the operator OP when the operator OP is seated in the driver's seat DS of the remote control room RC (see FIG. 4). The coordinates of the operator viewpoint E1 are transmitted from the remote controller 40. The image generation unit 31 can derive the coordinates of the virtual operator viewpoint E1' by converting the coordinates of the operator viewpoint E1 in the operation room coordinate system into coordinates in the excavator coordinate system.
[0064]
Further, in the present embodiment, the first virtual viewpoint
image corresponds to an image projected on the inner
peripheral surface of a virtual cylindrical virtual projection
surface surrounding the first virtual viewpoint. The virtual
projection plane may be the inner surface of a virtual sphere
surrounding the first virtual viewpoint, or may be the inner
surface of a virtual rectangular parallelepiped or cube
surrounding the first virtual viewpoint. By looking at the first virtual viewpoint image generated in this way, the operator OP can grasp the situation around the excavator 100 in three dimensions. That is, by looking at the first virtual viewpoint image, the operator OP can more accurately grasp, for example, the depth of the loading platform of a dump truck located in front of the excavator 100, the height of an embankment on the ground, or the depth of a hole in the ground.
[0065]
The image derived from the first virtual viewpoint image
displayed on the display device D1 is a part of the first virtual
viewpoint image generated by the image generation unit 31.
Specifically, the area within the entire first virtual viewpoint image that is displayed by the display device D1 is decided based on the direction of the line of sight of the operator OP seated in the driver's seat DS of the remote control room RC. Information regarding the direction of the line of sight of the operator OP is transmitted from the remote controller 40.
[0066]
In this way, the image generation unit 31 generates a first
virtual viewpoint image as a surrounding image based on the
image output by the image pickup device C1 and the
coordinates of the operator viewpoint E1 transmitted from the
remote controller 40. Then, the image generation unit 31 cuts out a part of the generated first virtual viewpoint image as a partial peripheral image based on the information regarding the direction of the line of sight of the operator OP transmitted from the remote controller 40, and transmits the cut-out partial peripheral image to the display device D1 in the remote control room RC.
[0067]
The excavator state specifying unit 32 is configured to specify
the state of the excavator 100. In this embodiment, the state of
the excavator 100 includes the position and orientation of the
excavator 100. The position of the excavator 100 is, for
example, the latitude, longitude, and altitude of the reference
point R2 at the excavator 100. The excavator state specifying
unit 32 specifies the position and orientation of the excavator
based on the output of the positioning device 18.
[0068]
The actuator drive unit 33 is configured to drive the actuator
mounted on the excavator 100. In the present embodiment,
the actuator drive unit 33 generates and outputs an operation
signal for each of the plurality of solenoid valves included in
the solenoid valve unit 45 based on the operation signal
transmitted from the remote controller 40.
[0069]
Each solenoid valve that receives the operation signal
increases or decreases the pilot pressure acting on the pilot
port of the corresponding control valve in the control valve unit
17. As a result, the hydraulic actuator corresponding to each
control valve operates at a speed corresponding to the stroke
amount of the control valve.
[0070]
The remote controller 40 has an operator state identification
unit 41, an image composition unit 42, and an operation signal
generation unit 43 as functional elements.
[0071]
The operator state specifying unit 41 is configured to specify
the state of the operator OP in the remote control room RC.
The state of the operator OP includes the position of the eyes
of the operator OP and the direction of the line of sight. The
operator state specifying unit 41 specifies the eye position and
the direction of the line of sight of the operator OP based on
the output of the indoor image pickup device C2. Specifically,
the operator state specifying unit 41 performs various image processing on the image captured by the indoor image pickup device C2, and specifies the coordinates of the eye position of the operator OP in the operation room coordinate system as the coordinates of the operator viewpoint E1 (see FIG. 4). Further, the operator state specifying unit 41
performs various image processing on the image captured by
the indoor image pickup device C2 to specify the direction of
the line of sight of the operator OP in the operation room
coordinate system.
[0072]
The operator state specifying unit 41 may derive the coordinates of the operator viewpoint E1 and the direction of the line of sight of the operator OP based on the output of a device other than the indoor image pickup device C2, such as a LIDAR installed in the remote control room RC or an inertial measurement unit attached to the display device D1. In this case, the inertial measurement unit may include a positioning device.
[0073]
Then, the operator state specifying unit 41 transmits
information regarding the coordinates of the operator viewpoint
E1 and the direction of the line of sight of the operator OP
toward the excavator 100 through the communication device
T2.
[0074]
The image composition unit 42 is configured to generate a
composite image by synthesizing a partial peripheral image
transmitted from the controller 30 and another image. In the
present embodiment, another image is an image of the
operation device 26 captured by the indoor image pickup
device C2. The image of the operating device 26 may be a
graphic such as computer graphics representing the state of
the operating device 26. Further, another image may include
an image or a figure showing the inside of the cabin 10.
[0075]
Specifically, the image synthesizing unit 42 generates a
second virtual viewpoint image based on the image captured
by the indoor image pickup apparatus C2. The second virtual
viewpoint, which is the virtual viewpoint of the second virtual
viewpoint image, is the operator viewpoint E1.
[0076]
In the present embodiment, the second virtual viewpoint image
is an image projected on the inner peripheral surface of a
virtual cylindrical virtual projection surface surrounding the
second virtual viewpoint, like the first virtual viewpoint image.
The virtual projection plane may be the inner surface of a
virtual sphere surrounding the second virtual viewpoint, or may
be the inner surface of a virtual rectangular parallelepiped or
cube surrounding the second virtual viewpoint.
[0077]
The image derived from the second virtual viewpoint image
displayed on the display device D1 is a part of the second
virtual viewpoint image generated by the image synthesizing
unit 42. Specifically, the area within the entire second virtual viewpoint image that is displayed by the display device D1 is decided based on the direction of the line of sight of the operator OP seated in the driver's seat DS of the remote control room RC.
[0078]
In this way, the image synthesizing unit 42 generates the
second virtual viewpoint image based on the image output by
the indoor image pickup apparatus C2 and the coordinates of
the operator viewpoint E1. Then, the image synthesizing unit
42 cuts out a part of the generated second virtual viewpoint
image based on the information regarding the direction of the
line of sight of the operator OP.
[0079]
Then, the image synthesizing unit 42 synthesizes the cut-out
image and the partial peripheral image transmitted from the
controller 30.
[0080]
Another image may be a design surface image which is an
image generated based on the design surface information DI.
In the present embodiment, the image synthesizing unit 42 superimposes and displays on the partial surrounding image, as a design surface image, a figure such as computer graphics representing the position of the design surface, based on the design surface information DI stored in advance in the non-volatile storage device constituting the remote controller 40. The design surface is the ground when the
excavation work using the excavator 100 is completed. By
looking at the design surface, the operator can grasp the state
around the excavator 100 when the excavation work is
completed even before the excavation work is completed. In
this case, the image synthesizing unit 42 determines the
position where the design surface image should be
superimposed and displayed in the partial peripheral image
based on the position and orientation of the excavator specified by the excavator state specifying unit 32.
[0081]
The operation signal generation unit 43 is configured to
generate an operation signal. In the present embodiment, the
operation signal generation unit 43 is configured to generate
an operation signal based on the output of the operation
sensor 29.
[0082]
With the above-described configuration, the remote control
system SYS allows the operator OP in the remote control room
RC to remotely control the excavator 100 at a remote location.
At that time, the remote control system SYS enables the
operator OP to visually recognize the surrounding image
generated based on the image captured by the image pickup device C1 attached to the excavator 100 in real time. Specifically, the remote control system SYS can display a part of the surrounding image, generated mainly based on the image captured by the image pickup device C1, on the head-mounted display worn by the operator OP as the display device D1. The operator OP who sees the image displayed on the display device D1 can obtain a sense of reality as if he/she were operating the excavator 100 in the cabin 10. Alternatively, if the virtual operator viewpoint E1' is outside the cabin 10, for example at a position several meters ahead of the cabin 10, the operator OP can get a sense of realism as if he/she were remotely operating the excavator 100 from outside the cabin 10 in the immediate vicinity of the bucket 6.
[0083]
Further, the remote control system SYS is configured so that the position of the eyes and the direction of the face (line of sight) of the operator OP can be specified based on the image captured by the indoor image pickup device C2 installed in the remote control room RC. The remote control system SYS is configured to change the content of the image displayed on the display device D1 according to changes in the eye position and the face (line of sight) direction of the operator OP. Specifically, the remote control system SYS is configured to determine which area of the first viewpoint conversion image and the second viewpoint conversion image to display according to changes in the eye position and the face (line of sight) direction of the operator OP. Therefore, the operator OP can see the image in a desired direction simply by turning his face in that direction.
[0084]
Further, the remote control system SYS can display a figure or
an image representing the state of the operation device 26
installed in the remote control room RC on the display device
D1 together with a partial peripheral image which is a part of
the peripheral image. The remote control system SYS displays
a figure or an image representing the state of the operation
device 26 on an image portion corresponding to the actual
installation position of the operation device 26. Therefore, the
operator OP can recognize the state of the operating device 26
which is actually invisible even when the non-transmissive
head-mounted display is worn as the display device D1.
Further, the operator OP can easily re-grasp the operation
device 26 even when the operator releases the operation
device 26.
[0085]
Next, the image displayed on the display device D1 will be
described with reference to FIG. 7. FIG. 7 shows two
configuration examples of the image displayed on the display
device D1. Specifically, FIG. 7A shows an example of an image obtained by synthesizing a figure representing the inside of the cabin 10 and a first virtual viewpoint image (strictly speaking, a partial peripheral image that is a part of the first virtual viewpoint image). FIG. 7B shows an example of an image obtained by synthesizing a first virtual viewpoint image (strictly speaking, a partial peripheral image that is a part of the first virtual viewpoint image) and a second virtual viewpoint image (strictly speaking, a part of the second virtual viewpoint image).
[0086]
In the image shown in FIG. 7A, the figures representing the inside of the cabin 10 include the left pillar figure G1, the right pillar figure G2, the left operation lever figure G3L, the right operation lever figure G3R, the left traveling lever figure G4L, and the right traveling lever figure G4R. The figure representing the inside of the cabin 10 may include the figure of the engine speed adjustment dial 75.
[0087]
The figure G3L of the left operating lever is configured to
change according to a change in the output of the operating
sensor 29. That is, the figure G3L of the left operation lever is
displayed so as to be able to represent the magnitude of the
actual operation amount (operation angle) of the left operation
lever operated by the operator OP in the remote control room
RC. The figure G3L of the left operation lever may be
configured to change according to a change in the image of
the operation device 26 captured by the indoor image pickup
device C2. The same applies to the figure G3R of the right
operating lever, the figure G4L of the left traveling lever, and
the figure G4R of the right traveling lever.
[0088]
The figure G3L of the left operating lever shown in FIG. 7A
shows that the actual left operating lever is tilted in the arm
closing direction (forward direction). Further, the figure G3R of
the right operating lever shown in FIG. 7A shows that the
actual right operating lever is tilted in the boom lowering
direction (forward direction). In the image shown in FIG. 7A,
the state when each of the left operation lever and the right
operation lever is not operated is represented by a broken line.
[0089]
Further, in the image shown in FIG. 7A, the first virtual
viewpoint image includes the image G7 of the boom 4, the
image GB of the arm 5, and the image G9 of the bucket 6.
[0090]
Further, the image shown in FIG. 7A is configured to change
according to the change in the direction of the face (line of
sight) when the operator OP changes the direction of the face
(line of sight).
[0091]
The operator OP who sees the image shown in FIG. 7A can obtain a sense of realism as if he/she is operating the excavator 100 in the cabin 10.
[0092]
Further, since the figure G3L of the left operation lever is
displayed, the operator OP can easily grasp how much the left
operation lever is tilted. Similarly, since the figure G3R of the
right operating lever is displayed, the operator OP can easily
grasp how much the right operating lever is tilted.
[0093]
Further, since the figure G4L of the left traveling lever is displayed, the operator OP does not need to remove the non-transmissive head-mounted display worn as the display device D1 from his head even when moving the hand holding the left operating lever to the left traveling lever. This is because the figure G4L of the left traveling lever is displayed in the image portion corresponding to the actual position of the left traveling lever installed in the remote control room RC. Similarly, since the figure G3L of the left operating lever is displayed, the operator OP does not need to remove the display device D1 from his head even when moving the hand holding the left traveling lever to the left operating lever. This is because the figure G3L of the left operation lever is displayed in the image portion corresponding to the actual position of the left operation lever installed in the remote control room RC. The same applies to the figure G3R of the right operating lever and the figure G4R of the right traveling lever.
[0094]
The display of the figure representing the inside of the cabin
10 may be omitted. This is to ensure that the image portion
corresponding to the spatial portion outside the cabin 10 is
displayed over as wide a range as possible.
[0095]
Further, the figure representing the inside of the cabin 10 may
be composed of the figure G3L of the left operating lever, the
figure G3R of the right operating lever, the figure G4L of the
left traveling lever, and the graphic G4R of the right traveling
lever. In this case, the left operating lever figure G3L, the right operating lever figure G3R, the left traveling lever figure G4L, and the right traveling lever figure G4R may be configured to be displayed only when a predetermined condition is satisfied. The predetermined condition is that, for example,
all of the left operation lever, the right operation lever, the left
travel lever, and the right travel lever installed in the remote
control room RC are in the non-operation state. This is to
facilitate the movement of the hand between the operating
lever and the traveling lever as described above.
[0096]
Further, the figure representing the inside of the cabin 10 may
be composed of the figure G3L of the left operation lever and
the figure G3R of the right operation lever. Even in this case,
the figure G3L of the left operation lever and the figure G3R of
the right operation lever may be configured to be displayed
only when a predetermined condition is satisfied.
[0097]
Further, the figure representing the inside of the cabin 10 may
include the figure of the arm (including the hand portion) of the
operator OP. In this case, the figure of the arm of the operator
OP may be displayed so as to change according to the change
of the actual state of the arm of the operator OP. The actual
state of the arm of the operator OP may be specified, for
example, based on the image captured by the indoor image
pickup device C2.
[0098]
The image shown in FIG. 7B is generated by synthesizing the
first virtual viewpoint image and the second virtual viewpoint
image. In the image shown in FIG. 7B, the second virtual
viewpoint image includes the image G3LG of the left operation
lever installed in the remote control room RC and the image
G3RG of the right operation lever installed in the remote
control room RC ...
[0099]
In the image shown in FIG. 7B, the image G3LG of the left operation lever includes an image of a part of the left arm of the operator OP holding the left operation lever with the left hand, and the image G3RG of the right operation lever includes an image of a part of the right arm of the operator OP holding the right operation lever with the right hand.
[0100]
Since the image G3LG of the left operation lever is generated
based on the image captured by the indoor image pickup
device C2, the actual operation contents of the left operation
lever operated by the operator OP in the remote control room
RC are displayed as they are ... The same applies to the
image G3RG of the right operating lever.
[0101]
The operator OP who sees the image shown in FIG. 7B can get a sense of reality as if he/she is operating the excavator 100 in the immediate vicinity of the excavation point.
[0102]
Further, since the image G3LG of the left operation lever is
displayed, the operator OP can easily grasp how much the left
operation lever is tilted. Similarly, since the image G3RG of the
right operating lever is displayed, the operator OP can easily
grasp how much the right operating lever is tilted.
[0103]
The three-dimensional coordinates of the virtual operator viewpoint E1' related to the image shown in FIG. 7(B) may be closer to the bucket 6 than the three-dimensional coordinates of the virtual operator viewpoint E1' related to the image shown in FIG. 7(A). In this case, the image shown in FIG. 7(B) may be an enlarged version of the portion of the image shown in FIG. 7(A) around the bucket 6.
[0104]
Even in that case, that is, even when the virtual operator viewpoint E1' is outside the cabin 10, the remote control system SYS may display the image G3LG of the left operation lever and the image G3RG of the right operation lever at appropriate positions in the image shown in FIG. 7(B). The appropriate positions of the left operation lever image G3LG and the right operation lever image G3RG as seen from the virtual operator viewpoint E1' are, for example, positions corresponding to the respective positions of the left operation lever and the right operation lever in the remote control room RC as seen from the viewpoint at the position of the eyes of the operator OP in the remote control room RC.
[0105]
With this configuration, even when the virtual operator viewpoint E1' is outside the cabin 10, the operator OP can easily grasp how much each of the left operation lever and the right operation lever is tilted ...
[0106]
Further, in the image shown in FIG. 7B, the figure showing the inside of the cabin 10 is not displayed. Therefore, the operator OP can also see the image portion corresponding to the space portion that, if he were in the cabin 10, could not be seen because it would be blocked by the left pillar, the right pillar, and the like. However, the remote control system SYS may display the figure G1 of the left pillar and the figure G2 of the right pillar at appropriate positions in the image shown in FIG. 7(B). In this case, the operator OP can more accurately recognize the positional relationship between the actual bucket 6 and the actual cabin 10 by grasping, for example, the size of the distance between the figure G1 of the left pillar and the figure G2 of the right pillar in the image displayed on the display device D1.
[0107]
Further, the display of the image G3LG of the left operation
lever and the image G3RG of the right operation lever may be
omitted. This is to ensure that the image portion corresponding
to the spatial portion outside the cabin 10 is displayed over as
wide a range as possible.
[0108]
Further, the image G3LG of the left operation lever and the
image G3RG of the right operation lever may be configured to
be displayed only when a predetermined condition is satisfied.
The predetermined condition is that, for example, the left
operation lever and the right operation lever installed in the
remote control room RC are in the non-operation state. This is
to facilitate the movement of the hand between the operating
lever and the traveling lever as described above.
Claim 2:
2. The remote operation support system according to claim 1,
wherein the index image is an image having such transparency that at least part of a superimposed region in the captured image is visible or a line map representing at least one of outline or a ridge line of the index member (Wu: FIG. 7(B) and paragraph [0100]; in an embodiment, only image G3LG and image G3RG are displayed, with the remainder of the cabin captured by the indoor image pickup device not displayed and thus transparent; additionally, FIG. 7(B) itself shows a line map representing at least one of an outline or a ridge line of the index member.).
[0100]
Since the image G3LG of the left operation lever is generated
based on the image captured by the indoor image pickup
device C2, the actual operation contents of the left operation
lever operated by the operator OP in the remote control room
RC are displayed as they are ... The same applies to the
image G3RG of the right operating lever.
Claim 3:
3. The remote operation support system according to claim 1 wherein the index image is displayed with a size according to a magnification of the captured image (Wu: paragraph [0106]; the left pillar and the right pillar have a display size that enables “In this case, the operator OP can obtain the actual bucket 6 and the actual bucket 6 by grasping, for example, the size of the distance between the figure G1 of the left pillar and the figure G2 of the right pillar in the image displayed on the display device D1”; thus, the left pillar and the right pillar are displayed with a perspective size that facilitates obtaining the actual bucket 6.).
[0106]
Further, in the image shown in FIG. 7B, the figure showing the
inside of the cabin 10 is not displayed. Therefore, the operator
OP can also see the image portion corresponding to the space
portion that could not be seen because it was blocked by the
left pillar, the right pillar, and the like if it was in the cabin 10.
However, the remote control system SYS may display the
figure G1 of the left pillar and the figure G2 of the right pillar at
appropriate positions in the image shown in FIG. 7 (B). In this
case, the operator OP can obtain the actual bucket 6 and the
actual bucket 6 by grasping, for example, the size of the
distance between the figure G1 of the left pillar and the figure
G2 of the right pillar in the image displayed on the display
device D1. The positional relationship with the cabin 10 can be
recognized more accurately.
Claim 4:
4. The remote operation support system according to claim 1, wherein in a case where rotation operation of an upper rotating body on which the imaging device is mounted in the work machine with respect to a lower traveling body is started, the synthetic image is displayed on the remote output interface (Wu: FIGs. 7(A) and 7(B) and paragraphs [0095], [0096], and [0108]; no operation of the left operation lever, the right operation lever, the left traveling lever, and the right traveling lever controls display or no display of the left operating lever figure G3L, the right operating lever figure G3R, the left traveling lever figure G4L, and the right traveling lever figure G4R; the BRI of the claimed “rotation operation of an upper rotating body on which the imaging device is mounted in the work machine with respect to a lower traveling body is started” covers many embodiments such as:
Wu displaying FIG. 7(A) while rotating and not rotating; and
Wu having the excavator 100 in the ON state, ready to receive rotate commands from the left traveling lever or the right traveling lever, which controls display of the levers: the left operating lever figure G3L, the right operating lever figure G3R, the left traveling lever figure G4L, and the right traveling lever figure G4R.). Wu’s FIGs. 7(A) and 7(B) and paragraphs [0095], [0096], and [0108] follow:
[Wu, FIGs. 7(A) and 7(B), reproduced as media_image1.png (840 × 564, greyscale)]
[0095]
Further, the figure representing the inside of the cabin 10 may
be composed of the figure G3L of the left operating lever, the
figure G3R of the right operating lever, the figure G4L of the
left traveling lever, and the figure G4R of the right traveling
lever. In this case, the left operating lever figure G3L, the right
operating lever figure G3R, the left traveling lever figure G4L,
and the right traveling lever figure G4R may be configured to
be displayed only when a predetermined condition is satisfied.
The predetermined condition is that, for example,
all of the left operation lever, the right operation lever, the left
travel lever, and the right travel lever installed in the remote
control room RC are in the non-operation state. This is to
facilitate the movement of the hand between the operating
lever and the traveling lever as described above.
[0096]
Further, the figure representing the inside of the cabin 10 may
be composed of the figure G3L of the left operation lever and
the figure G3R of the right operation lever. Even in this case,
the figure G3L of the left operation lever and the figure G3R of
the right operation lever may be configured to be displayed
only when a predetermined condition is satisfied.
[0108]
Further, the image G3LG of the left operation lever and the
image G3RG of the right operation lever may be configured to
be displayed only when a predetermined condition is satisfied.
The predetermined condition is that, for example, the left
operation lever and the right operation lever installed in the
remote control room RC are in the non-operation state. This is
to facilitate the movement of the hand between the operating
lever and the traveling lever as described above.
Claim 5:
5. The remote operation support system according to claim 1, wherein in a case where rotation operation of an upper rotating body on which the imaging device is mounted in the work machine with respect to a lower traveling body comes to an end, display of the synthetic image on the remote output interface is ended (Wu: FIGs. 7(A) and 7(B) and paragraphs [0095], [0096], and [0108]; no operation of the left operation lever, the right operation lever, the left traveling lever, and the right traveling lever controls display or no display of the left operating lever figure G3L, the right operating lever figure G3R, the left traveling lever figure G4L, and the right traveling lever figure G4R; the BRI of the claimed “rotation operation of an upper rotating body on which the imaging device is mounted in the work machine with respect to a lower traveling body comes to an end” covers many embodiments such as:
Wu not displaying the levers G3L, G3R, G4L, and G4R when the left traveling lever G4L and/or the right traveling lever G4R are not operated resulting in no and/or no more rotation but the left operating lever figure G3L and/or the right operating lever figure G3R being operated; and
Wu having the excavator 100 rotating at the time no more operation of the left traveling lever or the right traveling lever occurs, which controls no display of the levers: the left operating lever figure G3L, the right operating lever figure G3R, the left traveling lever figure G4L, and the right traveling lever figure G4R.). Wu’s FIGs. 7(A) and 7(B) and paragraphs [0095], [0096], and [0108] were reproduced above.
Claim 6:
6. A remote operation support composite system comprising:
the remote operation support system according to claim 1; and
at least one of
a work machine (Wu: FIGs. 1 and 4, excavator 100 is a work machine, at least paragraphs [0059] and [0060].) or
a remote operation device (Wu: FIG. 3, paragraph [0060], remote control system SYS has remote controller 40 installed in the remote control room RC.).
[0059]
Next, a configuration example of the remote control system of
the excavator 100 will be described with reference to FIG. 6.
FIG. 6 is a functional block diagram showing a configuration
example of the remote control system SYS.
[0060]
The remote control system SYS mainly includes a controller
30, a solenoid valve unit 45, an image pickup device C1, and a
communication device T1 mounted on the excavator 100, and
an operation sensor 29, a remote controller 40, the indoor
image pickup device C2, the display device D1, and the
communication device T2 installed in the remote control room RC.
Claim 7:
Claim 7 is the method counterpart of system claim 1, and remote operation support method claim 7 is rejected for the same reasons given for remote operation support system claim 1.
Prior Art
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
Ito et al., US Patent Application Publication No. 2020/0399861, is pertinent to claims 4 and 5 regarding display in response to rotation of the upper body relative to the lower body, for example an arrow 50; refer to the abstract, paragraphs [0006], [0007], [0032]-[0037], and [0044]-[0047].
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to JEFFERY A BRIER whose telephone number is (571)272-7656. The examiner can normally be reached on Mon-Fri from 8:30am-3:00pm.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Xiao M Wu, can be reached at telephone number 571-272-7761. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of an application may be obtained from Patent Center. Status information for published applications may be obtained from Patent Center. Status information for unpublished applications is available through Patent Center for authorized users only. Should you have questions about access to Patent Center, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free).
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) Form at https://www.uspto.gov/patents/uspto-automated-interview-request-air-form.
/JEFFERY A BRIER/Primary Examiner, Art Unit 2613