DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Election/Restrictions
Applicant’s election without traverse of Group I in the reply filed on 1/6/2026 is acknowledged. However, upon further search and consideration, Groups I, II, and III appear to be obvious over each other; therefore, the restriction is withdrawn and all of the claims presented will be examined.
Claim Interpretation
The following is a quotation of 35 U.S.C. 112(f):
(f) Element in Claim for a Combination. – An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.
The following is a quotation of pre-AIA 35 U.S.C. 112, sixth paragraph:
An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.
The claims in this application are given their broadest reasonable interpretation using the plain meaning of the claim language in light of the specification as it would be understood by one of ordinary skill in the art. The broadest reasonable interpretation of a claim element (also commonly referred to as a claim limitation) is limited by the description in the specification when 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is invoked.
As explained in MPEP § 2181, subsection I, claim limitations that meet the following three-prong test will be interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph:
(A) the claim limitation uses the term “means” or “step” or a term used as a substitute for “means” that is a generic placeholder (also called a nonce term or a non-structural term having no specific structural meaning) for performing the claimed function;
(B) the term “means” or “step” or the generic placeholder is modified by functional language, typically, but not always linked by the transition word “for” (e.g., “means for”) or another linking word or phrase, such as “configured to” or “so that”; and
(C) the term “means” or “step” or the generic placeholder is not modified by sufficient structure, material, or acts for performing the claimed function.
Use of the word “means” (or “step”) in a claim with functional language creates a rebuttable presumption that the claim limitation is to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites sufficient structure, material, or acts to entirely perform the recited function.
Absence of the word “means” (or “step”) in a claim creates a rebuttable presumption that the claim limitation is not to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is not interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites function without reciting sufficient structure, material or acts to entirely perform the recited function.
Claim limitations in this application that use the word “means” (or “step”) are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action. Conversely, claim limitations in this application that do not use the word “means” (or “step”) are not being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action.
This application includes one or more claim limitations that do not use the word “means,” but are nonetheless being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, because the claim limitation(s) uses a generic placeholder that is coupled with functional language without reciting sufficient structure to perform the recited function and the generic placeholder is not preceded by a structural modifier. Such claim limitation(s) is/are: “display unit” and “input unit” in claim 1, and “calculation unit” and “communication unit” in claims 12 and 19.
Because this/these claim limitation(s) is/are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, it/they is/are being interpreted to cover the corresponding structure described in the specification as performing the claimed function, and equivalents thereof.
If applicant does not intend to have this/these limitation(s) interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, applicant may: (1) amend the claim limitation(s) to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph (e.g., by reciting sufficient structure to perform the claimed function); or (2) present a sufficient showing that the claim limitation(s) recite(s) sufficient structure to perform the claimed function so as to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1-19 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.
[101 Analysis Step 1]
Step 1 of the 2019 Guidance first looks to whether the claimed invention is directed to a statutory category, namely a process, machine, manufacture, or composition of matter.
Claim 1 is directed to an input device (i.e., a machine), claim 12 is directed to a server (i.e., a machine), claim 18 is directed to a method (i.e., a process), and claim 19 is directed to a system (i.e., a machine). Thus, claims 1, 12, 18, and 19 fall within one of the four statutory categories (Step 1: YES).
[101 Analysis Step 2A, Prong I]
Regarding Prong I of the Step 2A analysis in the 2019 PEG, the claims are to be analyzed to determine whether they recite subject matter that falls within one of the following groups of abstract ideas: a) mathematical concepts, b) certain methods of organizing human activity, and/or c) mental processes.
Independent claims 1 and 12 include limitations that recite an abstract idea (emphasized below) and will be used as representative claims for the remainder of the 101 rejection.
Claim 1 recites:
An input device comprising:
a display unit which displays a map including a path along which a movable working machine can travel; and
an input unit for a user to input a first instruction to display an environment object indicating an environment related to the path along with the path on the map displayed on the display unit,
wherein the display unit displays the environment object at a position corresponding to the path on the map when the first instruction is input to the input unit.
Claim 12 recites:
A server which can communicate with an input device which displays a map including a path along which a movable working machine can travel, wherein environment information indicating an environment related to the path is input by a user on the map in the input device,
the server includes:
a calculation unit which calculates the path along which the movable working machine travels; and
a communication unit which receives, from the input device, position information indicating a position on the map at which the environment information is input by the user in the input device, and
the calculation unit re-calculates a new path along which the movable working machine should travel based on the position information and position information of the path when the communication unit receives the position information from the input device.
The examiner submits that the foregoing bolded limitation(s) constitute a “mental process” because, under their broadest reasonable interpretation, the claims cover performance of the limitations in the human mind. For example, “to display…”, “calculates the path…”, and “re-calculates a new path…”, in the context of the claims, encompass a person drawing a path on the map and looking at and using the data collected to formulate a judgment. Accordingly, the claims recite at least one abstract idea.
[101 Analysis Step 2A, Prong II]
Regarding Prong II of the Step 2A analysis in the 2019 PEG, the claims are to be analyzed to determine whether the claim, as a whole, integrates the abstract idea into a practical application. As noted in the 2019 PEG, it must be determined whether any additional elements in the claim beyond the abstract idea integrate the exception into a practical application in a manner that imposes a meaningful limit on the judicial exception. The courts have indicated that additional elements merely using a computer to implement an abstract idea, adding insignificant extra-solution activity, or generally linking use of a judicial exception to a particular technological environment or field of use do not integrate a judicial exception into a “practical application.”
In the present case, the additional limitations beyond the above-noted abstract idea are as follows (where the underlined portions are the “additional limitations” while the bolded portions continue to represent the “abstract idea”):
Claim 1 recites:
An input device comprising:
a display unit which displays a map including a path along which a movable working machine can travel; and
an input unit for a user to input a first instruction to display an environment object indicating an environment related to the path along with the path on the map displayed on the display unit,
wherein the display unit displays the environment object at a position corresponding to the path on the map when the first instruction is input to the input unit.
Claim 12 recites:
A server which can communicate with an input device which displays a map including a path along which a movable working machine can travel, wherein environment information indicating an environment related to the path is input by a user on the map in the input device,
the server includes:
a calculation unit which calculates the path along which the movable working machine travels; and
a communication unit which receives, from the input device, position information indicating a position on the map at which the environment information is input by the user in the input device, and
the calculation unit re-calculates a new path along which the movable working machine should travel based on the position information and position information of the path when the communication unit receives the position information from the input device.
For the following reason(s), the examiner submits that the above-identified additional limitations do not integrate the above-noted abstract idea into a practical application.
Regarding the additional limitations of “a display unit which displays a map including a path along which a movable working machine can travel”, “an input unit for a user to input a first instruction”, “input device which displays a map including a path along which a movable working machine can travel”, and “a communication unit which receives, from the input device, position information indicating a position on the map at which the environment information is input by the user in the input device”, the examiner submits that these limitations are insignificant extra-solution activities that merely use a computer (processing circuitry of a computer system) to perform the process. In particular, the display unit is recited at a high level of generality (i.e., as a general means of displaying information from the input device), and the input unit amounts to mere data gathering, which is a form of insignificant extra-solution activity. Lastly, the “server”, “calculation unit”, and “communication unit” are recited at a high level of generality (i.e., as a generic processor performing a generic computer function) such that they amount to no more than mere instructions to apply the exception using a generic computer component.
Thus, taken alone, the additional elements do not integrate the abstract idea into a practical application. Further, looking at the additional limitation(s) as an ordered combination or as a whole, the limitation(s) add nothing that is not already present when looking at the elements taken individually. For instance, there is no indication that the additional elements, when considered as a whole, reflect an improvement in the functioning of a computer or an improvement to another technology or technical field, apply or use the above-noted judicial exception to effect a particular treatment or prophylaxis for a disease or medical condition, implement/use the above-noted judicial exception with a particular machine or manufacture that is integral to the claim, effect a transformation or reduction of a particular article to a different state or thing, or apply or use the judicial exception in some other meaningful way beyond generally linking the use of the judicial exception to a particular technological environment, such that the claim as a whole is no more than a drafting effort designed to monopolize the exception (MPEP § 2106.05). Accordingly, the additional limitation(s) do/does not integrate the abstract idea into a practical application because it/they do(es) not impose any meaningful limits on practicing the abstract idea.
[101 Analysis Step 2B]
Regarding Step 2B of the Revised Guidance, representative independent claims 1, 12, 18, and 19 do not include additional elements (considered both individually and as an ordered combination) that are sufficient to amount to significantly more than the judicial exception, for the same reasons as those discussed above with respect to determining that the claims do not integrate the abstract idea into a practical application. As discussed above, the additional elements of merely having a person draw the map “to display an environment object”, and further using a computer to perform the calculating and recalculating…, amount to nothing more than mere instructions to apply the exception using a generic computer component. Mere instructions to apply an exception using a generic computer component cannot provide an inventive concept. And as discussed above, the additional limitations of “displays a map…”, “input a first instruction…”, and “receives, from the input device, position information indicating…” are insignificant extra-solution activities. Hence, the claims are not patent eligible.
Dependent claims 2-11 and 13-17 do not recite any further limitations that cause the claims to be directed towards statutory subject matter. The claims merely further recite the abstract idea. Each of the further limitations expounds upon the abstract ideas and does not recite additional elements integrating the abstract ideas into a practical application or additional elements that are not well-understood, routine, or conventional. Therefore, dependent claims 2-11 and 13-17 are similarly rejected as being directed towards non-statutory subject matter.
Therefore, claims 1-19 are ineligible under 35 U.S.C. § 101.
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.
Claims 1-6, 8-9, 11-15, and 17-19 are rejected under 35 U.S.C. 102(a)(2) as being anticipated by Pub. No. US 2024/0004384 A1 to Nishi et al. (Nishi).
In Reference to Claim 1
An input device (23) comprising:
a display unit (23) which displays a map including a path along which a movable working machine can travel (see at least Nishi Figs. 1 and 6, and paragraphs [0080], [0105] “the operation display unit 23 is a user interface including a display unit, such as a liquid crystal display or an organic EL display, that displays various pieces of information and an operating unit, such as a touch panel, a mouse, or a keyboard, that receives operations. On an operation screen displayed on the display unit, an operator can operate the operating unit to register various pieces of information (such as work vehicle information, field information, and work information, which will be described later”, “Upon the reception process unit 212 accepts the starting operation from the operator, it causes the teaching operation screen D2 to display a route start position image Ms in which the route start position Ts1 is indicated at the entrance/exit H1 of the field F1 on the map (see FIG. 8A). As illustrated in FIG. 8A, the reception process unit 212 displays a guide route Mr (dotted line) that connects the field F1 and the field F2 as information to support the travel operation of the teaching travel. This allows the operator to perform the manual travel operation (manually drive) according to the guide route Mr, thereby making it easier to perform the teaching travel operation”); and
an input unit (touch screen of #23) for a user to input a first instruction (starting operation) to display an environment object indicating an environment related to the path along with the path on the map displayed on the display unit (see at least Nishi Fig. 6 and paragraph [0094] “In addition, by allowing the operator to perform registration operation on the operation terminal 20, the setting process unit 211 sets information associated with the position and the shape of the field, the work start position to start work (travel start position), the work end position to finish work (travel end position), and the work direction, etc”),
wherein the display unit (23) displays the environment object at a position corresponding to the path on the map when the first instruction is input to the input unit (touch screen of #23) (see at least Nishi Figs. 8 and 17-23 and paragraphs [0107] “Upon the reception process unit 212 accepts the starting operation from the operator, it causes the teaching operation screen D2 to display a route start position image Ms in which the route start position Ts1 is indicated at the entrance/exit H1 of the field F1 on the map (see FIG. 8A). As illustrated in FIG. 8A, the reception process unit 212 displays a guide route Mr (dotted line) that connects the field F1 and the field F2 as information to support the travel operation of the teaching travel. This allows the operator to perform the manual travel operation (manually drive) according to the guide route Mr, thereby making it easier to perform the teaching travel operation” and “For example, when multiple obstacles are detected, the operation control unit 21 inquires of the operator about whether or not to register each of the obstacles as the non-avoidance travel target, it may store the position information of the obstacle which is selected as the non-avoidance travel target in association with the inter-field route. For example, if the obstacle sensor 18 detects the obstacles B1 and B2 on the road R0, the operation control unit 21 displays the obstacles B1 and B2 selectably on the teaching operation screen D2 shown in FIG. 17. The operator can select (touch operation) the icon images of obstacles B1 and B2 on the teaching operation screen D2. Upon the operator selects the obstacle B1, the operation control unit 21 inquires of the operator about whether or not to resister the obstacle B1 as the non-avoidance travel target. 
If the operator selects the answer (“YES”) indicating that obstacle B1 should be excluded from the avoidance travel target, the operation control unit 21 registers the obstacle B1 as the non-avoidance travel target. Subsequently, if the operator selects the obstacle B2, the operation control unit 21 inquires of the operator about whether or not to register the obstacle B2 as the non-avoidance travel target. If the operator selects the answer (“NO”) indicating that obstacle B2 should not be excluded from the avoidance travel target, the operation control unit 21 registers the obstacle B1 as the avoidance travel target. In this way, the operation control unit 21 may be able to register the detected obstacles as the avoidance travel target or the non-avoidance travel target for each obstacle”).
In Reference to Claim 2
The input device according to claim 1 (see rejection to claim 1 above), wherein the first instruction includes:
an instruction to indicate to start the first instruction (see at least Nishi Figs. 3 and paragraphs [0073] “The travel processor 111 controls travel of the work vehicle 10. Specifically, the travel processor 111 causes the work vehicle 10 to start autonomous travel upon acquiring a work start command from the operation terminal 20. For example, when the operator presses down a work start button on an operation screen of the operation terminal 20, the operation terminal 20 outputs the work start instruction to the work vehicle 10. Upon obtaining the work start instruction from the operation terminal 20, the travel processor 111 causes the work vehicle 10 to start the autonomous travel. Thus, the work vehicle 10 starts the autonomous travel in accordance with the target route R1 (see FIG. 4A) in the field F1, for example and starts working by the work machine 14. The work vehicle 10 starts the autonomous travel in accordance with the target route R2 (see FIG. 4B) in the field F2, for example and starts working by the work machine 14. The work vehicle 10 also performs the autonomous travel along the inter-field route R12 (see FIG. 3) on the road R0, for example. That is, the travel processor 111 can cause the work vehicle 10 to autonomously travel along the inter-field route R12 on the road R0 outside the field. For example, the travel processor 111 causes the work vehicle 10 to autonomously travel on the road R0 that connects the field F1 and the field F2 along the inter-field route R12 set on the road R0. The target route and the inter-field route that the work vehicle 10 travels autonomously are generated at the operation terminal 20, for example. By obtaining the data of the target route and the inter-field route from the operation terminal 20, the work vehicle 10 can travel autonomously along the target route and the inter-field route”); and
an instruction to specify the path displayed along with the environment object on the map (see at least Nishi Fig. 3 and paragraph [0076] “the work vehicle 10 uniformly executes the avoidance travel process (travel restriction process) when the obstacle sensor 18 detects the obstacle. Therefore, as shown in FIG. 3, when the work vehicle 10 autonomously travels along the preset inter-field route R12 and the fixed objects such as trees B1 and B2 are detected as the obstacles, the avoidance travel process is executed even though there is no risk of collision as long as the work vehicle travels along the inter-field route R12. For example, while traveling along the inter-field route R12, when trees B1 are detected, the work vehicle 10 decelerates, or when trees B2 are detected, the work vehicle 10 is stopped. This results in a problem that reduces traveling efficiency due to unsmooth movement between fields. In contrast, the autonomous travel system 1 according to the present embodiment can improve the work efficiency in the work vehicle 10 that travels autonomously between multiple fields, as described below”).
In Reference to Claim 3
The input device according to claim 1 (see rejection to claim 1 above), wherein the input unit (touch screen of #23) further inputs a second instruction for the user to select a particular type of environment related to the path among a plurality of environments of different types (see at least Nishi Figs. 5 and 19 and paragraph [0195] “Upon the operation control unit 21 acquires the recognition result from the detection process device 17, it causes the teaching operation screen D2 shown in FIG. 19B to display the trees B5, B6, and B7 selectably. The operation control unit 21 may cause the teaching operation screen D2 to display the captured images of the trees B5, B6, and B7. The operator can select the trees to be registered as non-avoidance travel target on the teaching operation screen D2. The operation control unit 21 registers the trees selected by the operator as the non-avoidance travel target.”).
In Reference to Claim 4
The input device according to claim 3 (see rejection to claim 3 above), wherein the input unit (touch screen of #23) inputs a predetermined operation by the user on a position on the path on the map,
the display unit (touch screen of #23) displays options for the user to select a particular type of environment among the plurality of environments when the predetermined operation is input to the input unit (touch screen of #23) (see at least Nishi Figs. 5 and 19 and paragraph [0194] “As another embodiment of the disclosure, the operation control unit 21 may store information about the type of obstacle in association with the position information of the obstacle when the work vehicle 10 detects the obstacle during travel on the road R0 based on a teaching operation by the operator. For example, when the work vehicle 10 is equipped with a camera, the detection process device 17 acquires image data captured by the camera to recognize the type of obstacle. For example, the detection process device 17 analyzes the image data to recognize the trees, vehicles, materials, utility poles, pylons, people, etc. For example, as shown in FIG. 19A, the detection process device 17 recognizes three trees B5, B6, and B7”), and
the input unit (touch screen of #23) inputs an instruction for the user to select, among the options, a particular type of environment related to the path (see at least Nishi Figs. 5 and 19 and paragraph [0195] “Upon the operation control unit 21 acquires the recognition result from the detection process device 17, it causes the teaching operation screen D2 shown in FIG. 19B to display the trees B5, B6, and B7 selectably. The operation control unit 21 may cause the teaching operation screen D2 to display the captured images of the trees B5, B6, and B7. The operator can select the trees to be registered as non-avoidance travel target on the teaching operation screen D2. The operation control unit 21 registers the trees selected by the operator as the non-avoidance travel target.”).
In Reference to Claim 5
The input device according to claim 1 (see rejection to claim 1 above), wherein the environment related to the path includes a travel rule on the path (see at least Nishi Figs. 5 and 18B paragraphs [0193] “As another embodiment of the present disclosure, when the obstacle is detected while traveling on the road R0 based on the teaching operation by the operator, the operation control unit 21 may cause the operation terminal 20 to display information indicating that the obstacle is detected in a display mode according to the distance from the work vehicle 10 to the obstacle. For example, as shown in FIG. 18A, in the case where the distance from the work vehicle 10 to the obstacle B3 when the work vehicle 10 detects the obstacle B3 is different from the distance from the work vehicle 10 to the obstacle B4 when the work vehicle 10 detects the obstacle B4, the operation control unit 21 causes the operation terminal 20 to display the inter-field route in a different display mode. For example, as shown in FIG. 18B, in the case where the obstacle B3 is included in a stop area K11 (see FIG. 5), the operation control unit 21 cause the screen to display the partial route r3 of the section where obstacle B3 is detected with a solid red line, and in the case where the obstacle B4 is included in a deceleration area K12 (see FIG. 5), the operation control unit 21 causes the screen to display the partial route r4 of the section where the obstacle B4 is detected with a solid blue line. The operator may also be able to set the display mode to any other mode. In this case, the operation control unit 21 may cause the screen to display the partial routes r3 and r4 in a display mode according to the setting content set by the operator”).
In Reference to Claim 6
The input device according to claim 5 (see rejection to claim 5 above), wherein the travel rule includes at least one of prohibiting entry to the path and limiting a travel speed on the path (see at least Nishi Figs. 5 and 18B paragraphs [0193] “As another embodiment of the present disclosure, when the obstacle is detected while traveling on the road R0 based on the teaching operation by the operator, the operation control unit 21 may cause the operation terminal 20 to display information indicating that the obstacle is detected in a display mode according to the distance from the work vehicle 10 to the obstacle. For example, as shown in FIG. 18A, in the case where the distance from the work vehicle 10 to the obstacle B3 when the work vehicle 10 detects the obstacle B3 is different from the distance from the work vehicle 10 to the obstacle B4 when the work vehicle 10 detects the obstacle B4, the operation control unit 21 causes the operation terminal 20 to display the inter-field route in a different display mode. For example, as shown in FIG. 18B, in the case where the obstacle B3 is included in a stop area K11 (see FIG. 5), the operation control unit 21 cause the screen to display the partial route r3 of the section where obstacle B3 is detected with a solid red line, and in the case where the obstacle B4 is included in a deceleration area K12 (see FIG. 5), the operation control unit 21 causes the screen to display the partial route r4 of the section where the obstacle B4 is detected with a solid blue line. The operator may also be able to set the display mode to any other mode. In this case, the operation control unit 21 may cause the screen to display the partial routes r3 and r4 in a display mode according to the setting content set by the operator”).
In Reference to Claim 8
The input device according to claim 1 (see rejection to claim 1 above), wherein the input unit (touch screen of #23) further inputs a third instruction from a user to stop displaying the environment object displayed on the display unit, and the display unit stops displaying the environment object displayed on the map when the third instruction is input to the input unit (touch screen of #23) (see at least Nishi Figs. 5 and 20 paragraphs [0199] and [0200] “For example, during autonomous traveling along the inter-field route R12 shown in FIG. 19A by the work vehicle 10, when the detection process device 17 newly detects an obstacle B8 which is different from the obstacles (trees B5, B6, B7) registered in the obstacle information table E2, the operation control unit 21 inquires of the operator about whether to additionally resister the obstacle B8 as the non-avoidance travel target on the travel screen D4 shown in FIG. 20. If the detection process device 17 obtains a command issued by the operator to register the obstacle B8 as the non-avoidance travel, it resisters the obstacle B8 as the non-avoidance travel target. This allows the work vehicle 10 to perform normal travel process without performing avoidance travel process even if the work vehicle 10 detects the obstacle B8 during travel on the inter-field route R12 next time. If the operator knows in advance an obstacle to be newly disposed, the operator may perform registration operation of the obstacle as the non-avoidance travel target in advance” and “As another embodiment of the present disclosure, if the detection process device 17 does not detect the obstacle which is associated with the inter-field route during autonomous traveling along the inter-field route by the work vehicle 10, the operation control unit 21 may inquire of the operator whether or not to exclude the obstacle from the non-avoidance travel target. 
When the operation control unit 21 obtains a command issued by the operator to exclude the obstacle from the non-avoidance travel target, it deletes the position information of the obstacle which is associated with the inter-field route”).
In Reference to Claim 9
The input device according to claim 1 (see rejection to claim 1 above), wherein the input unit (touch screen of #23) inputs from the user a position on the map on the path at which the environment object is to be displayed, and
the display unit displays the environment object along with the path on the display unit when the position input to the input unit (touch screen of #23) is off the path on the map and the position input to the input unit (touch screen of #23) is within a predetermined range from the path (see at least Nishi Figs. 21 paragraphs [0201] “For example, if the detection process device 17 does not detect the trees B3 registered in the obstacle information table E2 during autonomous traveling along the inter-field route R12 shown in FIG. 19A by the work vehicle 10, the trees B3 may no longer exists due to cutting down or the like. In such a case, if the work vehicle 10 detects a new obstacle in the area where the trees B3 existed, it is necessary for the work vehicle 10 to perform avoidance travel process. Therefore, on the travel screen D4 shown in FIG. 21, the operation control unit 21 inquires of the operator whether or not to exclude the trees B3, which is registered as the non-avoidance travel target, from the non-avoidance travel target (set as an area to be detected). If the operation control unit 21 obtains a command issued by the operator to exclude the trees B3 from the non-avoidance travel target, it deletes the obstacle information corresponding to the trees B3 from the obstacle information table E2. For example, if the detection process device 17 detects afterwards the new obstacle in the area where the trees B3 existed during autonomous traveling along the inter-field route R12 by the work vehicle 10, this allows the work vehicle 10 to perform avoidance travel process properly”).
In Reference to Claim 11
The input device according to claim 1 (see rejection to claim 1 above), wherein the movable working machine (10) is an unmanned vehicle which performs autonomous navigation (see at least Nishi Figs. 1-2 and paragraph [0042]).
In Reference to Claim 12
A server (21) which can communicate with an input device (23) which displays a map including a path along which a movable working machine can travel, wherein environment information indicating an environment related to the path is input by a user on the map in the input device (23) (see at least Nishi Figs. 1 and paragraphs [0080] “the operation display unit 23 is a user interface including a display unit, such as a liquid crystal display or an organic EL display, that displays various pieces of information and an operating unit, such as a touch panel, a mouse, or a keyboard, that receives operations. On an operation screen displayed on the display unit, an operator can operate the operating unit to register various pieces of information (such as work vehicle information, field information, and work information, which will be described later”),
the server (21) includes:
a calculation unit (21) which calculates the path along which the movable working machine travels (see at least Nishi Figs. 1, 12 and 17-23 and paragraphs [0090], [0174] “The operation control unit 21 has control devices such as a CPU, a ROM, and a RAM. The CPU is a processor that executes various types of arithmetic processes. The ROM is a non-volatile storage unit that stores a control program, such as a BIOS, an OS, or the like, that causes the CPU to execute various types of arithmetic processes in advance. The RAM is a volatile or non-volatile storage unit that stores various pieces of information and is used as a temporary storage memory for the various processes executed by the CPU. The operation control unit 21 controls the operation terminal 20 by causing the CPU to execute various types of control programs stored in advance in the ROM or the storage unit 22” and “For example, as shown in FIG. 12, the operation control unit 21 generates the inter-field route R12 which connects the route start position Ts1 at the entrance/exit H1 of field F1 and the route end position Te2 at the entrance/exit H2 of field F2 and passes through the road R0. The operation control unit 21 notifies the operator of whether or not to register the generated inter-field route R12 on the teaching operation screen D2 illustrated in FIGS. 8C and 10B. When the operator confirms the inter-field route R12 on the teaching operation screen D2 and presses down the registration button, the operation control unit 21 acquires the command to register the inter-field route R12, and registers the inter-field route R12 in association with the field F1 and the field F2”); and
a communication unit (21) which receives, from the input device, position information indicating a position on the map at which information indicating the environment is input by the user in the input device (see at least Nishi Figs. 1, 12 and 17-23 and paragraphs [0092] and [0193] “the setting process unit 211 sets information associated with the work vehicle 10 (hereinafter, referred to as work vehicle information), information associated with the field (hereinafter, referred to as field information), and information associated with the specific way how to perform work (hereinafter, referred to as work information). The setting process unit 211 receives setting operations from the operator on the setting screen D1 illustrated in FIG. 6 to register each setting information” and “As another embodiment of the present disclosure, when the obstacle is detected while traveling on the road R0 based on the teaching operation by the operator, the operation control unit 21 may cause the operation terminal 20 to display information indicating that the obstacle is detected in a display mode according to the distance from the work vehicle 10 to the obstacle. For example, as shown in FIG. 18A, in the case where the distance from the work vehicle 10 to the obstacle B3 when the work vehicle 10 detects the obstacle B3 is different from the distance from the work vehicle 10 to the obstacle B4 when the work vehicle 10 detects the obstacle B4, the operation control unit 21 causes the operation terminal 20 to display the inter-field route in a different display mode. For example, as shown in FIG. 18B, in the case where the obstacle B3 is included in a stop area K11 (see FIG. 5), the operation control unit 21 cause the screen to display the partial route r3 of the section where obstacle B3 is detected with a solid red line, and in the case where the obstacle B4 is included in a deceleration area K12 (see FIG. 
5), the operation control unit 21 causes the screen to display the partial route r4 of the section where the obstacle B4 is detected with a solid blue line. The operator may also be able to set the display mode to any other mode. In this case, the operation control unit 21 may cause the screen to display the partial routes r3 and r4 in a display mode according to the setting content set by the operator”),
the calculation unit (21) re-calculates a new path along which the movable working machine should travel based on the position information and position information of the path when the communication unit receives the position information from the input device (23) (see at least Nishi Figs. 1, 12 and 17-23 and paragraphs [0210] and [0211] “So, for example, the operation control unit 21 may cause the screen to display a message indicating that the area to be detected by the obstacle sensor 18 is changed or a message prompting the operator to execute the teaching travel again when the operation control unit 21 changes the inter-field route R12 after generating the inter-field route R12” and “When the inter-field route R12 shown in FIG. 23 is changed to the inter-field route R12 shown in FIG. 22, the obstacle B11, which is registered as the non-avoidance travel target, is excluded from the area to be detected by the obstacle sensor 18. In this case, the operation control unit 21 may automatically exclude the obstacle B11 from the non-avoidance travel target instead of the above mentioned messages”).
In Reference to Claim 13
The server according to claim 12 (see rejection to claim 12 above), wherein the environment related to the path (12) includes a travel rule on the path (R12) (see at least Nishi Figs. 1, 12 and 16-23 and paragraphs [0182] “The vehicle control device 11 determines that the obstacle is not the non-avoidance travel target if the position of the obstacle detected by the obstacle sensor 18 does not fall within the range from the start point position to the end point position registered in the obstacle information table E2, or if the obstacle is registered in the obstacle information table E2 in association with information that the obstacle is the avoidance travel target”).
In Reference to Claim 14
The server according to claim 12 (see rejection to claim 12 above), wherein the communication unit further: receives information indicating a state of the path; and transmits to the input device (23) the map further including information indicating the state of the path when the communication unit receives information indicating the state of the path (see at least Nishi Figs. 1, 12 and 16-23 and paragraphs [0207] and [208] “Here, it may be possible, for example, to replace the inter-field route R12 shown in FIG. 22 with the inter-field route R12 shown in FIG. 23. Specifically, the inter-field route R12 shown in FIG. 22 includes turning routes a1 and a2 each having a small turning radius. In this case, if the operator wishes to cause the work vehicle 10 to autonomously travel on a gradual turning route with a large turning radius, the operator performs editing operation on the generated inter-field route R12. For example, on the teaching operation screen D2, the operator selects the position image indicated by the position information or the line image of the generated route to perform editing operation by deleting the position image or line image, or changing the line type (straight line or curved line). This allows the operation control unit 21 to change the turning route to the turning route a3 having a larger turning radius as shown in FIG. 23” and “If the inter-field route R12 is changed from the route shown in FIG. 22 to the route shown in FIG. 23, the area to be detected by the obstacle sensor 18 may be different between when the work vehicle 10 travels on the inter-field route R12 shown in FIG. 22 and when the work vehicle 10 travels on the inter-field route R12 shown the inter-field route R12 shown in FIG. 23. Therefore, for example, when the inter-field route R12 shown in FIG. 
22 is set as the non-avoidance travel target, the obstacles B12 and B13 detected by the obstacle sensor 18 may be registered as the non-avoidance travel target (dotted frame area), and the obstacle B11 may not be detected by the obstacle sensor 18”).
In Reference to Claim 15
The server according to claim 12 (see rejection to claim 12 above), wherein the communication unit receives information indicating a state of the path detected by the movable working machine (see at least Nishi Figs. 1, 12 and 16-23 and paragraphs [0185] “In this way, when the obstacle is detected while the work vehicle 10 is autonomously traveling along the inter-field route, if the position of the obstacle is included in the position information of the obstacle associated with the inter-field route (non-avoidance travel target) (S34: Yes), the vehicle control device 11 does not cause the work vehicle 10 to perform the avoidance travel (S331). After steps S35 and S331, the vehicle control device 11 moves the process to the step S36. In step S34, if the position of the detected obstacle falls within the range where a margin is added to the position information of the obstacle associated with the inter-field route, the vehicle control device 11 may determine that the obstacle is the non-avoidance travel target”).
In Reference to Claim 17
The server according to claim 12 (see rejection to claim 12 above), wherein the movable working machine is an unmanned vehicle which performs autonomous navigation (see at least Nishi Figs. 1-2 and paragraph [0042]).
In Reference to Claim 18
An input method including:
displaying a map including a path along which a movable working machine can travel (see at least Nishi Figs. 1 and 6, and paragraphs [0080], [0105] “the operation display unit 23 is a user interface including a display unit, such as a liquid crystal display or an organic EL display, that displays various pieces of information and an operating unit, such as a touch panel, a mouse, or a keyboard, that receives operations. On an operation screen displayed on the display unit, an operator can operate the operating unit to register various pieces of information (such as work vehicle information, field information, and work information, which will be described later”, “Upon the reception process unit 212 accepts the starting operation from the operator, it causes the teaching operation screen D2 to display a route start position image Ms in which the route start position Ts1 is indicated at the entrance/exit H1 of the field F1 on the map (see FIG. 8A). As illustrated in FIG. 8A, the reception process unit 212 displays a guide route Mr (dotted line) that connects the field F1 and the field F2 as information to support the travel operation of the teaching travel. This allows the operator to perform the manual travel operation (manually drive) according to the guide route Mr, thereby making it easier to perform the teaching travel operation”);
obtaining, by an input unit (touch screen of #23), a first instruction from a user on the map which is displayed, wherein the first instruction is to display an environment object indicating an environment related to the path along with the path (see at least Nishi Figs. 6 and paragraphs [0094] “In addition, by allowing the operator to perform registration operation on the operation terminal 20, the setting process unit 211 sets information associated with the position and the shape of the field, the work start position to start work (travel start position), the work end position to finish work (travel end position), and the work direction, etc”); and
displaying the environment object at a position corresponding to the path on the map when the first instruction is input (see at least Nishi Figs. 8 and 17-23 and paragraphs [0107] “Upon the reception process unit 212 accepts the starting operation from the operator, it causes the teaching operation screen D2 to display a route start position image Ms in which the route start position Ts1 is indicated at the entrance/exit H1 of the field F1 on the map (see FIG. 8A). As illustrated in FIG. 8A, the reception process unit 212 displays a guide route Mr (dotted line) that connects the field F1 and the field F2 as information to support the travel operation of the teaching travel. This allows the operator to perform the manual travel operation (manually drive) according to the guide route Mr, thereby making it easier to perform the teaching travel operation” and “For example, when multiple obstacles are detected, the operation control unit 21 inquires of the operator about whether or not to register each of the obstacles as the non-avoidance travel target, it may store the position information of the obstacle which is selected as the non-avoidance travel target in association with the inter-field route. For example, if the obstacle sensor 18 detects the obstacles B1 and B2 on the road R0, the operation control unit 21 displays the obstacles B1 and B2 selectably on the teaching operation screen D2 shown in FIG. 17. The operator can select (touch operation) the icon images of obstacles B1 and B2 on the teaching operation screen D2. Upon the operator selects the obstacle B1, the operation control unit 21 inquires of the operator about whether or not to resister the obstacle B1 as the non-avoidance travel target. If the operator selects the answer (“YES”) indicating that obstacle B1 should be excluded from the avoidance travel target, the operation control unit 21 registers the obstacle B1 as the non-avoidance travel target. 
Subsequently, if the operator selects the obstacle B2, the operation control unit 21 inquires of the operator about whether or not to register the obstacle B2 as the non-avoidance travel target. If the operator selects the answer (“NO”) indicating that obstacle B2 should not be excluded from the avoidance travel target, the operation control unit 21 registers the obstacle B1 as the avoidance travel target. In this way, the operation control unit 21 may be able to register the detected obstacles as the avoidance travel target or the non-avoidance travel target for each obstacle”).
In Reference to Claim 19
A system comprising: the input device according to claim 1 (see rejection to claim 1 above);
a calculation unit (21) which calculates the path along which the movable working machine travels (see at least Nishi Figs. 1, 12 and 17-23 and paragraphs [0090], [0174] “The operation control unit 21 has control devices such as a CPU, a ROM, and a RAM. The CPU is a processor that executes various types of arithmetic processes. The ROM is a non-volatile storage unit that stores a control program, such as a BIOS, an OS, or the like, that causes the CPU to execute various types of arithmetic processes in advance. The RAM is a volatile or non-volatile storage unit that stores various pieces of information and is used as a temporary storage memory for the various processes executed by the CPU. The operation control unit 21 controls the operation terminal 20 by causing the CPU to execute various types of control programs stored in advance in the ROM or the storage unit 22” and “For example, as shown in FIG. 12, the operation control unit 21 generates the inter-field route R12 which connects the route start position Ts1 at the entrance/exit H1 of field F1 and the route end position Te2 at the entrance/exit H2 of field F2 and passes through the road R0. The operation control unit 21 notifies the operator of whether or not to register the generated inter-field route R12 on the teaching operation screen D2 illustrated in FIGS. 8C and 10B. When the operator confirms the inter-field route R12 on the teaching operation screen D2 and presses down the registration button, the operation control unit 21 acquires the command to register the inter-field route R12, and registers the inter-field route R12 in association with the field F1 and the field F2”); and
a communication unit (21) which receives, from the input device, position information indicating a position on the map at which information indicating the environment is input by the user in the input device (see at least Nishi Figs. 1, 12 and 17-23 and paragraphs [0092] and [0193] “the setting process unit 211 sets information associated with the work vehicle 10 (hereinafter, referred to as work vehicle information), information associated with the field (hereinafter, referred to as field information), and information associated with the specific way how to perform work (hereinafter, referred to as work information). The setting process unit 211 receives setting operations from the operator on the setting screen D1 illustrated in FIG. 6 to register each setting information” and “As another embodiment of the present disclosure, when the obstacle is detected while traveling on the road R0 based on the teaching operation by the operator, the operation control unit 21 may cause the operation terminal 20 to display information indicating that the obstacle is detected in a display mode according to the distance from the work vehicle 10 to the obstacle. For example, as shown in FIG. 18A, in the case where the distance from the work vehicle 10 to the obstacle B3 when the work vehicle 10 detects the obstacle B3 is different from the distance from the work vehicle 10 to the obstacle B4 when the work vehicle 10 detects the obstacle B4, the operation control unit 21 causes the operation terminal 20 to display the inter-field route in a different display mode. For example, as shown in FIG. 18B, in the case where the obstacle B3 is included in a stop area K11 (see FIG. 5), the operation control unit 21 cause the screen to display the partial route r3 of the section where obstacle B3 is detected with a solid red line, and in the case where the obstacle B4 is included in a deceleration area K12 (see FIG. 
5), the operation control unit 21 causes the screen to display the partial route r4 of the section where the obstacle B4 is detected with a solid blue line. The operator may also be able to set the display mode to any other mode. In this case, the operation control unit 21 may cause the screen to display the partial routes r3 and r4 in a display mode according to the setting content set by the operator”),
wherein the calculation unit (21) re-calculates a new path along which the movable working machine should travel based on the position information and position information of the path when the communication unit receives the position information from the input device (see at least Nishi Figs. 1, 12 and 17-23 and paragraphs [0210] and [0211] “So, for example, the operation control unit 21 may cause the screen to display a message indicating that the area to be detected by the obstacle sensor 18 is changed or a message prompting the operator to execute the teaching travel again when the operation control unit 21 changes the inter-field route R12 after generating the inter-field route R12” and “When the inter-field route R12 shown in FIG. 23 is changed to the inter-field route R12 shown in FIG. 22, the obstacle B11, which is registered as the non-avoidance travel target, is excluded from the area to be detected by the obstacle sensor 18. In this case, the operation control unit 21 may automatically exclude the obstacle B11 from the non-avoidance travel target instead of the above mentioned messages”).
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.
Claim(s) 7 and 16 are rejected under 35 U.S.C. 103 as being unpatentable over Nishi in view of Pub No. US 2024/0302845 A1 to Miyashita et al. (Miyashita).
In Reference to Claim 7
Nishi teaches (except for the bolded and italicized recitations below):
The input device according to claim 1 (see rejection to claim 1 above), wherein the environment related to the path (R12) includes a road surface state on the path (R12) (such as obstacle) (see at least Nishi Figs. 1, 12 and 17-23 and paragraphs 75-76, and 192-205).
Nishi teaches avoiding obstacles on the route; however, Nishi does not explicitly teach (the bolded and italicized recitations above) whether road surface conditions should be avoided. However, it was known in the art before the effective filing date of the claimed invention that road surface conditions, such as being muddy or caved in, affect the operation of the vehicle. For example, Miyashita teaches that not only obstacles but also road surface states, such as being muddy or caved in, should be determined to be avoided. Miyashita further teaches that performing such a step prevents a risk of collision and prevents traveling along a route that is difficult to pass (see at least Miyashita Fig. 1 and paragraphs 166 and 170). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the system of Nishi to avoid not only obstacles but also poor road surfaces, such as those that are muddy or caved in, as taught by Miyashita, in order to prevent a risk of collision and prevent traveling along a route that is difficult to pass.
In Reference to Claim 16
Nishi teaches (except for the bolded and italicized recitations below):
The server according to claim 15 (see rejection to claim 15 above), wherein the information indicating the state of the path includes at least any of information indicating that there is a puddle on the path, information indicating that there is mud on the path, information indicating that there is a bump or dent on the path, and information indicating a degree of a gradient in the path (see at least Nishi Figs. 1, 12 and 17-23 and paragraphs 75-76, and 192-205).
Nishi teaches avoiding obstacles on the route; however, Nishi does not explicitly teach (the bolded and italicized recitations above) whether road surface conditions should be avoided. However, it was known in the art before the effective filing date of the claimed invention that road surface conditions, such as being muddy or caved in, affect the operation of the vehicle. For example, Miyashita teaches that not only obstacles but also road surface states, such as being muddy or caved in, should be determined to be avoided. Miyashita further teaches that performing such a step prevents a risk of collision and prevents traveling along a route that is difficult to pass (see at least Miyashita Fig. 1 and paragraphs 166 and 170). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the system of Nishi to avoid not only obstacles but also poor road surfaces, such as those that are muddy or caved in, as taught by Miyashita, in order to prevent a risk of collision and prevent traveling along a route that is difficult to pass.
Claim 10 is rejected under 35 U.S.C. 103 as being unpatentable over Nishi in view of Pub No. US 2016/0080438 A1 to Liang (Liang).
In Reference to Claim 10
Nishi teaches (except for the bolded and italicized recitations below):
The input device according to claim 1 (see rejection to claim 1 above), wherein the display unit displays a lattice pattern object by superimposing the lattice pattern object on the map, the first instruction includes an instruction to specify a position of one compartment of the lattice pattern object, and the display unit displays the environment object at a position corresponding to the path on the map when at least a part of the path is included within the one compartment of the lattice pattern object instructed by the first instruction (see at least Nishi Figs. 1, 12 and 17-23 and paragraphs 73-76, and 192-205).
Nishi does not teach (the bolded and italicized recitations above) displaying a lattice pattern object by superimposing the lattice pattern object on the map and displaying the environment object at a position corresponding to the path on the map when at least a part of the path is included within the one compartment of the lattice pattern object. However, it was known in the art before the effective filing date of the claimed invention to display a lattice pattern object by superimposing it on the map and to display the environment object at a position corresponding to the path on the map when at least a part of the path is included within the one compartment of the lattice pattern object. For example, Liang teaches displaying a lattice pattern object by superimposing the lattice pattern object on the map and displaying the environment object at a position corresponding to the path on the map when at least a part of the path is included within the one compartment of the lattice pattern object. Liang further teaches that performing such a step provides zooming and further information in the grid (see at least Liang Figs. 3 and 12-13 and paragraphs 7-8, 41-42, 53, and 62). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the system of Nishi to display a lattice pattern object by superimposing the lattice pattern object on the map and to display the environment object at a position corresponding to the path on the map when at least a part of the path is included within the one compartment of the lattice pattern object, as taught by Liang, in order to provide zooming and further information in the grid.
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
Pub No. US 2024/0338037 A1 to Miyashita et al. (Miyashita1) teaches avoiding objects by rerouting the route based on the object.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to BRANDON DONGPA LEE whose telephone number is (571)270-3525. The examiner can normally be reached Monday - Friday, 8:00 am - 5:00 pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Aniss Chad can be reached at (571) 270-3832. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/BRANDON D LEE/Primary Examiner, Art Unit 3662 February 20, 2026