DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Status of Claims
This action is in reply to Application No. 18/726,131, filed on 07/02/2024.
Claims 1-14 are currently pending and have been examined.
This action is made NON-FINAL.
The examiner would like to note that this application is now being handled by examiner Jeffrey Chalhoub.
Information Disclosure Statement
The information disclosure statement (IDS) submitted on July 2, 2024 is in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement is being considered by the examiner.
Drawings
The drawings are objected to as failing to comply with 37 CFR 1.84(p)(5) because they do not include the following reference sign(s) mentioned in the description:
“12104”.
The drawings are objected to as failing to comply with 37 CFR 1.84(p)(5) because they include the following reference character(s) not mentioned in the description:
“12001”,
“12104F”.
Corrected drawing sheets in compliance with 37 CFR 1.121(d), or amendment to the specification to add the reference character(s) in the description in compliance with 37 CFR 1.121(b) are required in reply to the Office action to avoid abandonment of the application. Any amended replacement drawing sheet should include all of the figures appearing on the immediate prior version of the sheet, even if only one figure is being amended. Each drawing sheet submitted after the filing date of an application must be labeled in the top margin as either “Replacement Sheet” or “New Sheet” pursuant to 37 CFR 1.121(d). If the changes are not accepted by the examiner, the applicant will be notified and informed of any required corrective action in the next Office action. The objection to the drawings will not be held in abeyance.
Claim Interpretation
The claims in this application are given their broadest reasonable interpretation using the plain meaning of the claim language in light of the specification as it would be understood by one of ordinary skill in the art. The broadest reasonable interpretation of a claim element (also commonly referred to as a claim limitation) is limited by the description in the specification when 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is invoked.
As explained in MPEP § 2181, subsection I, claim limitations that meet the following three-prong test will be interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph:
(A) the claim limitation uses the term “means” or “step” or a term used as a substitute for “means” that is a generic placeholder (also called a nonce term or a non-structural term having no specific structural meaning) for performing the claimed function;
(B) the term “means” or “step” or the generic placeholder is modified by functional language, typically, but not always linked by the transition word “for” (e.g., “means for”) or another linking word or phrase, such as “configured to” or “so that”; and
(C) the term “means” or “step” or the generic placeholder is not modified by sufficient structure, material, or acts for performing the claimed function.
Use of the word “means” (or “step”) in a claim with functional language creates a rebuttable presumption that the claim limitation is to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites sufficient structure, material, or acts to entirely perform the recited function.
Absence of the word “means” (or “step”) in a claim creates a rebuttable presumption that the claim limitation is not to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is not interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites function without reciting sufficient structure, material or acts to entirely perform the recited function.
Claim limitations in this application that use the word “means” (or “step”) are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action. Conversely, claim limitations in this application that do not use the word “means” (or “step”) are not being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action.
This application includes one or more claim limitations that do not use the word “means,” but are nonetheless being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, because the claim limitation(s) uses a generic placeholder that is coupled with functional language without reciting sufficient structure to perform the recited function and the generic placeholder is not preceded by a structural modifier. Such claim limitation(s) is/are:
“a detection unit” in claims 1-7 and 11,
“a map generation unit” in claims 11-12,
“a travel control unit” in claim 12.
Because this/these claim limitation(s) is/are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, it/they is/are being interpreted to cover the corresponding structure described in the specification as performing the claimed function, and equivalents thereof.
If applicant does not intend to have this/these limitation(s) interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, applicant may: (1) amend the claim limitation(s) to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph (e.g., by reciting sufficient structure to perform the claimed function); or (2) present a sufficient showing that the claim limitation(s) recite(s) sufficient structure to perform the claimed function so as to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph.
Claim Rejections - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Under 35 U.S.C. 112(f), if a claim contains only a single limitation and that limitation invokes 35 U.S.C. 112(f), the claim is indefinite because it can take on a wide variety of structural embodiments and steps, and there are no other limitations that are definitely required. A limitation that properly invokes 35 U.S.C. 112(f) must be accompanied by at least one other limitation (e.g., 35 U.S.C. 112(f) explicitly states that it applies to “an element in a claim for a combination”).
Claims 1-12 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.
Claim limitation “a detection unit that detects an obstacle location” in claim 1, for instance, invokes 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. However, the written description fails to disclose the corresponding structure, material, or acts for performing the entire claimed function and to clearly link the structure, material, or acts to the function. The specification is devoid of adequate structure to perform the claimed function. In particular, the specification merely states the claimed function of detecting “an obstacle location that becomes obstruction for traveling in the traveling direction on the basis of the three-dimensional point cloud determined to be the traveling surface and the two-dimensional point cloud corresponding to the traveling direction”. There is no disclosure of any particular structure, either explicitly or inherently, to detect. The use of “an obstacle location that becomes obstruction for traveling in the traveling direction on the basis of the three-dimensional point cloud determined to be the traveling surface and the two-dimensional point cloud corresponding to the traveling direction” is not adequate structure for performing the detecting function because it does not describe a particular structure for the function and does not provide enough description for one of ordinary skill in the art to understand which structure or structures perform(s) the claimed function. Therefore, the claim is indefinite and is rejected under 35 U.S.C. 112(b) or pre-AIA 35 U.S.C. 112, second paragraph.
Claim limitation “a map generation unit that generates a map” in claim 11, for instance, invokes 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. However, the written description fails to disclose the corresponding structure, material, or acts for performing the entire claimed function and to clearly link the structure, material, or acts to the function. The specification is devoid of adequate structure to perform the claimed function. In particular, the specification merely states the claimed function of generating “a map based on two-dimensional information on a basis of information indicating the obstacle location detected by the detection unit”. There is no disclosure of any particular structure, either explicitly or inherently, to generate. The use of “a map based on two-dimensional information on a basis of information indicating the obstacle location detected by the detection unit” is not adequate structure for performing the generating function because it does not describe a particular structure for the function and does not provide enough description for one of ordinary skill in the art to understand which structure or structures perform(s) the claimed function. Therefore, the claim is indefinite and is rejected under 35 U.S.C. 112(b) or pre-AIA 35 U.S.C. 112, second paragraph.
Claim limitation “a travel control unit that controls traveling of a mobile body” in claim 12 invokes 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. However, the written description fails to disclose the corresponding structure, material, or acts for performing the entire claimed function and to clearly link the structure, material, or acts to the function. The specification is devoid of adequate structure to perform the claimed function. In particular, the specification merely states the claimed function of controlling “traveling of a mobile body on a basis of the map generated by the map generation unit”. There is no disclosure of any particular structure, either explicitly or inherently, to control. The use of “traveling of a mobile body on a basis of the map generated by the map generation unit” is not adequate structure for performing the controlling function because it does not describe a particular structure for the function and does not provide enough description for one of ordinary skill in the art to understand which structure or structures perform(s) the claimed function. Therefore, the claim is indefinite and is rejected under 35 U.S.C. 112(b) or pre-AIA 35 U.S.C. 112, second paragraph.
Applicant may:
(a) Amend the claim so that the claim limitation will no longer be interpreted as a limitation under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph;
(b) Amend the written description of the specification such that it expressly recites what structure, material, or acts perform the entire claimed function, without introducing any new matter (35 U.S.C. 132(a)); or
(c) Amend the written description of the specification such that it clearly links the structure, material, or acts disclosed therein to the function recited in the claim, without introducing any new matter (35 U.S.C. 132(a)).
If applicant is of the opinion that the written description of the specification already implicitly or inherently discloses the corresponding structure, material, or acts and clearly links them to the function so that one of ordinary skill in the art would recognize what structure, material, or acts perform the claimed function, applicant should clarify the record by either:
(a) Amending the written description of the specification such that it expressly recites the corresponding structure, material, or acts for performing the claimed function and clearly links or associates the structure, material, or acts to the claimed function, without introducing any new matter (35 U.S.C. 132(a)); or
(b) Stating on the record what the corresponding structure, material, or acts, which are implicitly or inherently set forth in the written description of the specification, perform the claimed function. For more information, see 37 CFR 1.75(d) and MPEP §§ 608.01(o) and 2181.
Because the Applicant has properly invoked 112(f) but the specification does not provide a clear linking statement as to what the structural equivalents are, the Applicant’s claim limitations will be afforded their broadest reasonable interpretation.
The following is a quotation of the first paragraph of 35 U.S.C. 112(a):
(a) IN GENERAL.—The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor or joint inventor of carrying out the invention.
The following is a quotation of the first paragraph of pre-AIA 35 U.S.C. 112:
The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor of carrying out his invention.
Claims 1-12 are rejected under 35 U.S.C. 112(a) or 35 U.S.C. 112 (pre-AIA), first paragraph, as failing to comply with the written description requirement. The claim(s) contains subject matter which was not described in the specification in such a way as to reasonably convey to one skilled in the relevant art that the inventor or a joint inventor, or for applications subject to pre-AIA 35 U.S.C. 112, the inventor(s), at the time the application was filed, had possession of the claimed invention. As described above, the disclosure does not provide adequate structure to perform the claimed functions of, for example, detecting an obstacle location, generating a map, and controlling traveling of a mobile body in claims 1 and 11-12. The specification does not demonstrate that applicant has made an invention that achieves the claimed functions because the invention is not described with sufficient detail that one of ordinary skill in the art can reasonably conclude that the inventor had possession of the claimed invention.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1-14 are rejected under 35 U.S.C. 101 as being directed to non-statutory subject matter because the claimed invention is directed to an abstract idea without reciting significantly more. The claims are rejected in accordance with the 2019 Revised Patent Subject Matter Eligibility Guidance (Federal Register, Vol. 84, No. 4, pp. 50-57 (January 7, 2019)).
Step One: Does the Claim Fall Within a Statutory Category?
Yes. Claim 1 is directed to an information processing apparatus (machine). Dependent claims 2-12 are likewise directed to an information processing apparatus (machine). Claim 13 is directed to an information processing method (process). Finally, claim 14 is directed to an information processing program (article of manufacture).
Step Two A, Prong One: Is a Judicial Exception Recited?
Yes. Taking claim 1 as one example, the claim recites “detects an obstacle location that becomes obstruction for traveling in a traveling direction on a basis of a three-dimensional point cloud determined to be a traveling surface and a two-dimensional point cloud corresponding to the traveling direction.” This limitation, as drafted, is a simple process that, under its broadest reasonable interpretation, covers performance of the limitation in the mind. That is, nothing in the claim elements precludes the step from practically being performed in the mind. For example, the claim encompasses an individual analyzing a path and its environmental surroundings, noticing a vehicle traversing the path, reporting the path’s characteristics and environmental surroundings, including obstacles and/or other entities, to the vehicle, and providing driving instructions to the vehicle to traverse the path so as to avoid any potential collision with the obstacles/entities. Thus, the claim recites a mental process.
Step Two A, Prong Two: Is the Abstract Idea Integrated into a Practical Application?
No. Claim 1 recites one additional element – a detection unit. The detection unit is recited at a high level of generality (i.e., as a generic means for performing the recited detection) such that it amounts to no more than mere instructions to apply the exception using a generic detection unit. Accordingly, the additional element does not integrate the abstract idea into a practical application because it does not impose any meaningful limits on practicing the abstract idea.
The abstract idea recited in claims 1-14 is thus a mental process.
Step Two B: Does the Claim Provide an Inventive Concept?
No. Regarding claim 1, the claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above with respect to integration of the abstract idea into a practical application, the additional element of using a detection unit amounts to no more than mere instructions to apply the exception using a generic detection unit. Mere instructions to apply an exception using a detection unit cannot provide an inventive concept.
Dependent Claims
The dependent claims merely further define the abstract idea by adding field-of-use limitations and do not add anything to the abstract idea set forth in the independent claims such that the invention would amount to significantly more than the abstract idea.
Claims 2-12 recite field-of-use limitations that simply further limit the abstract idea set forth in claim 1. These claims do not contain further limitations that render them subject matter eligible.
For example, dependent claim 2 merely recites the well-understood, routine, and conventional computing functions of data transmission and gathering, which likewise do not render the claim subject matter eligible.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 1-6 and 8-14 are rejected under 35 U.S.C. 103 as being unpatentable over Takahashi (WO 2019176278 A1) in view of Hanaoka (U.S. Pub. No. 2015/0362921 A1).
Regarding Claim 1:
Takahashi teaches:
An information processing apparatus comprising a detection unit that detects an obstacle location that becomes obstruction for traveling in a traveling direction on a basis of a three-dimensional point cloud determined to be a traveling surface, (“In order to achieve the above object, an information processing apparatus according to an embodiment of the present technology includes an acquisition unit, a calculation unit, and a determination unit. The acquisition unit acquires information related to a height position of a sensor capable of detecting peripheral information of the moving body. The calculation unit calculates a determination region for determining a situation around the moving body based on the acquired information on the height position of the sensor. The determination unit determines a state of the periphery of the moving body based on the calculated determination area and the peripheral information detected by the sensor. In this information processing apparatus, a determination region for determining the situation around the moving body is calculated based on information on the height position of the sensor. This makes it possible to easily and accurately determine the situation around the moving body. The calculation unit may determine whether there is an obstacle around the moving body. The acquisition unit may acquire shape data related to the periphery of the moving body. In this case, the determination unit may determine a situation around the moving body based on the relationship between the detected shape data and the determination region. The acquisition unit may acquire three-dimensional point cloud data related to the periphery of the moving body. In this case, the determination unit may determine the situation around the moving body by determining whether or not the point data included in the three-dimensional point cloud data is included in the determination region. 
The calculation unit may change the height position of the determination region according to a change in the height position of the sensor. The calculation unit may calculate one or more determination planes that define the determination area. In this case, the one or more determination surfaces may include at least one of a first determination surface corresponding to the ground and a second determination surface corresponding to the ceiling surface.” (Takahashi: Description) Takahashi further mentions “An information processing method according to an embodiment of the present technology is an information processing method executed by a computer system, and includes acquiring information related to a height position of a sensor capable of detecting peripheral information of a moving object. Based on the acquired information on the height position of the sensor, a determination region for determining a situation around the moving body is calculated. Based on the calculated determination area and the surrounding information detected by the sensor, a situation around the moving body is determined.” (Takahashi: Description))
Takahashi does not teach, but Hanaoka teaches:
and a two-dimensional point cloud corresponding to the traveling direction. (See (Hanaoka: Background Art – 4th-5th paragraphs, Description of Embodiments – 47th, 59th, and 62nd paragraphs, and Other Embodiment – 155th paragraph))
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Takahashi with the aforementioned teachings of Hanaoka in order to create an effective information processing apparatus, method, and program. Before the effective filing date of the claimed invention, one of ordinary skill in the art would have been motivated to incorporate Takahashi’s information processing device, information processing method, program, and mobile body with Hanaoka’s surrounding environment recognition device, method, and autonomous mobile system in order to detect an obstacle location that becomes obstruction for traveling in a traveling direction on a basis of a two-dimensional point cloud corresponding to the traveling direction. Combining Takahashi and Hanaoka would thus “enhance accuracy of recognizing a surrounding environment by detecting a level difference reliably and accurately even when there is a blind spot. Moreover, in the autonomous mobile system of the present invention, it is possible to prevent falling into a level difference by detecting the level difference reliably.” (Hanaoka: Summary of the Invention – 26th-27th paragraphs)
Regarding Claim 2:
Takahashi in view of Hanaoka, as shown in the rejection above, discloses the limitations of claim 1. Takahashi further teaches:
The information processing apparatus according to claim 1, wherein the detection unit determines the traveling surface on a basis of, (“The calculation unit may calculate one or more determination planes that define the determination area. In this case, the one or more determination surfaces may include at least one of a first determination surface corresponding to the ground and a second determination surface corresponding to the ceiling surface. The calculation unit may change the height position of at least one of the first determination surface and the second determination surface according to a change in the height position of the sensor.” (Takahashi: Description))
[…] the plane being specified on a basis of the three-dimensional point cloud. (“The acquisition unit may acquire three-dimensional point cloud data related to the periphery of the moving body. In this case, the determination unit may determine the situation around the moving body by determining whether or not the point data included in the three-dimensional point cloud data is included in the determination region. The calculation unit may change the height position of the determination region according to a change in the height position of the sensor. The calculation unit may calculate one or more determination planes that define the determination area. In this case, the one or more determination surfaces may include at least one of a first determination surface corresponding to the ground and a second determination surface corresponding to the ceiling surface.” (Takahashi: Description) Takahashi further mentions “For example, a method of extracting a plane from a three-dimensional point group obtained from a distance sensor and recognizing an object other than the plane as an obstacle is conceivable. By extracting the floor / ceiling surface from only the sensor signal, the floor surface can be removed only from the signal without depending on the posture of the sensor with respect to the outside world. However, this method requires a complicated procedure for detecting the floor surface from the point cloud, and is difficult to apply to a small robot with a scarce computer resource.” (Takahashi: Description))
Takahashi does not teach, but Hanaoka teaches:
[…] a normal line-added point cloud added with normal line information of a normal line estimated for every point included in a plane, […], (See (Hanaoka: Description of Embodiments – 61st-70th and 72nd-79th paragraphs))
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Takahashi with the aforementioned teachings of Hanaoka in order to create an effective information processing apparatus, method, and program. Before the effective filing date of the claimed invention, one of ordinary skill in the art would have been motivated to incorporate Takahashi’s information processing device, information processing method, program, and mobile body with Hanaoka’s surrounding environment recognition device, method, and autonomous mobile system in order to determine the traveling surface on a basis of a normal line-added point cloud added with normal line information of a normal line estimated for every point included in a plane. Combining Takahashi and Hanaoka would thus “enhance accuracy of recognizing a surrounding environment by detecting a level difference reliably and accurately even when there is a blind spot. Moreover, in the autonomous mobile system of the present invention, it is possible to prevent falling into a level difference by detecting the level difference reliably.” (Hanaoka: Summary of the Invention – 26th-27th paragraphs)
Regarding Claim 3:
Takahashi in view of Hanaoka, as shown in the rejection above, discloses the limitations of claim 2. Takahashi further teaches:
The information processing apparatus according to claim 2, wherein the detection unit determines whether each of divided areas is one of the obstacle location and the traveling surface on a basis of, (“The obstacle determination unit 33 acquires three-dimensional point cloud data detected by the distance sensor 25 (step 103). The obstacle determination unit 33 determines an obstacle based on the calculated determination region D and the detected three-dimensional point cloud data (step 104). The obstacle determination unit 33 determines an obstacle based on the relationship between the three-dimensional point cloud data and the determination area D. Specifically, it is determined whether or not an obstacle exists by determining whether or not the point data included in the three-dimensional point cloud data is included in the determination region D. As shown in FIG. 6, for example, points included in the determination area D (D0) are determined as points constituting an obstacle. Then, an object including a point constituting the obstacle is determined as the obstacle 3. In the example shown in FIG. 6, the object 4 a existing on the floor surface 1 and the object 4 b installed on the ceiling surface 2 are both determined as the obstacle 3. That is, in this embodiment, the three-dimensional point cloud data obtained from the distance sensor 25 is filtered based on the first and second identification surfaces 35 and 26. And when at least one part is contained in the determination area | region D (D0), it determines with the obstruction 3. FIG. Note that only points included in the determination region D (D0) may be determined as the obstacle 3. When it is determined that the obstacle 3 exists, for example, an avoidance operation such as detour or stop, and output of a warning sound or a warning message are executed. 
These processes are executed, for example, by the cooperation of the action plan processing unit 15 and the action control processing unit 16 illustrated in FIG. Moreover, the situation analysis unit 133, the planning unit 134, and the operation control unit 135 illustrated in FIG. In the present embodiment, these blocks correspond to a drive control unit that controls the drive unit based on the determination result by the determination unit. In the present embodiment, the height position calculation unit 30 functions as an acquisition unit that acquires information about the height position of the sensor that can detect the peripheral information of the moving object. The posture calculation unit 31 functions as an acquisition unit that acquires information regarding the tilt of the sensor. The attitude of the sensor will be described later. The determination region calculation unit 32 functions as a calculation unit that calculates a determination region for determining the situation around the moving body based on the acquired information on the height position of the sensor. The obstacle determination unit 33 functions as a determination unit that determines the situation around the moving body based on the calculated determination region and the peripheral information detected by the sensor.” (Takahashi: Description) Takahashi further mentions “As a result, as shown in FIG. 10, the floor surface 1 is included in the determination region D ′ and is determined as an obstacle. In addition, the object 4b existing on the ceiling surface 2 deviates from the determination area D ′, and an obstacle is lost. As shown in FIG. 11, based on the inclination amount Δθ, the first and second identification surfaces 35 ″ and 36 ″ are corrected so that each is a horizontal plane, and the determination region D ″ is calculated. To do. Even in this case, the floor surface 1 is included in the determination region D ′ and is determined as an obstacle. 
In addition, the object 4b existing on the ceiling surface 2 deviates from the determination area D′, and an obstacle is missed. That is, erroneous detection due to the fluctuation amount Δt in the height direction remains. As shown in FIG. 12, the determination region D is calculated based on the height position H2 of the distance sensor 25 and the inclination of the distance sensor 25. That is, the height position and the inclination of the determination region D are changed according to the change in the height position and the inclination of the distance sensor 25. The inclination of the determination area D corresponds to the inclination of the position reference axis of the determination area D (the same axis as the detection axis L of the distance sensor 25). For example, with reference to the reference determination area D0 shown in FIG. 6, the determination area D is calculated so as to have the same height position as the height position of the reference determination area D0 and the same inclination as the inclination of the reference determination area D0. That is, the first identification surface 35 is calculated so as to coincide with the height position and inclination of the first identification surface 35 in the reference determination area D0. Similarly, the second identification surface 36 is calculated so as to coincide with the height position and inclination of the second identification surface 36 in the reference determination region D0. As a result, as shown in FIG. 12, the first identification surface 35 is set at a position higher than the floor surface 1 by the offset hf. The second identification surface 36 is set at a position lower than the ceiling surface 2 by the offset hc. The plane formula of the first and second identification surfaces 35 and 36 is different from the first and second identification surfaces 35 and 36 in the reference state. Thus, compared with the state shown in FIG. 
10, the first and second identification surfaces 35 and 36 are offset upward by the fluctuation amount Δt of the distance sensor 25 and rotated in the reverse rotation direction (−θ direction) by the inclination amount Δθ. Thereby, in the obstacle determination process in step 104, the objects 4a and 4b can be appropriately determined as the obstacle 3 as in the reference state shown in FIG. As a result, the influence of the swing of the distance sensor 25 in the height direction can be sufficiently prevented, and the floor surface 1, the ceiling surface 2, and the obstacle 3 can be identified with high accuracy. Further, since the height position and inclination of the determination region D (first and second identification surfaces 35 and 36) may be changed based on the height position of the distance sensor 25, a complicated algorithm is not required, and the amount of calculation is small. Thus, the obstacle 3 can be easily determined.” (Takahashi: Description, FIG. 4-10))
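The filtering described in the passages above (points falling between the first and second identification surfaces, after the determination region is shifted by the height fluctuation Δt and rotated by −Δθ) can be sketched as follows. This is an illustrative Python sketch only: the function and parameter names are assumptions, the tilt is modeled as a rotation about the y axis, and it is not code from Takahashi.

```python
import numpy as np

def points_in_determination_region(points, sensor_height, sensor_tilt,
                                   first_surface_z, second_surface_z):
    """Return points lying inside the determination region D (illustrative).

    points: (N, 3) array of (x, y, z) coordinates in the sensor frame.
    sensor_height / sensor_tilt: current height and tilt of the sensor.
    first_surface_z / second_surface_z: heights of the first identification
    surface (floor + hf) and second identification surface (ceiling - hc).
    """
    pts = np.asarray(points, dtype=float)
    # Compensate the sensor's tilt (rotation about the y axis by -tilt) so the
    # identification surfaces stay fixed relative to floor and ceiling, as in
    # the reference determination region D0.
    c, s = np.cos(-sensor_tilt), np.sin(-sensor_tilt)
    rot = np.array([[c, 0.0, s],
                    [0.0, 1.0, 0.0],
                    [-s, 0.0, c]])
    world = pts @ rot.T
    # Add the sensor's height so z becomes height above the floor.
    world[:, 2] += sensor_height

    # A point between the two identification surfaces lies inside D and is
    # treated as a point constituting an obstacle.
    mask = (world[:, 2] > first_surface_z) & (world[:, 2] < second_surface_z)
    return pts[mask]
```

Because only the region's height and inclination are updated from the sensor pose, the per-point test stays a pair of plane comparisons, consistent with the quoted remark that no complicated algorithm is required.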
[…] height information of points included in each of the divided areas, the divided areas being obtained by […], (“The specific values of the offsets hf and hc are not limited and may be set as appropriate. For example, by setting the offsets larger than the maximum error amount of the distance sensor 25, it is possible to improve the obstacle determination accuracy. When the height of an object to be determined as an obstacle is specified, the offset hf is set at a position lower than the specified height. When the height of an object that does not need to be determined as an obstacle is specified, the offset hf is set at a position higher than the specified height. For example, the offset hf may be set according to a height that cannot be overcome by the robot 20, a height that can be sufficiently overcome, and the like. Similarly, when the height of the object determined as an obstacle from the ceiling surface 2 is defined on the ceiling surface 2 side, the offset hc is set at a position higher than the defined height. When the height from the ceiling surface 2 of an object that does not need to be determined as an obstacle is specified, the offset hc is set at a position lower than the specified height. For example, the offset hc may be set according to the maximum height of the head 22 of the robot 20 during walking. Thus, typically, the heights of the first and second identification surfaces 35 and 36 are set based on the moving environment, the performance of the robot 20, and the like so as to be advantageous for autonomous movement. For example, the heights of the first and second identification surfaces 35 and 36 are set so that an object to be avoided in movement can be extracted as an obstacle. 
Of course, the first identification surface 35 may be set at a position substantially equal to the position of the floor surface 1, and the second identification surface 36 may be set at a position substantially equal to the ceiling surface 2 (both offsets hf and hc are substantially zero).” (Takahashi: Description) Takahashi further mentions “As a result, as shown in FIG. 7, the first identification surface 35′ becomes lower than the floor surface 1, and the floor surface 1 is included in the determination region D′. As a result, the floor surface 1 is determined as the obstacle 3. In addition, an object that can be traversed and should be ignored may be determined as the obstacle 3. In addition, the second identification surface 36′ is significantly lower than the ceiling surface 2, and the object 4b existing on the ceiling surface 2 deviates from the determination region D′. As a result, the object 4b that should be determined as an obstacle cannot be seen, and the obstacle is missed. As a result, for example, an avoidance operation such as sudden braking is inadvertently performed, and proper autonomous movement is hindered. Further, the head 22 may collide with an object and the robot 20 may be damaged. As shown in FIG. 8, in the obstacle determination process according to the present embodiment, in step 102, the determination region D is calculated based on the height position H1 of the distance sensor 25. Specifically, the height position of the determination region D is changed according to the change in the height position of the distance sensor 25. In this embodiment, the determination area D is calculated so that the height position is the same as the height position of the reference determination area D0 with reference to the reference determination area D0 shown in FIG. 
That is, the first identification surface 35 is calculated so as to have the same height position as the height position of the first identification surface 35 in the reference determination region D0. Similarly, the second identification surface 36 is calculated so as to have the same height position as the height position of the second identification surface 36 in the reference determination region D0. Therefore, as shown in FIG. 8, the first identification surface 35 is set at a position higher than the floor surface 1 by the offset hf. The second identification surface 36 is set at a position lower than the ceiling surface 2 by the offset hc. The plane expression of the first and second identification surfaces 35 and 36 is different from the first and second identification surfaces 35 and 36 in the reference state. Thus, compared with the state shown in FIG. 7, the first and second identification surfaces 35 and 36 are offset upward along the vertical direction by the amount of variation Δt of the distance sensor 25. Thereby, in the obstacle determination process in step 104, the objects 4a and 4b can be appropriately determined as the obstacle 3 as in the reference state shown in FIG. As a result, the influence of the swing of the distance sensor 25 in the height direction is sufficiently prevented, and an object that does not obstruct the movement, such as the floor surface 1 and the ceiling surface 2, and an object that obstructs the movement (obstacle 3) can be identified. Further, since the height position of the determination region D (first and second identification surfaces 35 and 36) may be changed based on the height position of the distance sensor 25, a complicated algorithm is not required, and the calculation amount is small. It is possible to easily determine the obstacle 3.” (Takahashi: Description, FIG. 4-10))
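The quoted guidance on choosing the offsets hf and hc can be condensed into a small helper. The quotation gives constraints, not formulas, so the concrete choices below (margin above sensor error, head clearance from the ceiling) are assumptions for illustration only:

```python
def choose_offsets(max_sensor_error, surmountable_step,
                   head_height, ceiling_height):
    """Pick offsets hf and hc per the quoted constraints (hypothetical helper).

    max_sensor_error: worst-case height error of the distance sensor.
    surmountable_step: largest step the robot can ride over.
    head_height: maximum height of the robot's head during walking.
    ceiling_height: height of the ceiling surface above the floor.
    """
    # hf: above the sensor's worst-case error (so the floor itself is never
    # flagged as an obstacle) but no higher than a step the robot can surmount.
    hf = min(max_sensor_error + 0.01, surmountable_step)
    # hc: place the second identification surface at the head's sweep height,
    # i.e. hc below the ceiling, so anything the head could hit is an obstacle.
    hc = ceiling_height - head_height
    return hf, hc
```

Either offset may also be driven to substantially zero, matching the passage that allows the identification surfaces to coincide with the floor and ceiling.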
Takahashi does not teach but Hanaoka teaches:
[…] dividing a point cloud obtained by integrating the normal line-added point cloud and the two-dimensional point cloud by a grid having a predetermined size., (See (Hanaoka: Description of Embodiments – 57th-69th and 133rd-135th paragraphs and Other Embodiment – 155th-169th paragraphs))
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Takahashi with the aforementioned teachings from Hanaoka in order to create an effective information processing apparatus, method, and program. Before the effective filing date of the claimed invention, one of ordinary skill in the art would have been motivated to incorporate Takahashi’s information processing device, information processing method, program, and mobile body with Hanaoka’s surrounding environment recognition device, method, and autonomous mobile system in order to detect an obstacle location that becomes an obstruction for traveling in a traveling direction on a basis of a two-dimensional point cloud corresponding to the traveling direction. Combining Takahashi and Hanaoka would thus “enhance accuracy of recognizing a surrounding environment by detecting a level difference reliably and accurately even when there is a blind spot. Moreover, in the autonomous mobile system of the present invention, it is possible to prevent falling into a level difference by detecting the level difference reliably.” (Hanaoka: Summary of the Invention – 26th-27th paragraphs)
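The grid division Hanaoka is cited for (dividing the integrated point cloud by a grid of predetermined size) can be sketched as follows. This is an illustrative Python sketch with assumed names and data layout, not Hanaoka's implementation:

```python
import numpy as np
from collections import defaultdict

def divide_by_grid(points, cell_size):
    """Bin an integrated point cloud into square x-y grid cells (illustrative).

    points: (N, 3) iterable of (x, y, z) points obtained by integrating the
    normal-line-added point cloud with the two-dimensional point cloud.
    cell_size: predetermined edge length of one divided area.
    """
    areas = defaultdict(list)
    for x, y, z in np.asarray(points, dtype=float):
        # Index of the cell containing (x, y) in the ground plane.
        key = (int(np.floor(x / cell_size)), int(np.floor(y / cell_size)))
        areas[key].append((x, y, z))
    # Each value holds the points of one divided area.
    return {k: np.array(v) for k, v in areas.items()}
```

Each resulting divided area can then be classified as obstacle location or traveling surface from the height information of its points, as recited in claim 3.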
Regarding Claim 4:
Takahashi in view of Hanaoka, as shown in the rejection above, discloses the limitations of claim 3. Takahashi further teaches:
The information processing apparatus according to claim 3, wherein, in a case where a target area that is a divided area to be a target among the divided areas, (see the passages of Takahashi quoted above in the rejection of claim 3) (Takahashi: Description, FIG. 4-10))
[…] the detection unit determines whether the target area is one of the obstacle location and the traveling surface on a basis of height information of points included in the target area and a divided area having the normal line information among the divided areas around the target area., (see the passages of Takahashi quoted above in the rejection of claim 3) (Takahashi: Description, FIG. 4-10))
Takahashi does not teach but Hanaoka teaches:
[…] does not include the normal line information, […], (See (Hanaoka: Description of Embodiments – 61st-70th and 72nd-79th paragraphs))
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Takahashi with the aforementioned teachings from Hanaoka in order to create an effective information processing apparatus, method, and program. Before the effective filing date of the claimed invention, one of ordinary skill in the art would have been motivated to incorporate Takahashi’s information processing device, information processing method, program, and mobile body with Hanaoka’s surrounding environment recognition device, method, and autonomous mobile system in order to detect an obstacle location that becomes an obstruction for traveling in a traveling direction on a basis of a two-dimensional point cloud corresponding to the traveling direction. Combining Takahashi and Hanaoka would thus “enhance accuracy of recognizing a surrounding environment by detecting a level difference reliably and accurately even when there is a blind spot. Moreover, in the autonomous mobile system of the present invention, it is possible to prevent falling into a level difference by detecting the level difference reliably.” (Hanaoka: Summary of the Invention – 26th-27th paragraphs)
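The logic mapped to claims 4 and 5 (classifying a target divided area that lacks normal line information by consulting its surrounding divided areas) can be sketched as below. The data layout, helper name, and height tolerance are assumptions for illustration, not code from either reference:

```python
def classify_target_area(target_heights, neighbor_areas, tol=0.05):
    """Classify a divided area lacking normal line information (illustrative).

    target_heights: heights of the points in the target divided area.
    neighbor_areas: surrounding divided areas, each a dict such as
    {"has_normal": True, "surface_height": 0.0}.
    Returns "traveling_surface" or "obstacle".
    """
    with_normal = [a for a in neighbor_areas if a["has_normal"]]
    # Claim 5 case: none of the surrounding divided areas includes normal
    # line information, so the target area is determined to include the
    # obstacle location.
    if not with_normal:
        return "obstacle"
    # Claim 4 case: compare the target's point heights against a surrounding
    # divided area that does carry normal line information (a known surface).
    ref = with_normal[0]["surface_height"]
    if all(abs(h - ref) <= tol for h in target_heights):
        return "traveling_surface"
    return "obstacle"
```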
Regarding Claim 5:
Takahashi in view of Hanaoka, as shown in the rejection above, discloses the limitations of claim 3. Takahashi further teaches:
[…] the detection unit determines that the target area includes the obstacle location., (see the passages of Takahashi quoted above in the rejection of claim 3) (Takahashi: Description, FIG. 4-10))
Takahashi does not teach but Hanaoka teaches:
The information processing apparatus according to claim 3, wherein, in a case where all of the divided areas around a target area that is a divided area to be a target among the divided areas do not include the normal line information,, (See (Hanaoka: Description of Embodiments – 61st-70th and 72nd-79th paragraphs))
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Takahashi with the aforementioned teachings from Hanaoka in order to create an effective information processing apparatus, method, and program. Before the effective filing date of the claimed invention, one of ordinary skill in the art would have been motivated to incorporate Takahashi’s information processing device, information processing method, program, and mobile body with Hanaoka’s surrounding environment recognition device, method, and autonomous mobile system in order to detect an obstacle location that becomes an obstruction for traveling in a traveling direction on a basis of a two-dimensional point cloud corresponding to the traveling direction. Combining Takahashi and Hanaoka would thus “enhance accuracy of recognizing a surrounding environment by detecting a level difference reliably and accurately even when there is a blind spot. Moreover, in the autonomous mobile system of the present invention, it is possible to prevent falling into a level difference by detecting the level difference reliably.” (Hanaoka: Summary of the Invention – 26th-27th paragraphs)
Regarding Claim 6:
Takahashi in view of Hanaoka, as shown in the rejection above, discloses the limitations of claim 2. Takahashi further teaches:
The information processing apparatus according to claim 2, wherein the detection unit determines the traveling surface further on a basis of, (“The calculation unit may calculate one or more determination planes that define the determination area. In this case, the one or more determination surfaces may include at least one of a first determination surface corresponding to the ground and a second determination surface corresponding to the ceiling surface. The calculation unit may change the height position of at least one of the first determination surface and the second determination surface according to a change in the height position of the sensor.” (Takahashi: Description))
Takahashi does not teach but Hanaoka teaches:
[…] a component in a gravity direction of the normal line., (See (Hanaoka: Description of Embodiments – 61st-64th, 125th-127th, and 180th paragraphs))
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Takahashi with the aforementioned teachings from Hanaoka in order to create an effective information processing apparatus, method, and program. One of ordinary skill in the art would have been motivated to incorporate Takahashi’s information processing device, information processing method, program, and mobile body with Hanaoka’s surrounding environment recognition device, method, and autonomous mobile system in order to detect an obstacle location that becomes obstruction for traveling in a traveling direction on a basis of a two-dimensional point cloud corresponding to the traveling direction. Combining Takahashi and Hanaoka would thus “enhance accuracy of recognizing a surrounding environment by detecting a level difference reliably and accurately even when there is a blind spot. Moreover, in the autonomous mobile system of the present invention, it is possible to prevent falling into a level difference by detecting the level difference reliably.” (Hanaoka: Summary of the Invention – 26th-27th paragraphs)
Regarding Claim 8:
Takahashi in view of Hanaoka, as shown in the rejection above, discloses the limitations of claim 1. Takahashi further teaches:
The information processing apparatus according to claim 1, wherein the three-dimensional point cloud is a point cloud obtained from one or more three-dimensional distance measurement sensors,, (“The acquisition unit may acquire three-dimensional point cloud data related to the periphery of the moving body. In this case, the determination unit may determine the situation around the moving body by determining whether or not the point data included in the three-dimensional point cloud data is included in the determination region.” (Takahashi: Description) Takahashi further mentions “A distance sensor 25 (not shown in FIG. 3) is disposed on the front side of the head 22 of the robot 20, and it is possible to measure the distance to an object existing in the measurement range M extending on the front side. ing. In this embodiment, LiDAR is used as the distance sensor 25, and the distance to the object existing in the measurement range M is acquired as three-dimensional point cloud data. The shape within the measurement range M can be determined from the three-dimensional point cloud data.” (Takahashi: Description))
Takahashi does not teach but Hanaoka teaches:
[…] and the two-dimensional point cloud is a point cloud obtained from one or more two-dimensional distance measurement sensors., (See (Hanaoka: Summary of the Invention – 11th-25th paragraphs, Description of Embodiments – 62nd paragraph, and Other Embodiment – 155th-160th paragraphs))
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Takahashi with the aforementioned teachings from Hanaoka in order to create an effective information processing apparatus, method, and program. One of ordinary skill in the art would have been motivated to incorporate Takahashi’s information processing device, information processing method, program, and mobile body with Hanaoka’s surrounding environment recognition device, method, and autonomous mobile system in order to detect an obstacle location that becomes obstruction for traveling in a traveling direction on a basis of a two-dimensional point cloud corresponding to the traveling direction. Combining Takahashi and Hanaoka would thus “enhance accuracy of recognizing a surrounding environment by detecting a level difference reliably and accurately even when there is a blind spot. Moreover, in the autonomous mobile system of the present invention, it is possible to prevent falling into a level difference by detecting the level difference reliably.” (Hanaoka: Summary of the Invention – 26th-27th paragraphs)
Regarding Claim 9:
Takahashi in view of Hanaoka, as shown in the rejection above, discloses the limitations of claim 8. Takahashi further teaches:
The information processing apparatus according to claim 8, wherein the three-dimensional point cloud is a point cloud generated on a basis of, (“The acquisition unit may acquire three-dimensional point cloud data related to the periphery of the moving body. In this case, the determination unit may determine the situation around the moving body by determining whether or not the point data included in the three-dimensional point cloud data is included in the determination region.” (Takahashi: Description) Takahashi further mentions “The obstacle determination unit 33 acquires three-dimensional point cloud data detected by the distance sensor 25 (step 103). The obstacle determination unit 33 determines an obstacle based on the calculated determination region D and the detected three-dimensional point cloud data (step 104). The obstacle determination unit 33 determines an obstacle based on the relationship between the three-dimensional point cloud data and the determination area D. Specifically, it is determined whether or not an obstacle exists by determining whether or not the point data included in the three-dimensional point cloud data is included in the determination region D. As shown in FIG. 6, for example, points included in the determination area D (D0) are determined as points constituting an obstacle. Then, an object including a point constituting the obstacle is determined as the obstacle 3. In the example shown in FIG. 6, the object 4 a existing on the floor surface 1 and the object 4 b installed on the ceiling surface 2 are both determined as the obstacle 3. That is, in this embodiment, the three-dimensional point cloud data obtained from the distance sensor 25 is filtered based on the first and second identification surfaces 35 and 26. And when at least one part is contained in the determination area | region D (D0), it determines with the obstruction 3. FIG. 
Note that only points included in the determination region D (D0) may be determined as the obstacle 3.” (Takahashi: Description))
[…] a signal acquired by each of reception elements arranged in a predetermined array and included in the three-dimensional distance measurement sensor., (“A distance sensor 25 (not shown in FIG. 3) is disposed on the front side of the head 22 of the robot 20, and it is possible to measure the distance to an object existing in the measurement range M extending on the front side. ing. In this embodiment, LiDAR is used as the distance sensor 25, and the distance to the object existing in the measurement range M is acquired as three-dimensional point cloud data. The shape within the measurement range M can be determined from the three-dimensional point cloud data.” (Takahashi: Description) Takahashi further mentions “The output control unit 105 controls the output of various information to the passenger of the moving body or the outside of the moving body. For example, the output control unit 105 generates an output signal including at least one of visual information (for example, image data) and auditory information (for example, audio data), and supplies the output signal to the output unit 106, thereby outputting the output unit The output of visual information and auditory information from 106 is controlled. Specifically, for example, the output control unit 105 synthesizes image data captured by different imaging devices of the data acquisition unit 102 to generate an overhead image or a panoramic image, and outputs an output signal including the generated image. This is supplied to the output unit 106. Further, for example, the output control unit 105 generates sound data including a warning sound or a warning message for a danger such as a collision, contact, or entry into a dangerous zone, and outputs an output signal including the generated sound data to the output unit 106. Supply. The output unit 106 includes a device that can output visual information or auditory information to a passenger of the moving body or the outside of the moving body. 
For example, the output unit 106 includes a display device, an instrument panel, an audio speaker, headphones, a wearable device such as a glasses-type display worn by a passenger, a projector, a lamp, and the like. In addition to a device having a normal display, the display device included in the output unit 106 may display visual information in the driver's field of view, such as a head-up display, a transmissive display, and a device having an AR (Augmented Reality) display function. It may be a display device. Note that the output control unit 105 and the output unit 106 are not essential components for the autonomous movement process, and may be omitted as necessary. The drive system control unit 107 controls the drive system 108 by generating various control signals and supplying them to the drive system 108. Further, the drive system control unit 107 supplies a control signal to each unit other than the drive system 108 as necessary, and notifies the control state of the drive system 108 and the like.” (Takahashi: Description))
Regarding Claim 10:
Takahashi in view of Hanaoka, as shown in the rejection above, discloses the limitations of claim 8. Takahashi does not teach but Hanaoka teaches:
The information processing apparatus according to claim 8, wherein the two-dimensional point cloud is a point cloud generated by the two-dimensional distance measurement sensor on a basis of, (See (Hanaoka: Summary of the Invention – 11th-25th paragraphs, Description of Embodiments – 62nd paragraph, and Other Embodiment – 155th-160th paragraphs))
[…] a plane having a predetermined height with respect to the traveling surface as a starting point and being substantially parallel or having a predetermined angle with respect to the traveling surface., (See (Hanaoka: Description of Embodiments – 61st-78th, 128th-133rd, and 139th-145th paragraphs))
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Takahashi with the aforementioned teachings from Hanaoka in order to create an effective information processing apparatus, method, and program. One of ordinary skill in the art would have been motivated to incorporate Takahashi’s information processing device, information processing method, program, and mobile body with Hanaoka’s surrounding environment recognition device, method, and autonomous mobile system in order to detect an obstacle location that becomes obstruction for traveling in a traveling direction on a basis of a two-dimensional point cloud corresponding to the traveling direction. Combining Takahashi and Hanaoka would thus “enhance accuracy of recognizing a surrounding environment by detecting a level difference reliably and accurately even when there is a blind spot. Moreover, in the autonomous mobile system of the present invention, it is possible to prevent falling into a level difference by detecting the level difference reliably.” (Hanaoka: Summary of the Invention – 26th-27th paragraphs)
Regarding Claim 11:
Takahashi in view of Hanaoka, as shown in the rejection above, discloses the limitations of claim 1. Takahashi further teaches:
The information processing apparatus according to claim 1, further comprising a map generation unit that generates a map based on two-dimensional information on a basis of, (“The self-position estimation unit 132 is based on data or signals from each part of the mobile control system 100 such as the mobile external information detection unit 141 and the situation recognition unit 152 of the situation analysis unit 133. Etc. are estimated. In addition, the self-position estimation unit 132 generates a local map (hereinafter referred to as a self-position estimation map) used for self-position estimation as necessary. The self-position estimation map is, for example, a highly accurate map using a technique such as SLAM (SimultaneousultLocalization and Mapping). The self-position estimation unit 132 supplies data indicating the result of the estimation process to the map analysis unit 151 of the situation analysis unit 133, the situation recognition unit 152, and the like. In addition, the self-position estimating unit 132 stores the self-position estimating map in the storage unit 109.” (Takahashi: Description) Takahashi further mentions “The map analysis unit 151 uses various data or signals from each part of the mobile control system 100 such as the self-position estimation unit 132 and the mobile external information detection unit 141 as necessary, and stores various data stored in the storage unit 109. The map is analyzed, and a map including information necessary for the autonomous movement process is constructed. The map analysis unit 151 supplies the constructed map to the situation recognition unit 152, the situation prediction unit 153, the route plan unit 161, the action plan unit 162, the action plan unit 163, and the like of the plan unit 134. 
The situation recognition unit 152 is a part of the mobile body control system 100 such as a self-position estimation unit 132, a mobile body external information detection unit 141, a mobile body internal information detection unit 142, a mobile body state detection unit 143, and a map analysis unit 151. Based on the data or signal from, the recognition process of the situation regarding a moving body is performed. For example, the situation recognition unit 152 performs recognition processing such as the situation of the moving body, the situation around the moving body, and the situation of the driver of the moving body. In addition, the situation recognition unit 152 generates a local map (hereinafter referred to as a situation recognition map) used for recognition of the situation around the moving object, as necessary. The situation recognition map is, for example, an occupation grid map (OccupancyccMap Map), a road map (Lane Map), or a point cloud map (Point Cloud Map).” (Takahashi: Description))
[…] information indicating the obstacle location detected by the detection unit., (“The obstacle determination unit 33 acquires three-dimensional point cloud data detected by the distance sensor 25 (step 103). The obstacle determination unit 33 determines an obstacle based on the calculated determination region D and the detected three-dimensional point cloud data (step 104). The obstacle determination unit 33 determines an obstacle based on the relationship between the three-dimensional point cloud data and the determination area D. Specifically, it is determined whether or not an obstacle exists by determining whether or not the point data included in the three-dimensional point cloud data is included in the determination region D.” (Takahashi: Description))
Regarding Claim 12:
Takahashi in view of Hanaoka, as shown in the rejection above, discloses the limitations of claim 11. Takahashi further teaches:
The information processing apparatus according to claim 11, further comprising a travel control unit that controls traveling of a mobile body on a basis of, (“A moving body according to an embodiment of the present technology includes a drive unit, a sensor, a detection unit, a calculation unit, and a drive control unit. The sensor can detect surrounding information. The said detection part detects the information regarding the height position of the said sensor. The calculation unit calculates a determination region for determining a surrounding situation based on the detected information on the height position of the sensor. The determination unit determines a surrounding situation based on the calculated determination region and the peripheral information detected by the sensor. The drive control unit controls the drive unit based on a determination result by the determination unit.” (Takahashi: Description) Takahashi further mentions “FIG. 14 is an external view showing a configuration example of a vehicle equipped with an automatic driving control unit according to an embodiment of the present technology. FIG. 14A is a perspective view illustrating a configuration example of the vehicle 290, and FIG. 1B is a schematic view when the vehicle 290 is viewed from above. The vehicle 290 has an automatic driving function capable of automatic traveling (autonomous movement) to a destination. The vehicle 290 is an example of a moving body.” (Takahashi: Description))
[…] the map generated by the map generation unit., (“The self-position estimation unit 132 is based on data or signals from each part of the mobile control system 100 such as the mobile external information detection unit 141 and the situation recognition unit 152 of the situation analysis unit 133. Etc. are estimated. In addition, the self-position estimation unit 132 generates a local map (hereinafter referred to as a self-position estimation map) used for self-position estimation as necessary. The self-position estimation map is, for example, a highly accurate map using a technique such as SLAM (SimultaneousultLocalization and Mapping). The self-position estimation unit 132 supplies data indicating the result of the estimation process to the map analysis unit 151 of the situation analysis unit 133, the situation recognition unit 152, and the like. In addition, the self-position estimating unit 132 stores the self-position estimating map in the storage unit 109.” (Takahashi: Description) Takahashi further mentions “The map analysis unit 151 uses various data or signals from each part of the mobile control system 100 such as the self-position estimation unit 132 and the mobile external information detection unit 141 as necessary, and stores various data stored in the storage unit 109. The map is analyzed, and a map including information necessary for the autonomous movement process is constructed. The map analysis unit 151 supplies the constructed map to the situation recognition unit 152, the situation prediction unit 153, the route plan unit 161, the action plan unit 162, the action plan unit 163, and the like of the plan unit 134. The situation recognition unit 152 is a part of the mobile body control system 100 such as a self-position estimation unit 132, a mobile body external information detection unit 141, a mobile body internal information detection unit 142, a mobile body state detection unit 143, and a map analysis unit 151. 
Based on the data or signal from, the recognition process of the situation regarding a moving body is performed. For example, the situation recognition unit 152 performs recognition processing such as the situation of the moving body, the situation around the moving body, and the situation of the driver of the moving body. In addition, the situation recognition unit 152 generates a local map (hereinafter referred to as a situation recognition map) used for recognition of the situation around the moving object, as necessary. The situation recognition map is, for example, an occupation grid map (OccupancyccMap Map), a road map (Lane Map), or a point cloud map (Point Cloud Map).” (Takahashi: Description))
Regarding Claim 13:
Takahashi teaches:
An information processing method executed by a processor, the method comprising a detection step of detecting an obstacle location that becomes obstruction for traveling in a traveling direction on a basis of a three-dimensional point cloud determined to be a traveling surface, (“In order to achieve the above object, an information processing apparatus according to an embodiment of the present technology includes an acquisition unit, a calculation unit, and a determination unit. The acquisition unit acquires information related to a height position of a sensor capable of detecting peripheral information of the moving body. The calculation unit calculates a determination region for determining a situation around the moving body based on the acquired information on the height position of the sensor. The determination unit determines a state of the periphery of the moving body based on the calculated determination area and the peripheral information detected by the sensor. In this information processing apparatus, a determination region for determining the situation around the moving body is calculated based on information on the height position of the sensor. This makes it possible to easily and accurately determine the situation around the moving body. The calculation unit may determine whether there is an obstacle around the moving body. The acquisition unit may acquire shape data related to the periphery of the moving body. In this case, the determination unit may determine a situation around the moving body based on the relationship between the detected shape data and the determination region. The acquisition unit may acquire three-dimensional point cloud data related to the periphery of the moving body. In this case, the determination unit may determine the situation around the moving body by determining whether or not the point data included in the three-dimensional point cloud data is included in the determination region. 
The calculation unit may change the height position of the determination region according to a change in the height position of the sensor. The calculation unit may calculate one or more determination planes that define the determination area. In this case, the one or more determination surfaces may include at least one of a first determination surface corresponding to the ground and a second determination surface corresponding to the ceiling surface.” (Takahashi: Description) Takahashi further mentions “An information processing method according to an embodiment of the present technology is an information processing method executed by a computer system, and includes acquiring information related to a height position of a sensor capable of detecting peripheral information of a moving object. Based on the acquired information on the height position of the sensor, a determination region for determining a situation around the moving body is calculated. Based on the calculated determination area and the surrounding information detected by the sensor, a situation around the moving body is determined.” (Takahashi: Description))
Takahashi does not teach but Hanaoka teaches:
and a two-dimensional point cloud corresponding to the traveling direction., (See (Hanaoka: Background Art – 4th-5th paragraphs, Description of Embodiments – 47th, 59th, and 62nd paragraphs, and Other Embodiment – 155th paragraph))
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Takahashi with the aforementioned teachings from Hanaoka in order to create an effective information processing apparatus, method, and program. One of ordinary skill in the art would have been motivated to incorporate Takahashi’s information processing device, information processing method, program, and mobile body with Hanaoka’s surrounding environment recognition device, method, and autonomous mobile system in order to detect an obstacle location that becomes obstruction for traveling in a traveling direction on a basis of a two-dimensional point cloud corresponding to the traveling direction. Combining Takahashi and Hanaoka would thus “enhance accuracy of recognizing a surrounding environment by detecting a level difference reliably and accurately even when there is a blind spot. Moreover, in the autonomous mobile system of the present invention, it is possible to prevent falling into a level difference by detecting the level difference reliably.” (Hanaoka: Summary of the Invention – 26th-27th paragraphs)
Regarding Claim 14:
Takahashi teaches:
An information processing program causing a computer to execute a detection step of detecting an obstacle location that becomes obstruction for traveling in a traveling direction on a basis of a three-dimensional point cloud determined to be a traveling surface, (“In order to achieve the above object, an information processing apparatus according to an embodiment of the present technology includes an acquisition unit, a calculation unit, and a determination unit. The acquisition unit acquires information related to a height position of a sensor capable of detecting peripheral information of the moving body. The calculation unit calculates a determination region for determining a situation around the moving body based on the acquired information on the height position of the sensor. The determination unit determines a state of the periphery of the moving body based on the calculated determination area and the peripheral information detected by the sensor. In this information processing apparatus, a determination region for determining the situation around the moving body is calculated based on information on the height position of the sensor. This makes it possible to easily and accurately determine the situation around the moving body. The calculation unit may determine whether there is an obstacle around the moving body. The acquisition unit may acquire shape data related to the periphery of the moving body. In this case, the determination unit may determine a situation around the moving body based on the relationship between the detected shape data and the determination region. The acquisition unit may acquire three-dimensional point cloud data related to the periphery of the moving body. In this case, the determination unit may determine the situation around the moving body by determining whether or not the point data included in the three-dimensional point cloud data is included in the determination region. 
The calculation unit may change the height position of the determination region according to a change in the height position of the sensor. The calculation unit may calculate one or more determination planes that define the determination area. In this case, the one or more determination surfaces may include at least one of a first determination surface corresponding to the ground and a second determination surface corresponding to the ceiling surface.” (Takahashi: Description) Takahashi further mentions “An information processing method according to an embodiment of the present technology is an information processing method executed by a computer system, and includes acquiring information related to a height position of a sensor capable of detecting peripheral information of a moving object. Based on the acquired information on the height position of the sensor, a determination region for determining a situation around the moving body is calculated. Based on the calculated determination area and the surrounding information detected by the sensor, a situation around the moving body is determined.” (Takahashi: Description))
Takahashi does not teach but Hanaoka teaches:
and a two-dimensional point cloud corresponding to the traveling direction., (See (Hanaoka: Background Art – 4th-5th paragraphs, Description of Embodiments – 47th, 59th, and 62nd paragraphs, and Other Embodiment – 155th paragraph))
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Takahashi with the aforementioned teachings from Hanaoka in order to create an effective information processing apparatus, method, and program. One of ordinary skill in the art would have been motivated to incorporate Takahashi’s information processing device, information processing method, program, and mobile body with Hanaoka’s surrounding environment recognition device, method, and autonomous mobile system in order to detect an obstacle location that becomes obstruction for traveling in a traveling direction on a basis of a two-dimensional point cloud corresponding to the traveling direction. Combining Takahashi and Hanaoka would thus “enhance accuracy of recognizing a surrounding environment by detecting a level difference reliably and accurately even when there is a blind spot. Moreover, in the autonomous mobile system of the present invention, it is possible to prevent falling into a level difference by detecting the level difference reliably.” (Hanaoka: Summary of the Invention – 26th-27th paragraphs)
Claim 7 is rejected under 35 U.S.C. 103 as being unpatentable over Takahashi (WO 2019176278 A1) in view of Hanaoka (U.S. Pub. No. 2015/0362921 A1) and further in view of Sabe (U.S. Pub. No. 2005/0131581 A1).
Regarding Claim 7:
Takahashi in view of Hanaoka, as shown in the rejection above, discloses the limitations of claim 2. Takahashi further teaches:
The information processing apparatus according to claim 2, wherein the detection unit specifies the plane on a basis of, (“The acquisition unit may acquire three-dimensional point cloud data related to the periphery of the moving body. In this case, the determination unit may determine the situation around the moving body by determining whether or not the point data included in the three-dimensional point cloud data is included in the determination region. The calculation unit may change the height position of the determination region according to a change in the height position of the sensor. The calculation unit may calculate one or more determination planes that define the determination area. In this case, the one or more determination surfaces may include at least one of a first determination surface corresponding to the ground and a second determination surface corresponding to the ceiling surface.” (Takahashi: Description) Takahashi further mentions “For example, a method of extracting a plane from a three-dimensional point group obtained from a distance sensor and recognizing an object other than the plane as an obstacle is conceivable. By extracting the floor / ceiling surface from only the sensor signal, the floor surface can be removed only from the signal without depending on the posture of the sensor with respect to the outside world. However, this method requires a complicated procedure for detecting the floor surface from the point cloud, and is difficult to apply to a small robot with a scarce computer resource.” (Takahashi: Description))
Takahashi does not teach but Sabe teaches:
[…] a first principal component and a second principal component acquired by principal component analysis on the three-dimensional point cloud., (See (Sabe: Detailed Description of the Preferred Embodiments – 89th paragraph))
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Takahashi in view of Hanaoka with the aforementioned teachings from Sabe in order to create an effective information processing apparatus, method, and program. One of ordinary skill in the art would have been motivated to incorporate Takahashi’s information processing device, information processing method, program, and mobile body with Sabe’s environment recognizing device and method, route planning device and method, and robot in order to specify a plane on a basis of a first principal component and a second principal component acquired by principal component analysis on a three-dimensional point cloud. Combining Takahashi and Sabe would thus provide “an environment recognizing device and an environment recognizing method that can draw an environment map for judging if it is possible to move a region where one or more than one steps are found above or below a floor, a route planning device and a route planning method that can appropriately plan a moving route, using such an environment map and a robot equipped with such an environment recognizing device and a route planning device.” (Sabe: Summary of the Invention – 11th paragraph)
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to Jeffrey Chalhoub whose telephone number is (571) 272-9754. The examiner can normally be reached Mon-Fri 8:30-5:30.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Vivek Koppikar can be reached on (571) 272-5109. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/J.R.C./Examiner, Art Unit 3667
/VIVEK D KOPPIKAR/Supervisory Patent Examiner
Art Unit 3667
December 22, 2025