DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Priority
Acknowledgment is made of applicant’s claim for foreign priority under 35 U.S.C. 119(a)-(d). Receipt is acknowledged of certified copies of papers required by 37 CFR 1.55.
Information Disclosure Statement
The information disclosure statement (IDS) filed 06/27/2025 has been received and considered by the examiner. The submission is in compliance with the provisions of 37 CFR 1.97.
The information disclosure statement (IDS) filed 11/22/2024 has been received and considered by the examiner. The submission is in compliance with the provisions of 37 CFR 1.97.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1-20 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.
In January 2019 (updated October 2019), the USPTO released new examination guidelines setting forth a two-step inquiry for determining whether a claim is directed to non-statutory subject matter. According to the guidelines, a claim is directed to non-statutory subject matter if:
STEP 1: the claim does not fall within one of the four statutory categories of invention (process, machine, manufacture or composition of matter), or
STEP 2: the claim recites a judicial exception, e.g., an abstract idea, without reciting additional elements that amount to significantly more than the judicial exception, as determined using the following analysis:
STEP 2A (PRONG 1): Does the claim recite an abstract idea, law of nature, or natural phenomenon?
STEP 2A (PRONG 2): Does the claim recite additional elements that integrate the judicial exception into a practical application?
STEP 2B: Does the claim recite additional elements that amount to significantly more than the judicial exception?
Using the two-step inquiry, it is clear that claims 1 and 11 are directed toward non-statutory subject matter, as shown below:
STEP 1: Do claims 1 and 11 fall within one of the statutory categories? Yes. The claims are directed toward an apparatus and a method including at least one step.
STEP 2A (PRONG 1): Is the claim directed to a law of nature, a natural phenomenon or an abstract idea? Yes, the claims are directed to an abstract idea.
With regard to STEP 2A (PRONG 1), the guidelines provide three groupings of subject matter that are considered abstract ideas:
Mathematical concepts – mathematical relationships, mathematical formulas or equations, mathematical calculations;
Certain methods of organizing human activity – fundamental economic principles or practices (including hedging, insurance, mitigating risk); commercial or legal interactions (including agreements in the form of contracts; legal obligations; advertising, marketing or sales activities or behaviors; business relations); managing personal behavior or relationships or interactions between people (including social activities, teaching, and following rules or instructions); and
Mental processes – concepts that are practicably performed in the human mind (including an observation, evaluation, judgment, opinion).
Claim 1. An apparatus for controlling a mobility apparatus, the apparatus comprising:
an imaging device on the mobility apparatus;
at least one processor communicably coupled to the imaging device;
and a storage medium storing computer-readable instructions that, when executed by the at least one processor, enable the at least one processor to:
detect a reference object from an image obtained by the imaging device in response to certain non-image-based position information of the mobility apparatus being unavailable,
obtain image-based position information of the mobility apparatus based on a relative position of the mobility apparatus with respect to the reference object,
and determine whether the mobility apparatus departs from a course based on the image-based position information of the mobility apparatus.
Claim 1, specifically the limitations emphasized above, recites a mental process that can be practicably performed in the human mind and is, therefore, directed to an abstract idea. The claim merely consists of detecting a reference object and determining whether the mobility apparatus departs from a course. This is equivalent to a person mentally viewing the environment, identifying a reference object, and deciding whether the mobility apparatus has departed from a course.
Claim 11. A method for controlling a mobility apparatus, the method comprising:
detecting a reference object from an image obtained by an imaging device of the mobility apparatus, in response to certain non-image-based position information of the mobility apparatus being unavailable;
determining a relative position of the mobility apparatus with respect to the reference object;
obtaining image-based position information of the mobility apparatus, based on the relative position of the mobility apparatus with respect to the reference object;
and determining whether the mobility apparatus departs from a course, based on the image-based position information of the mobility apparatus.
The method of claim 11, specifically the limitations emphasized above, recites a mental process that can be practicably performed in the human mind and is, therefore, an abstract idea. The claim merely consists of detecting a reference object, determining a relative position, and determining whether the mobility apparatus departs from a course. This is equivalent to a person mentally viewing the environment, identifying a reference object, estimating a relative position, and deciding whether the mobility apparatus has departed from a course.
STEP 2A (PRONG 2): Does the claim recite additional elements that integrate the judicial exception into a practical application? No, the claims do not recite additional elements that integrate the judicial exception into a practical application.
With regard to STEP 2A (prong 2), whether the claim recites additional elements that integrate the judicial exception into a practical application, the guidelines provide the following exemplary considerations that are indicative that an additional element (or combination of elements) may have integrated the judicial exception into a practical application:
an additional element reflects an improvement in the functioning of a computer, or an improvement to other technology or technical field;
an additional element that applies or uses a judicial exception to effect a particular treatment or prophylaxis for a disease or medical condition;
an additional element implements a judicial exception with, or uses a judicial exception in conjunction with, a particular machine or manufacture that is integral to the claim;
an additional element effects a transformation or reduction of a particular article to a different state or thing; and
an additional element applies or uses the judicial exception in some other meaningful way beyond generally linking the use of the judicial exception to a particular technological environment, such that the claim as a whole is more than a drafting effort designed to monopolize the exception.
While the guidelines further state that the exemplary considerations are not an exhaustive list and that there may be other examples of integrating the exception into a practical application, the guidelines also list examples in which a judicial exception has not been integrated into a practical application:
an additional element merely recites the words “apply it” (or an equivalent) with the judicial exception, or merely includes instructions to implement an abstract idea on a computer, or merely uses a computer as a tool to perform an abstract idea;
an additional element adds insignificant extra-solution activity to the judicial exception; and
an additional element does no more than generally link the use of a judicial exception to a particular technological environment or field of use.
In the present case, the additional limitations beyond the above-noted abstract ideas are as follows (where the underlined portions are the “additional limitations” and the bolded portions continue to represent the abstract idea).
Claim 1. An apparatus for controlling a mobility apparatus, the apparatus comprising:
an imaging device on the mobility apparatus;
at least one processor communicably coupled to the imaging device;
and a storage medium storing computer-readable instructions that, when executed by the at least one processor, enable the at least one processor to:
detect a reference object from an image obtained by the imaging device in response to certain non-image-based position information of the mobility apparatus being unavailable,
obtain image-based position information of the mobility apparatus based on a relative position of the mobility apparatus with respect to the reference object,
and determine whether the mobility apparatus departs from a course based on the image-based position information of the mobility apparatus.
Claim 1 does not recite any of the exemplary considerations that are indicative of an abstract idea having been integrated into a practical application. The step of “obtain image-based position information…” is recited at a high level of generality and amounts to mere data gathering, which is a form of insignificant extra-solution activity. The remaining additional elements (an imaging device on the mobility apparatus; at least one processor communicably coupled to the imaging device; and a storage medium storing computer-readable instructions that, when executed by the at least one processor, enable the at least one processor to perform the recited steps) are claimed generically and operate in their ordinary capacity, such that they do not use the judicial exception in a manner that imposes a meaningful limit on it. These elements merely describe how to generally “apply” the otherwise mental judgments in a generic or general-purpose computing environment and, being recited at a high level of generality, merely automate the detecting, obtaining, and determining steps. These limitations can also be viewed as nothing more than an attempt to generally link the use of the judicial exception to the technological environment of a computer.
It should be noted that because the courts have made it clear that mere physicality or tangibility of an additional element or elements is not a relevant consideration in the eligibility analysis, the physical nature of these computer components does not affect this analysis. See MPEP 2106.05(I). Accordingly, even in combination, these additional elements do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea.
Claim 11. A method for controlling a mobility apparatus, the method comprising:
detecting a reference object from an image obtained by an imaging device of the mobility apparatus, in response to certain non-image-based position information of the mobility apparatus being unavailable;
determining a relative position of the mobility apparatus with respect to the reference object;
obtaining image-based position information of the mobility apparatus, based on the relative position of the mobility apparatus with respect to the reference object;
and determining whether the mobility apparatus departs from a course, based on the image-based position information of the mobility apparatus.
Claim 11 does not recite any of the exemplary considerations that are indicative of an abstract idea having been integrated into a practical application. The step of “obtaining image-based position information…” is recited at a high level of generality and amounts to mere data gathering, which is a form of insignificant extra-solution activity. The imaging device of the mobility apparatus is recited generically and does no more than generally link the use of the judicial exception to a particular technological environment.
STEP 2B: Does the claim recite additional elements that amount to significantly more than the judicial exception? No, the claims do not recite additional elements that amount to significantly more than the judicial exception.
With regard to STEP 2B, whether the claims recite additional elements that provide significantly more than the recited judicial exception, the guidelines specify that the pre-guideline procedure is still in effect. Specifically, examiners should continue to consider whether an additional element or combination of elements:
adds a specific limitation or combination of limitations that are not well-understood, routine, conventional activity in the field, which is indicative that an inventive concept may be present; or
simply appends well-understood, routine, conventional activities previously known to the industry, specified at a high level of generality, to the judicial exception, which is indicative that an inventive concept may not be present.
Regarding Step 2B of the 2019 PEG, independent claims 1 and 11 do not include additional elements (considered both individually and as an ordered combination) that are sufficient to amount to significantly more than the judicial exception, for the same reasons as those discussed above with respect to determining that the claims do not integrate the abstract idea into a practical application.
As discussed above with respect to integration of the abstract idea into a practical application, the additional limitations of an imaging device on the mobility apparatus, at least one processor communicably coupled to the imaging device, and a storage medium storing computer-readable instructions that, when executed by the at least one processor, enable the at least one processor to perform the recited steps are merely means to apply the exception and do not amount to “significantly more.” Adding the words “apply it” (or an equivalent) to the judicial exception, or mere instructions to implement an abstract idea on a computer (e.g., a limitation indicating that a particular function, such as creating and maintaining electronic records, is performed by a computer, as discussed in Alice Corp., 573 U.S. at 225-26, 110 USPQ2d at 1984), is not sufficient to amount to significantly more than the judicial exception.
Further, a conclusion that an additional element is insignificant extra-solution activity in Step 2A should be re-evaluated in Step 2B to determine whether it is more than what is well-understood, routine, conventional activity in the field. The additional limitations of “obtain image-based position information…” and “obtaining image-based position information…” are well-understood, routine, and conventional activities because the specification does not provide any indication that the detecting, determining, and obtaining steps are performed using anything other than a conventional computer. See also MPEP 2106.05(d)(II) and the cases cited therein, including Intellectual Ventures I LLC v. Symantec Corp., 838 F.3d 1307, 1321 (Fed. Cir. 2016), TLI Communications LLC v. AV Auto., LLC, 823 F.3d 607, 610 (Fed. Cir. 2016), and OIP Techs., Inc. v. Amazon.com, Inc., 788 F.3d 1359, 1363 (Fed. Cir. 2015), which indicate that mere performance of an action is a well-understood, routine, and conventional function when it is claimed in a merely generic manner (as it is here). Hence, the claims are not patent eligible.
Thus, since claims 1 and 11 (a) are directed toward an abstract idea, (b) do not recite additional elements that integrate the judicial exception into a practical application, and (c) do not recite additional elements that amount to significantly more than the judicial exception, claims 1 and 11 are directed toward non-statutory subject matter.
Dependent claims 2-10 and 12-20 further limit the abstract idea without integrating it into a practical application or adding significantly more; for example, the limitations of claim 10 amount to insignificant extra-solution activity under an analysis similar to that applied to claim 1 above.
As such, claims 1-20 are rejected under 35 U.S.C. 101 as being directed to an abstract idea without significantly more and are thus ineligible.
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claims 1-5 and 11-15 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by CHEN (CN 114564034 A).
Regarding Claim 1, CHEN teaches An apparatus for controlling a mobility apparatus, the apparatus comprising: an imaging device on the mobility apparatus (See at least Fig. 2, paragraph [0014], “The rotation matrix from the navigation coordinate system to the UAV body coordinate system is known and can be expressed as follows: Establish the camera coordinate system by taking the optical center of the camera. The Z^c axis is along the optical axis, and the X^c and Y^c axes are determined by the right-hand rule. The rotation matrix from this coordinate system to the navigation coordinate system is determined by the camera's horizontal rotation angle ψ, roll angle φ, and pitch angle θ. Furthermore, the rotation matrix from the navigation coordinate system to the camera coordinate system is known and can be expressed as follows: The rotation matrix between the UAV body coordinate system and the camera coordinate system is.”); at least one processor communicably coupled to the imaging device (See at least Fig. 2, paragraph [0053], “The system uses Kalman filtering, using the relative coordinate information calculated by PTBVS as the measurement value, integrating the IMU data loaded by the drone to obtain the drone's displacement between two frames, obtaining the estimated value of relative information based on the relative position information of the last frame, and using the correction value output by the filter as the input of the drone controller; the drone's horizontal movement path should be between P_1′ and P_2′; when approaching the landing point, the drone flies at the set maximum horizontal speed. During flight, the controller adjusts the speed and direction based on relative position information.” The system processes image-derived PTBVS relative coordinate information and uses the resulting output as input to a drone controller, which necessarily requires at least one processor communicably coupled to the imaging device to receive and process image information.); and a storage medium storing computer-readable instructions that, when executed by the at least one processor (See at least Fig. 2, paragraph [0053], as quoted above. The system executes PTBVS processing and Kalman filtering as part of its control operation, which necessarily requires computer-readable instructions stored on a storage medium and executed by at least one processor.), enable the at least one processor to: detect a reference object from an image obtained by the imaging device in response to certain non-image-based position information of the mobility apparatus being unavailable (See at least Fig. 2, paragraph [0040], “In some cases, although GPS signals can be received normally, the positioning information may have large errors or be completely wrong. The geometrical factor of precision (GDOP) reflects the accuracy of GPS positioning. GDOP enables optimal positioning using four satellites for both all-view positioning and GPS-PRN code ranging. Therefore, GPS signals are considered accurate when GDOP ≤ 2.7, otherwise unreliable. Once a sufficiently reliable signal is received, the UAV will compare the absolute position information of the UAV and the landing point to determine their initial relative position, and then begin landing. During the landing process, the positioning method proposed by the algorithm for obtaining positioning information through the PTBVS system is a gimbal-based visual servo system used for UAV navigation. The UAV uses visual information and sensor data to obtain the relative position information between the landing platform and itself, unaffected by GPS signal interference.”), obtain image-based position information of the mobility apparatus based on a relative position of the mobility apparatus with respect to the reference object (See at least Figs. 6-14, paragraph [0040], as quoted above.), and determine whether the mobility apparatus departs from a course based on the image-based position information of the mobility apparatus (See at least paragraph [0065], “The return and landing trajectory diagram is shown in Figure 14(a). The blue curve represents the actual flight trajectory of the UAV, the red curve represents the navigation prediction trajectory of the PTBVS system, and the black dashed line represents the ideal trajectory obtained by the proposed rapid landing. It can be seen that in the actual environment, due to factors such as wind and sensor noise, the relative position error between the UAV and the landing point obtained by PTBVS increases. The actual landing trajectory deviates from the ideal landing trajectory. However, this deviation is tolerable, and the drone can still successfully land at the target landing point via PTBVS, solely for navigation purposes.”).
Regarding Claim 2, CHEN teaches The apparatus of claim 1, as set forth in the anticipation rejection above. CHEN teaches wherein the instructions further enable the at least one processor to: detect objects from the image (See at least paragraph [0040], “In some cases, although GPS signals can be received normally, the positioning information may have large errors or be completely wrong. The geometrical factor of precision (GDOP) reflects the accuracy of GPS positioning. GDOP enables optimal positioning using four satellites for both all-view positioning and GPS-PRN code ranging. Therefore, GPS signals are considered accurate when GDOP ≤ 2.7, otherwise unreliable. Once a sufficiently reliable signal is received, the UAV will compare the absolute position information of the UAV and the landing point to determine their initial relative position, and then begin landing. During the landing process, the positioning method proposed by the algorithm for obtaining positioning information through the PTBVS system is a gimbal-based visual servo system used for UAV navigation. The UAV uses visual information and sensor data to obtain the relative position information between the landing platform and itself, unaffected by GPS signal interference.”); and determine a landmark to which a coordinates value is matched among the objects as the reference object (See at least Figs. 2, 8, 13, paragraph [0040], as quoted above.).
With respect to claim 12, please see the rejection above with respect to claim 2, which is commensurate in scope with claim 12, with claim 2 being drawn to an apparatus and claim 12 being drawn to a corresponding method.
Regarding Claim 3, CHEN teaches The apparatus of claim 1, as set forth in the anticipation rejection above. CHEN teaches wherein the instructions further enable the at least one processor to: transform the image into a top-view image (See at least Figs. 7, 12, 14, paragraph [0014], “The rotation matrix from the navigation coordinate system to the UAV body coordinate system is known and can be expressed as follows: Establish the camera coordinate system by taking the optical center of the camera. The Z^c axis is along the optical axis, and the X^c and Y^c axes are determined by the right-hand rule. The rotation matrix from this coordinate system to the navigation coordinate system is determined by the camera's horizontal rotation angle ψ, roll angle φ, and pitch angle θ. Furthermore, the rotation matrix from the navigation coordinate system to the camera coordinate system is known and can be expressed as follows: The rotation matrix between the UAV body coordinate system and the camera coordinate system is.” The system transforms image information from a camera coordinate system into a navigation coordinate system corresponding to a top-view representation.); and determine the relative position of the mobility apparatus with respect to the reference object on the top-view image (See at least Figs. 7, 12, 14, paragraph [0040], “In some cases, although GPS signals can be received normally, the positioning information may have large errors or be completely wrong. The geometrical factor of precision (GDOP) reflects the accuracy of GPS positioning. GDOP enables optimal positioning using four satellites for both all-view positioning and GPS-PRN code ranging. Therefore, GPS signals are considered accurate when GDOP ≤ 2.7, otherwise unreliable. Once a sufficiently reliable signal is received, the UAV will compare the absolute position information of the UAV and the landing point to determine their initial relative position, and then begin landing. During the landing process, the positioning method proposed by the algorithm for obtaining positioning information through the PTBVS system is a gimbal-based visual servo system used for UAV navigation. The UAV uses visual information and sensor data to obtain the relative position information between the landing platform and itself, unaffected by GPS signal interference.” The system determines the relative position of the UAV with respect to the landing platform based on the transformed image information.).
With respect to claim 13, please see the rejection above with respect to claim 3, which is commensurate in scope with claim 13, with claim 3 being drawn to an apparatus and claim 13 being drawn to a corresponding method.
Regarding Claim 4, CHEN teaches The apparatus of claim 3, as set forth in the anticipation rejection above. CHEN teaches wherein the instructions further enable the at least one processor to: determine a height of the mobility apparatus from a ground on which the reference object is located on the top-view image (See at least Figs. 7, 12, 14, paragraph [0053], “The camera's attitude angle will change according to the drone's position; the drone's control method is as follows: the drone flies to a safe landing altitude and then descends vertically, first approaching the landing platform to further improve the accuracy of navigation information. The system uses Kalman filtering, using the relative coordinate information calculated by PTBVS as the measurement value, integrating the IMU data loaded by the drone to obtain the drone's displacement between two frames, obtaining the estimated value of relative information based on the relative position information of the last frame, and using the correction value output by the filter as the input of the drone controller; the drone's horizontal movement path should be between P_1′ and P_2′; when approaching the landing point, the drone flies at the set maximum horizontal speed. During flight, the controller adjusts the speed and direction based on relative position information.”); and determine the relative position of the mobility apparatus with respect to the reference object, based on determining an x-axis separation distance and a y-axis separation distance from the reference object to the mobility apparatus (See at least Figs. 7, 12, 14, paragraph [0053], which continues: “Simultaneously, to ensure the drone's safety, the vertical speed needs to be determined based on the horizontal distance. When the horizontal distance between the drone and the landing point is within 2 meters, vertical landing is achieved using landmark navigation…At this stage, the drone will no longer fly at its maximum horizontal speed, but will adjust its horizontal speed using a PID controller that takes the pixel coordinates of the landing point as input.” The system determines an altitude of the UAV relative to the landing platform and determines relative position information using horizontal distance and pixel-coordinate-based navigation corresponding to x-axis and y-axis separation distances from the landing platform.).
With respect to claim 14, please see the rejection above with respect to claim 4, which is commensurate in scope to claim 14, with claim 4 being drawn to a mobility system and claim 14 being drawn to a corresponding method.
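For clarity of the record, the fusion described in the passage quoted above, in which a vision-based relative coordinate serves as the measurement and the IMU-integrated displacement between frames serves as the prediction, can be sketched per horizontal axis as follows. This is an illustrative, hypothetical Python sketch only; the function name, noise variances, and numeric values are assumptions and are not drawn from CHEN.

```python
# Hypothetical per-axis Kalman-style fusion (not from CHEN): the IMU-integrated
# displacement propagates the last relative position, and the vision-based
# relative coordinate corrects it, as the quoted passage describes.

def kalman_axis_update(x_est, p_est, imu_dx, z_vision, q=0.05, r=0.2):
    """One predict/update cycle for a single axis (x or y separation).

    x_est, p_est : previous state estimate and its variance
    imu_dx       : displacement between frames integrated from IMU data
    z_vision     : relative coordinate measured by the vision system
    q, r         : assumed process and measurement noise variances
    """
    # Predict: propagate the last relative position by the IMU displacement.
    x_pred = x_est + imu_dx
    p_pred = p_est + q
    # Update: correct the prediction with the vision measurement.
    k = p_pred / (p_pred + r)          # Kalman gain
    x_new = x_pred + k * (z_vision - x_pred)
    p_new = (1.0 - k) * p_pred
    return x_new, p_new

# Example: fuse one frame on each horizontal axis.
x_sep, px = kalman_axis_update(1.00, 0.1, imu_dx=-0.05, z_vision=0.90)
y_sep, py = kalman_axis_update(0.50, 0.1, imu_dx=0.00, z_vision=0.52)
```

Running one such cycle per axis yields corrected x-axis and y-axis separation distances of the kind the quoted controller consumes as input.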
Regarding Claim 5, CHEN teaches The apparatus of claim 4, as set forth in the anticipation rejection above. CHEN teaches wherein the instructions further enable the at least one processor to determine position coordinates of the mobility apparatus on the top-view image, based on the height of the mobility apparatus, the x-axis separation distance, and the y-axis separation distance (See at least Figs. 7, 12, 14, paragraph [n0053], “The camera's attitude angle will change according to the drone's position; the drone's control method is as follows: the drone flies to a safe landing altitude and then descends vertically, first approaching the landing platform to further improve the accuracy of navigation information. The system uses Kalman filtering, using the relative coordinate information calculated by PTBVS as the measurement value, integrating the IMU data loaded by the drone to obtain the drone's displacement between two frames, obtaining the estimated value of relative information based on the relative position information of the last frame, and using the correction value output by the filter as the input of the drone controller; the drone's horizontal movement path should be between P<sub>1</sub>′ and P′<sub>2</sub>; when approaching the landing point, the drone flies at the set maximum horizontal speed. During flight, the controller adjusts the speed and direction based on relative position information. Simultaneously, to ensure the drone's safety, the vertical speed needs to be determined based on the horizontal distance. 
When the horizontal distance between the drone and the landing point is within 2 meters, vertical landing is achieved using landmark navigation…At this stage, the drone will no longer fly at its maximum horizontal speed, but will adjust its horizontal speed using a PID controller that takes the pixel coordinates of the landing point as input.” The system determines position coordinates of the UAV by combining altitude information with horizontal separation distances derived from image-based relative coordinate and pixel-coordinate information corresponding to x-axis and y-axis components.).
With respect to claim 15, please see the rejection above with respect to claim 5, which is commensurate in scope to claim 15, with claim 5 being drawn to a mobility system and claim 15 being drawn to a corresponding method.
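The descent logic quoted in the rejections of claims 4-5, in which the drone flies at maximum horizontal speed until the horizontal distance to the landing point falls within 2 meters and thereafter regulates horizontal speed with a PID controller on the landing point's pixel coordinates, can be sketched as follows. This is a hypothetical illustration; the gains, speed limits, and the fixed descent rate are assumptions, not values disclosed by CHEN.

```python
# Hypothetical sketch of the quoted descent logic (not from CHEN): outside a
# 2 m horizontal distance the drone flies at maximum horizontal speed; inside
# it, a PID controller on the landing point's pixel error sets the horizontal
# speed while the drone descends. All gains and limits are assumptions.

LANDMARK_RANGE_M = 2.0   # horizontal distance at which landmark navigation begins
MAX_H_SPEED = 3.0        # assumed maximum horizontal speed, m/s

def descent_command(h_dist, pixel_err, integ, prev_err,
                    kp=0.01, ki=0.001, kd=0.005, dt=0.1):
    """Return (horizontal_speed, vertical_speed, integ, prev_err)."""
    if h_dist > LANDMARK_RANGE_M:
        # Approach phase: maximum horizontal speed, no vertical descent yet.
        return MAX_H_SPEED, 0.0, integ, prev_err
    # Landing phase: PID on the pixel error of the landing point.
    integ += pixel_err * dt
    deriv = (pixel_err - prev_err) / dt
    h_speed = kp * pixel_err + ki * integ + kd * deriv
    v_speed = -0.5  # assumed constant descent rate for the sketch
    return min(h_speed, MAX_H_SPEED), v_speed, integ, pixel_err

# Approach: 5 m out, the command is full horizontal speed with no descent.
h1, v1, i1, e1 = descent_command(5.0, pixel_err=0.0, integ=0.0, prev_err=0.0)
# Landing: 1 m out with a 20 px error, the PID sets a reduced horizontal speed.
h2, v2, i2, e2 = descent_command(1.0, pixel_err=20.0, integ=0.0, prev_err=0.0)
```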
Regarding Claim 11, CHEN teaches A method for controlling a mobility apparatus, the method comprising: detecting a reference object from an image obtained by an imaging device of the mobility apparatus, in response to certain non-image-based position information of the mobility apparatus being unavailable (See at least Fig. 2, paragraph [n0040], “In some cases, although GPS signals can be received normally, the positioning information may have large errors or be completely wrong. The geometrical factor of precision (GDOP) reflects the accuracy of GPS positioning. GDOP enables optimal positioning using four satellites for both all-view positioning and GPS-PRN code ranging. Therefore, GPS signals are considered accurate when GDOP ≤ 2.7, otherwise unreliable. Once a sufficiently reliable signal is received, the UAV will compare the absolute position information of the UAV and the landing point to determine their initial relative position, and then begin landing. During the landing process, the positioning method proposed by the algorithm for obtaining positioning information through the PTBVS system is a gimbal-based visual servo system used for UAV navigation. The UAV uses visual information and sensor data to obtain the relative position information between the landing platform and itself, unaffected by GPS signal interference.”); determining a relative position of the mobility apparatus with respect to the reference object (See at least Figs. 6-14, paragraph [n0040], “In some cases, although GPS signals can be received normally, the positioning information may have large errors or be completely wrong. The geometrical factor of precision (GDOP) reflects the accuracy of GPS positioning. GDOP enables optimal positioning using four satellites for both all-view positioning and GPS-PRN code ranging. Therefore, GPS signals are considered accurate when GDOP ≤ 2.7, otherwise unreliable. 
Once a sufficiently reliable signal is received, the UAV will compare the absolute position information of the UAV and the landing point to determine their initial relative position, and then begin landing. During the landing process, the positioning method proposed by the algorithm for obtaining positioning information through the PTBVS system is a gimbal-based visual servo system used for UAV navigation. The UAV uses visual information and sensor data to obtain the relative position information between the landing platform and itself, unaffected by GPS signal interference.”); obtaining image-based position information of the mobility apparatus, based on the relative position of the mobility apparatus with respect to the reference object (See at least Figs. 6-14, paragraph [n0040], “In some cases, although GPS signals can be received normally, the positioning information may have large errors or be completely wrong. The geometrical factor of precision (GDOP) reflects the accuracy of GPS positioning. GDOP enables optimal positioning using four satellites for both all-view positioning and GPS-PRN code ranging. Therefore, GPS signals are considered accurate when GDOP ≤ 2.7, otherwise unreliable. Once a sufficiently reliable signal is received, the UAV will compare the absolute position information of the UAV and the landing point to determine their initial relative position, and then begin landing. During the landing process, the positioning method proposed by the algorithm for obtaining positioning information through the PTBVS system is a gimbal-based visual servo system used for UAV navigation. 
The UAV uses visual information and sensor data to obtain the relative position information between the landing platform and itself, unaffected by GPS signal interference.”); and determining whether the mobility apparatus departs from a course, based on the image-based position information of the mobility apparatus (See at least paragraph [n0065], “The return and landing trajectory diagram is shown in Figure 14(a). The blue curve represents the actual flight trajectory of the UAV, the red curve represents the navigation prediction trajectory of the PTBVS system, and the black dashed line represents the ideal trajectory obtained by the proposed rapid landing. It can be seen that in the actual environment, due to factors such as wind and sensor noise, the relative position error between the UAV and the landing point obtained by PTBVS increases. The actual landing trajectory deviates from the ideal landing trajectory. However, this deviation is tolerable, and the drone can still successfully land at the target landing point via PTBVS, solely for navigation purposes.”).
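The GDOP gating quoted above, under which GPS is treated as accurate when GDOP ≤ 2.7 and otherwise unreliable with fallback to the vision-based relative position, reduces to a simple source-selection rule. The following Python sketch is illustrative only; the function and data shapes are assumptions, not CHEN's implementation.

```python
# Hypothetical sketch (not from CHEN): select between GPS positioning and
# vision-based (PTBVS-style) positioning using the quoted GDOP threshold.

GDOP_THRESHOLD = 2.7  # GPS considered accurate when GDOP <= 2.7

def select_position_source(gdop, gps_position, vision_position):
    """Return (source_name, position), preferring GPS only when reliable."""
    if gdop is not None and gdop <= GDOP_THRESHOLD:
        return ("gps", gps_position)
    # GPS unavailable or unreliable: fall back to image-based relative position.
    return ("vision", vision_position)

# With GDOP = 3.4 > 2.7, the vision-based position is selected.
source, pos = select_position_source(3.4, (10.0, 5.0), (9.8, 5.1))
```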
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.
Claims 6-10 and 16-20 are rejected under 35 U.S.C. 103 as being unpatentable over CHEN (CN 114564034 A) in view of GOUDY (US 20160027304 A1).
Regarding Claim 6, CHEN teaches The apparatus of claim 3, as set forth in the anticipation rejection above. CHEN does not explicitly disclose, however, GOUDY, in the same field of endeavor, teaches wherein the instructions further enable the at least one processor to: obtain a first straight line based on first and second reference points matched to a reference target (See at least paragraph [0008], “In accordance with one aspect of the present invention, a condition monitoring system and method are provided which employ a storage device and a controller. The storage device stores information representing a plurality of boundary points of a boundary that circumscribes an area of interest in which the boundary points are defined by two prescribed parameters, and the controller obtains at least one condition point defined by current values of the prescribed parameters, determines a first boundary point of the boundary points that is closest to the condition point, and generates geometric data representing a geometric relationship between the first boundary point, the condition point and a second boundary point of the boundary points. The geometric relationship includes a first straight line connecting the first boundary point and the condition point, a second straight line connecting the second boundary point and the condition point and a third straight line connecting the first boundary point and the second boundary point. 
The controller calculates reference point data representing a reference point based on the geometric data, determines coordinate condition data based on an angle between a predetermined direction and a reference line connecting the first boundary point and the reference point, and determines whether the condition point lies within the area of interest based on a comparison between coordinates of the condition point and the coordinate condition data.”); obtain a second straight line based on third and fourth reference points matched to the reference target (See at least paragraph [0008], “In accordance with one aspect of the present invention, a condition monitoring system and method are provided which employ a storage device and a controller. The storage device stores information representing a plurality of boundary points of a boundary that circumscribes an area of interest in which the boundary points are defined by two prescribed parameters, and the controller obtains at least one condition point defined by current values of the prescribed parameters, determines a first boundary point of the boundary points that is closest to the condition point, and generates geometric data representing a geometric relationship between the first boundary point, the condition point and a second boundary point of the boundary points. The geometric relationship includes a first straight line connecting the first boundary point and the condition point, a second straight line connecting the second boundary point and the condition point and a third straight line connecting the first boundary point and the second boundary point. 
The controller calculates reference point data representing a reference point based on the geometric data, determines coordinate condition data based on an angle between a predetermined direction and a reference line connecting the first boundary point and the reference point, and determines whether the condition point lies within the area of interest based on a comparison between coordinates of the condition point and the coordinate condition data.”); and determine whether the mobility apparatus departs from the course based on determining an area in the first straight line and the second straight line as the course (See at least paragraph [0008], “In accordance with one aspect of the present invention, a condition monitoring system and method are provided which employ a storage device and a controller. The storage device stores information representing a plurality of boundary points of a boundary that circumscribes an area of interest in which the boundary points are defined by two prescribed parameters, and the controller obtains at least one condition point defined by current values of the prescribed parameters, determines a first boundary point of the boundary points that is closest to the condition point, and generates geometric data representing a geometric relationship between the first boundary point, the condition point and a second boundary point of the boundary points. The geometric relationship includes a first straight line connecting the first boundary point and the condition point, a second straight line connecting the second boundary point and the condition point and a third straight line connecting the first boundary point and the second boundary point. 
The controller calculates reference point data representing a reference point based on the geometric data, determines coordinate condition data based on an angle between a predetermined direction and a reference line connecting the first boundary point and the reference point, and determines whether the condition point lies within the area of interest based on a comparison between coordinates of the condition point and the coordinate condition data.”).
Thus, it would have been obvious to one of ordinary skill in the art before the effective filing date to combine the invention of CHEN with the teachings of GOUDY such that the landing system of CHEN is further configured to obtain a first straight line based on first and second reference points matched to a reference target; obtain a second straight line based on third and fourth reference points matched to the reference target; and determine whether the mobility apparatus departs from the course based on determining an area in the first straight line and the second straight line as the course, as taught by GOUDY (See paragraph [0008].), with a reasonable expectation of success. The motivation for doing so would be to warn drivers to take appropriate action when a possibility of contact exists, as taught by GOUDY (See paragraph [0006].).
With respect to claim 16, please see the rejection above with respect to claim 6, which is commensurate in scope to claim 16, with claim 6 being drawn to a mobility system and claim 16 being drawn to a corresponding method.
Regarding Claim 7, CHEN and GOUDY teach The apparatus of claim 6, as set forth in the obviousness rejection above. CHEN teaches in response to setting a position of the mobility apparatus to an origin on the top-view image (See at least Figs. 7, 12, 14, paragraph [n0053], “The camera's attitude angle will change according to the drone's position; the drone's control method is as follows: the drone flies to a safe landing altitude and then descends vertically, first approaching the landing platform to further improve the accuracy of navigation information. The system uses Kalman filtering, using the relative coordinate information calculated by PTBVS as the measurement value, integrating the IMU data loaded by the drone to obtain the drone's displacement between two frames, obtaining the estimated value of relative information based on the relative position information of the last frame, and using the correction value output by the filter as the input of the drone controller; the drone's horizontal movement path should be between P<sub>1</sub>′ and P′<sub>2</sub>; when approaching the landing point, the drone flies at the set maximum horizontal speed. During flight, the controller adjusts the speed and direction based on relative position information. Simultaneously, to ensure the drone's safety, the vertical speed needs to be determined based on the horizontal distance. When the horizontal distance between the drone and the landing point is within 2 meters, vertical landing is achieved using landmark navigation…At this stage, the drone will no longer fly at its maximum horizontal speed, but will adjust its horizontal speed using a PID controller that takes the pixel coordinates of the landing point as input.”).
CHEN does not explicitly disclose, however, GOUDY, in the same field of endeavor, teaches wherein the instructions further enable the at least one processor to determine whether the mobility apparatus departs from the course based on that a first x-intercept of the first straight line and a second x-intercept of the second straight line are a same sign (See at least paragraph [0008], “In accordance with one aspect of the present invention, a condition monitoring system and method are provided which employ a storage device and a controller. The storage device stores information representing a plurality of boundary points of a boundary that circumscribes an area of interest in which the boundary points are defined by two prescribed parameters, and the controller obtains at least one condition point defined by current values of the prescribed parameters, determines a first boundary point of the boundary points that is closest to the condition point, and generates geometric data representing a geometric relationship between the first boundary point, the condition point and a second boundary point of the boundary points. The geometric relationship includes a first straight line connecting the first boundary point and the condition point, a second straight line connecting the second boundary point and the condition point and a third straight line connecting the first boundary point and the second boundary point. 
The controller calculates reference point data representing a reference point based on the geometric data, determines coordinate condition data based on an angle between a predetermined direction and a reference line connecting the first boundary point and the reference point, and determines whether the condition point lies within the area of interest based on a comparison between coordinates of the condition point and the coordinate condition data.” The system determines whether the mobility apparatus departs from the course by evaluating straight-line boundary relationships in a coordinate system, including intercept information, to determine whether the mobility apparatus lies within the course area.).
Thus, it would have been obvious to one of ordinary skill in the art before the effective filing date to combine the invention of CHEN with the teachings of GOUDY such that the landing system of CHEN is further configured to obtain a first straight line based on first and second reference points matched to a reference target; obtain a second straight line based on third and fourth reference points matched to the reference target; determine whether the mobility apparatus departs from the course based on determining an area in the first straight line and the second straight line as the course; and determine whether the mobility apparatus departs from the course based on that a first x-intercept of the first straight line and a second x-intercept of the second straight line are a same sign, in response to setting a position of the mobility apparatus to an origin on the top-view image, as taught by GOUDY (See paragraph [0008].), with a reasonable expectation of success. The motivation for doing so would be to warn drivers to take appropriate action when a possibility of contact exists, as taught by GOUDY (See paragraph [0006].).
With respect to claim 17, please see the rejection above with respect to claim 7, which is commensurate in scope to claim 17, with claim 7 being drawn to a mobility system and claim 17 being drawn to a corresponding method.
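The claimed same-sign x-intercept test can be illustrated as follows: with the mobility apparatus at the origin of the top-view image, the apparatus lies between two boundary lines exactly when their x-intercepts have opposite signs. The Python sketch below is a hypothetical illustration of the claim limitation and is not asserted to be GOUDY's implementation; the slope-intercept parameterization is an assumption.

```python
# Hypothetical illustration of the claimed test (not GOUDY's implementation):
# with the apparatus at the origin, compute the x-intercepts of two boundary
# lines y = m*x + b and flag a course departure when both intercepts share a
# sign (the origin then lies outside the band between the lines).

def x_intercept(m, b):
    """x where the line y = m*x + b crosses the x-axis (m must be nonzero)."""
    return -b / m

def departed_course(line1, line2):
    """line1, line2: (slope, intercept) pairs for the course boundaries."""
    x1 = x_intercept(*line1)
    x2 = x_intercept(*line2)
    return (x1 > 0) == (x2 > 0)  # same sign -> origin outside the course

# Boundaries straddling the origin: intercepts at -1 and +1 -> on course.
on_course = not departed_course((1.0, 1.0), (1.0, -1.0))
# Boundaries both to the right: intercepts at +1 and +3 -> departed.
off_course = departed_course((1.0, -1.0), (1.0, -3.0))
```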
Regarding Claim 8, CHEN and GOUDY teach The apparatus of claim 6, as set forth in the obviousness rejection above. CHEN teaches control the mobility apparatus to turn in a first direction toward the center line during a first duration (See at least paragraph [n0053], “The camera's attitude angle will change according to the drone's position; the drone's control method is as follows: the drone flies to a safe landing altitude and then descends vertically, first approaching the landing platform to further improve the accuracy of navigation information. The system uses Kalman filtering, using the relative coordinate information calculated by PTBVS as the measurement value, integrating the IMU data loaded by the drone to obtain the drone's displacement between two frames, obtaining the estimated value of relative information based on the relative position information of the last frame, and using the correction value output by the filter as the input of the drone controller; the drone's horizontal movement path should be between P<sub>1</sub>′ and P′<sub>2</sub>; when approaching the landing point, the drone flies at the set maximum horizontal speed. During flight, the controller adjusts the speed and direction based on relative position information. Simultaneously, to ensure the drone's safety, the vertical speed needs to be determined based on the horizontal distance. 
When the horizontal distance between the drone and the landing point is within 2 meters, vertical landing is achieved using landmark navigation…At this stage, the drone will no longer fly at its maximum horizontal speed, but will adjust its horizontal speed using a PID controller that takes the pixel coordinates of the landing point as input.”); control the mobility apparatus to turn in a second direction which is opposite to the first direction during a second duration (See at least paragraph [n0053], “The camera's attitude angle will change according to the drone's position; the drone's control method is as follows: the drone flies to a safe landing altitude and then descends vertically, first approaching the landing platform to further improve the accuracy of navigation information. The system uses Kalman filtering, using the relative coordinate information calculated by PTBVS as the measurement value, integrating the IMU data loaded by the drone to obtain the drone's displacement between two frames, obtaining the estimated value of relative information based on the relative position information of the last frame, and using the correction value output by the filter as the input of the drone controller; the drone's horizontal movement path should be between P<sub>1</sub>′ and P′<sub>2</sub>; when approaching the landing point, the drone flies at the set maximum horizontal speed. During flight, the controller adjusts the speed and direction based on relative position information. Simultaneously, to ensure the drone's safety, the vertical speed needs to be determined based on the horizontal distance. 
When the horizontal distance between the drone and the landing point is within 2 meters, vertical landing is achieved using landmark navigation…At this stage, the drone will no longer fly at its maximum horizontal speed, but will adjust its horizontal speed using a PID controller that takes the pixel coordinates of the landing point as input.”); and control the mobility apparatus to perform straight line motion along the center line from a return point at which the mobility apparatus reaches the center line (See at least paragraph [n0053], “The camera's attitude angle will change according to the drone's position; the drone's control method is as follows: the drone flies to a safe landing altitude and then descends vertically, first approaching the landing platform to further improve the accuracy of navigation information. The system uses Kalman filtering, using the relative coordinate information calculated by PTBVS as the measurement value, integrating the IMU data loaded by the drone to obtain the drone's displacement between two frames, obtaining the estimated value of relative information based on the relative position information of the last frame, and using the correction value output by the filter as the input of the drone controller; the drone's horizontal movement path should be between P<sub>1</sub>′ and P′<sub>2</sub>; when approaching the landing point, the drone flies at the set maximum horizontal speed. During flight, the controller adjusts the speed and direction based on relative position information. Simultaneously, to ensure the drone's safety, the vertical speed needs to be determined based on the horizontal distance. 
When the horizontal distance between the drone and the landing point is within 2 meters, vertical landing is achieved using landmark navigation…At this stage, the drone will no longer fly at its maximum horizontal speed, but will adjust its horizontal speed using a PID controller that takes the pixel coordinates of the landing point as input.”).
CHEN does not explicitly disclose, however, GOUDY, in the same field of endeavor, teaches wherein the instructions further enable the at least one processor to: determine a center line connecting x-axis average values of the first straight line and the second straight line (See at least paragraph [0008], “In accordance with one aspect of the present invention, a condition monitoring system and method are provided which employ a storage device and a controller. The storage device stores information representing a plurality of boundary points of a boundary that circumscribes an area of interest in which the boundary points are defined by two prescribed parameters, and the controller obtains at least one condition point defined by current values of the prescribed parameters, determines a first boundary point of the boundary points that is closest to the condition point, and generates geometric data representing a geometric relationship between the first boundary point, the condition point and a second boundary point of the boundary points. The geometric relationship includes a first straight line connecting the first boundary point and the condition point, a second straight line connecting the second boundary point and the condition point and a third straight line connecting the first boundary point and the second boundary point. 
The controller calculates reference point data representing a reference point based on the geometric data, determines coordinate condition data based on an angle between a predetermined direction and a reference line connecting the first boundary point and the reference point, and determines whether the condition point lies within the area of interest based on a comparison between coordinates of the condition point and the coordinate condition data.” The system determines a center line between the first and second straight line boundaries by calculating an average position between the boundary lines along a coordinate axis, corresponding to averaging x-axis values of the straight lines to obtain a center line.).
Thus, it would have been obvious to one of ordinary skill in the art before the effective filing date to combine the invention of CHEN with the teachings of GOUDY such that the landing system of CHEN is further configured to obtain a first straight line based on first and second reference points matched to a reference target; obtain a second straight line based on third and fourth reference points matched to the reference target; determine whether the mobility apparatus departs from the course based on determining an area in the first straight line and the second straight line as the course; and determine a center line connecting x-axis average values of the first straight line and the second straight line, as taught by GOUDY (See paragraph [0008].), with a reasonable expectation of success. The motivation for doing so would be to warn drivers to take appropriate action when a possibility of contact exists, as taught by GOUDY (See paragraph [0006].).
With respect to claim 18, please see the rejection above with respect to claim 8, which is commensurate in scope to claim 18, with claim 8 being drawn to a mobility system and claim 18 being drawn to a corresponding method.
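The claimed center line connecting x-axis average values of the first and second straight lines can be illustrated by averaging, at each y, the x-coordinates of the two boundaries. The Python sketch below is a hypothetical illustration of the claim limitation, not GOUDY's implementation; the slope-intercept parameterization is an assumption.

```python
# Hypothetical illustration (not GOUDY's implementation): derive a center line
# by averaging, at each sampled y, the x-values x = (y - b) / m of the two
# boundary lines, i.e. connecting their x-axis average values.

def boundary_x(m, b, y):
    """x-coordinate of the line y = m*x + b at a given y (m nonzero)."""
    return (y - b) / m

def center_line_points(line1, line2, ys):
    """Midpoints between the two boundaries at each sampled y."""
    return [((boundary_x(*line1, y) + boundary_x(*line2, y)) / 2.0, y)
            for y in ys]

# Converging boundaries y = x + 1 and y = -x + 1 are symmetric about x = 0,
# so the center line lies along x = 0 at every sampled y.
pts = center_line_points((1.0, 1.0), (-1.0, 1.0), ys=[0.0, 0.5])
```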
Regarding Claim 9, CHEN and GOUDY teach The apparatus of claim 8, as set forth in the obviousness rejection above. CHEN teaches wherein the instructions further enable the at least one processor to set a first section in which the mobility apparatus turns in the first direction and a second section in which the mobility apparatus turns in the second direction to be equivalent to each other (See at least paragraph [n0053], “The camera's attitude angle will change according to the drone's position; the drone's control method is as follows: the drone flies to a safe landing altitude and then descends vertically, first approaching the landing platform to further improve the accuracy of navigation information. The system uses Kalman filtering, using the relative coordinate information calculated by PTBVS as the measurement value, integrating the IMU data loaded by the drone to obtain the drone's displacement between two frames, obtaining the estimated value of relative information based on the relative position information of the last frame, and using the correction value output by the filter as the input of the drone controller; the drone's horizontal movement path should be between P<sub>1</sub>′ and P′<sub>2</sub>; when approaching the landing point, the drone flies at the set maximum horizontal speed. During flight, the controller adjusts the speed and direction based on relative position information. Simultaneously, to ensure the drone's safety, the vertical speed needs to be determined based on the horizontal distance. 
When the horizontal distance between the drone and the landing point is within 2 meters, vertical landing is achieved using landmark navigation…At this stage, the drone will no longer fly at its maximum horizontal speed, but will adjust its horizontal speed using a PID controller that takes the pixel coordinates of the landing point as input.” The system sets the first and second turning sections to be equivalent as part of a controlled return to the course by applying consistent corrective turning based on relative position information.).
With respect to claim 19, please see the rejection above with respect to claim 9, which is commensurate in scope to claim 19, with claim 9 being drawn to a mobility system and claim 19 being drawn to a corresponding method.
Regarding Claim 10, CHEN and GOUDY teach The apparatus of claim 1, as set forth in the obviousness rejection above. CHEN does not explicitly disclose, however, GOUDY, in the same field of endeavor, teaches wherein the instructions further enable the at least one processor to provide a notification to a display of the image-based position information of the mobility apparatus that obtains the image and the course (See at least paragraph [0027], “The storage device 28 permits a read-out operation of reading out data held in the large-capacity storage medium in response to an instruction from the controller 22 to, for example, acquire the map information and/or the vehicle condition information as needed or desired for use in representing the location of the host vehicle 10, the remote vehicle 14 and other location information and/or vehicle condition information as discussed herein for route guiding, map display, turning indication, and so on as understood in the art. For instance, the map information can include at least road links indicating connecting states of nodes, locations of branch points (road nodes), names of roads branching from the branch points, place names of the branch destinations, and so on. The information in the storage device 28 can also be updated by the controller 22 or in any suitable manner as discussed herein and as understood in the art.”).
Thus, it would have been obvious to one of ordinary skill in the art before the effective filing date to combine the invention of CHEN with the teachings of GOUDY such that the landing system of CHEN is further configured to provide a notification to a display of the image-based position information of the mobility apparatus that obtains the image and the course, as taught by GOUDY (See paragraph [0027].), with a reasonable expectation of success. The motivation for doing so would be to warn drivers to take appropriate action when a possibility of contact exists, as taught by GOUDY (See paragraph [0006].).
With respect to claim 20, please see the rejection above with respect to claim 10, which is commensurate in scope to claim 20, with claim 10 being drawn to a mobility system and claim 20 being drawn to a corresponding method.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to JEWEL ASHLEY KUNTZ whose telephone number is (571)270-5542. The examiner can normally be reached M-F 8:30am-5:30pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Anne Antonucci, can be reached at (313) 446-6519. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/JEWEL A KUNTZ/Examiner, Art Unit 3666
/ANNE MARIE ANTONUCCI/Supervisory Patent Examiner, Art Unit 3666