Prosecution Insights
Last updated: April 19, 2026
Application No. 16/949,581

METHOD FOR RECOGNIZING A MOTION PATTERN OF A LIMB

Non-Final OA (§101, §103, §112)
Filed: Nov 04, 2020
Examiner: LOPEZ, SEVERO ANTON P
Art Unit: 3791
Tech Center: 3700 (Mechanical Engineering & Manufacturing)
Assignee: The Chinese University of Hong Kong
OA Round: 8 (Non-Final)
Grant Probability: 32% (At Risk)
OA Rounds: 8-9
To Grant: 3y 6m
With Interview: 65%

Examiner Intelligence

Career Allow Rate: 32% (grants only 32% of cases; 47 granted / 149 resolved; -38.5% vs TC avg)
Interview Lift: +33.4% (strong lift for resolved cases with interview)
Avg Prosecution: 3y 6m (typical timeline; 86 currently pending)
Total Applications: 235 (career history, across all art units)

Statute-Specific Performance

§101: 14.4% (-25.6% vs TC avg)
§103: 37.1% (-2.9% vs TC avg)
§102: 16.5% (-23.5% vs TC avg)
§112: 27.6% (-12.4% vs TC avg)

Tech Center averages are estimates. Based on career data from 149 resolved cases.
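The headline figures above follow directly from the raw counts. The sketch below is a quick sanity check only: the "vs TC avg" baseline is an estimate in the source data, so the Tech Center average inferred here is an assumption, not a figure stated anywhere in the record.

```python
def allow_rate(granted: int, resolved: int) -> float:
    """Career allowance rate as a percentage of resolved cases."""
    return 100.0 * granted / resolved

# 47 granted out of 149 resolved cases -> ~31.5%, displayed as 32%.
career_rate = allow_rate(47, 149)

# The "vs TC avg" deltas are point differences against the Tech Center
# average estimate; a -38.5 point delta would imply a TC average near
# 70% (inferred from the displayed figures, not stated in the source).
implied_tc_avg = career_rate - (-38.5)
```

The same difference-of-rates reading applies to the per-statute deltas in the table above.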

Office Action

§101, §103, §112
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Continued Examination Under 37 CFR 1.114

A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 19 November 2025 has been entered. The Examiner acknowledges the amendments to claims 1 and 21-23, as well as the cancellation of claims 2 and 26. Claims 1, 4-7, 10-15, 18-19, 21-23, and 25 are pending.

Drawings

The drawings were received on 19 September 2025. These drawings are unacceptable. The replacement Fig. 3 includes structure for a "driver of auxiliary device" that is not supported by the Applicant's disclosure: the disclosure fails to provide sufficient written description support for how a driver of a lower limb auxiliary device is controlled by the claimed method, non-transitory machine-readable medium, and data processing system, and the replacement Fig. 3 appears to imply an operative connection between the sensor 2 and the driver of the auxiliary device. The drawings are also objected to as failing to comply with 37 CFR 1.84(p)(5) because they do not include the following reference sign(s) mentioned in the description: "12" in ¶0053. Corrected drawing sheets in compliance with 37 CFR 1.121(d) are required in reply to the Office action to avoid abandonment of the application. Any amended replacement drawing sheet should include all of the figures appearing on the immediate prior version of the sheet, even if only one figure is being amended.
Each drawing sheet submitted after the filing date of an application must be labeled in the top margin as either "Replacement Sheet" or "New Sheet" pursuant to 37 CFR 1.121(d). If the changes are not accepted by the examiner, the applicant will be notified and informed of any required corrective action in the next Office action. The objection to the drawings will not be held in abeyance.

Claim Objections

Claim(s) 1, 10-11, and 22-23 are objected to because of the following informalities: Claim 1 recites "the absolute motion trajectory to ground passes through a predefined boundary condition in a sensor coordinate system" [lines 12-13], "in response to the trigger boundary condition being satisfied" [lines 13-14], "the triggering boundary condition is satisfied" [line 16], "the predefined boundary condition comprises at least one of a circle, a rectangle, or an ellipse" [lines 18-19], "the triggering boundary condition is satisfied when the absolute motion trajectory to ground during the swing stage passes through the circle, the rectangle or ellipse in the sensor coordinate system" [lines 19-20], and "the recognizing is triggered based on the trigger boundary condition" [line 27]. The Examiner notes that the Applicant appears to use "predefined boundary condition" and "triggering boundary condition" interchangeably. Each instance of "predefined boundary condition" and "triggering boundary condition" is objected to, and the Examiner suggests amending claim 1 to recite only one or the other to maintain consistency. Claims 10-11 and 22-23 are similarly objected to. Appropriate correction is required.

Claim Interpretation

Examiner Notes: currently, NO limitation invokes interpretation under § 112(f).
Other Claim Interpretation Considerations: Claim 1 recites the limitation "wherein the predefined boundary trajectory comprises at least one of a circle, a rectangle, or an ellipse in the sensor coordinate system, and the triggering boundary condition is satisfied when the absolute motion trajectory to ground during the swing stage passes through the circle, the rectangle or ellipse in the sensor coordinate system" [lines 18-21]. The Applicant's Specification discloses that the function defining an elliptical boundary condition is "the following equation (2): Ax_g² + By_g² = 1 … A and B are constants, and x_g and y_g are coordinates of the obtained absolute motion trajectory in the x-axis direction and the y-axis direction" [Applicant's Specification ¶¶0071-0072, Fig. 6]. The Examiner notes that the Applicant has failed to specifically define the constants A and B used in the elliptical boundary condition. For examination purposes, the Examiner has interpreted any arbitrarily defined ellipse of a prior art reference under § 102 or § 103 that is based at least on absolute motion trajectory in the x-direction and y-direction as defining a "boundary condition", since any threshold defined by absolute motion trajectory in the x-direction and y-direction may be considered to be part of an ellipse defined by arbitrary constants using the equation "Ax_g² + By_g² = 1" [similar interpretations are considered to be applicable to circular and rectangular boundary conditions (see Applicant's Specification ¶¶0074-0075)]. Claims 22 and 23 are considered to recite similar subject matter that is similarly interpreted [lines 18-22 of claim 22; lines 20-24 of claim 23].

Claim Rejections - 35 USC § 112

The following is a quotation of the first paragraph of 35 U.S.C.
112(a):

(a) IN GENERAL.—The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor or joint inventor of carrying out the invention.

The following is a quotation of the first paragraph of pre-AIA 35 U.S.C. 112:

The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor of carrying out his invention.

Claim(s) 1, 22-23, and those dependent therefrom are rejected under 35 U.S.C. 112(a) or 35 U.S.C. 112 (pre-AIA), first paragraph, as failing to comply with the written description requirement. The claim(s) contain subject matter which was not described in the specification in such a way as to reasonably convey to one skilled in the relevant art that the inventor or a joint inventor, or for applications subject to pre-AIA 35 U.S.C. 112, the inventor(s), at the time the application was filed, had possession of the claimed invention.
Claim 1 recites the limitation "controlling, according to the motion pattern, a driver of an auxiliary device of the limb to generate a preset auxiliary torque to enable the limb to complete a required preparation during the swing stage, wherein the limb comprises at least one of a lower limb, a lower limb prosthesis, a lower limb orthosis, or a lower limb exoskeleton of a human body" [lines 23-26, emphasis applied]. The Applicant's Specification is considered to fail to provide written description support for the "auxiliary device of the limb" and corresponding "driver", as the Examiner notes that the Specification merely recites the "lower limb auxiliary device" and "driver" as being known in the art:

"Recently, there has been a significant increase in the demand for human power aids or medical rehabilitation training equipment in stroke hemiplegia, impaired motion function of lower limb or disabled persons" (Applicant's Specification ¶0003);

"It has been noted that in different motion patterns, such as upslope, downslope, upstairs or downstairs, the function performed by each joint of the lower limb of human body and the corresponding biomechanical characteristics vary considerably. Therefore, in order to achieve the desired function more accurately, the lower limb auxiliary device firstly should be able to accurately recognize the motion pattern of the user (wearer), and then control a driver to generate a preset auxiliary torque according to the corresponding motion pattern, thereby assisting the wearer to perform the desired action more easily" (Applicant's Specification ¶0004), wherein the Examiner notes that the only recitation of a "driver" or generation of a preset torque is in the Background section of the Specification;

"In order to be able to recognize the motion pattern of the lower limb, prosthesis, orthosis or exoskeleton before the next foot contacting the ground, thereby enabling the lower limb, prosthesis, orthosis or exoskeleton to complete the required preparation during the swing stage, a predetermined triggering condition may be used to trigger the pattern recognition decision of the classifier or the pattern recognizer" (Applicant's Specification ¶0070), wherein the Examiner notes that ¶0070 refers only to motion pattern recognition of a lower limb, prosthesis, orthosis, or exoskeleton, and fails to provide any disclosure regarding generating a preset torque.

The Specification thus fails to disclose how the Applicant's invention structurally incorporates and controls the claimed lower limb auxiliary device or the driver. The Examiner notes that while ¶0070 of the Applicant's Specification is considered to provide written description support for a prosthesis, orthosis, or exoskeleton as being the limb to which motion pattern recognition is applied, a separate driver of an auxiliary device of the limb is considered to be broader in scope than what is recited in ¶0070. Claims 22 and 23 are considered to recite similar subject matter that does not have written description support [lines 24-27 of claim 22; lines 26-29 of claim 23].

Claim Rejections - 35 USC § 101

35 U.S.C.
101 reads as follows:

Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claim(s) 1, 4-7, 10-15, 18-19, 21-23, and 25 are rejected under 35 U.S.C. 101 because the claimed invention is directed to a judicial exception without significantly more. Each claim has been analyzed to determine whether it is directed to any judicial exceptions. Representative claim 23 [representing all independent claims] recites:

A data processing system comprising: a processor, and a memory coupled to the processor to store instructions executable by the processor to perform operations, the operations comprising: collecting, by a sensor configured to measure motion data of a limb extremity end of a subject, motion data of the limb extremity end of the subject during a swing stage of the extremity end in different motion patterns, wherein the motion data comprises an absolute motion trajectory to ground during the swing stage, an absolute velocity to ground, and an absolute acceleration to ground of the limb extremity end during the swing stage in the different motion patterns; performing data processing on the motion data to recognize the motion pattern, the data processing comprising: determining a slope of the ground based on the absolute motion trajectory to ground of the limb extremity end with a trigger boundary condition, classifying a corresponding type of terrain from a plurality of types of terrain comprising flat ground, slope, and stairway based on the slope, determining that the absolute motion trajectory to ground passes through a predefined boundary trajectory in a sensor coordinate system, and triggering, in response to the trigger boundary condition being satisfied, recognizing the motion pattern under the classified type of terrain, wherein the motion pattern comprises at least one
of upslope, downslope, upstairs, downstairs, walking on flat ground, or turning, wherein the triggering boundary condition is satisfied when, in the sensor coordinate system, the absolute motion trajectory to ground during the swing stage passes through the predefined boundary trajectory, wherein the predefined boundary trajectory comprises at least one of a circle, a rectangle, or an ellipse in the sensor coordinate system, and the triggering boundary condition is satisfied when the absolute motion trajectory to ground during the swing stage passes through the circle, the rectangle or the ellipse in the sensor coordinate system, wherein the sensor coordinate system is a two-dimensional coordinate system; and controlling, according to the motion pattern, a driver of an auxiliary device of the limb to generate a preset auxiliary torque to enable the limb to complete a required preparation during the swing stage, wherein the limb comprises at least one of a lower limb, a lower limb prosthesis,
a lower limb orthosis, or a lower limb exoskeleton of a human body; wherein after the recognizing is triggered based on the trigger boundary condition, performing, recognizing the motion pattern under the classified type of terrain by: determining that the motion pattern is upslope in response to that the slope of the ground is less than a first slope threshold and greater than a second slope threshold; determining that the motion pattern is walking on flat ground in response to that the slope of the ground is less than the second slope threshold and greater than a third slope threshold; or determining that the motion pattern is downslope in response to that the slope of the ground is less than the third slope threshold and greater than a fourth slope threshold; wherein the first slope threshold is a threshold value for distinguishing the upstairs motion pattern from the upslope motion pattern, the second slope threshold is a threshold value for distinguishing the upslope motion pattern from the flat ground walking motion pattern, and the third slope threshold is a threshold value for distinguishing the flat ground walking motion pattern from the downslope motion pattern, and the fourth slope threshold is a threshold value for distinguishing the downslope motion pattern from the downstairs motion pattern; wherein the operations further comprise recognizing one or more different motion patterns by combining one or more of the absolute motion trajectory to ground, the absolute velocity to ground, and the absolute acceleration to ground with a rotation angle of a lower limb knee joint or ankle joint, and an electroencephalographic signal (EEG) of the subject. 
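The trigger-and-classify pipeline recited in this claim can be sketched in Python as follows. This is purely an illustrative sketch, not the Applicant's implementation: the Specification leaves the ellipse constants A and B of equation (2) (Ax_g² + By_g² = 1) and the four slope thresholds undefined, so every numeric value below is hypothetical, and the upstairs/downstairs branches are inferred from the recited threshold definitions.

```python
def crosses_ellipse(trajectory, A, B):
    """Trigger check: True once the absolute motion trajectory to ground
    passes through the ellipse A*x_g**2 + B*y_g**2 = 1 in the
    two-dimensional sensor coordinate system, i.e. when consecutive
    samples fall on opposite sides of the boundary."""
    inside = [A * x**2 + B * y**2 < 1.0 for x, y in trajectory]
    return any(a != b for a, b in zip(inside, inside[1:]))


def classify_motion_pattern(slope, t1, t2, t3, t4):
    """Slope-threshold classification per the claim, with ordered
    thresholds t1 > t2 > t3 > t4: t1 separates upstairs from upslope,
    t2 upslope from flat ground, t3 flat ground from downslope, and
    t4 downslope from downstairs."""
    if slope >= t1:
        return "upstairs"
    if slope > t2:
        return "upslope"
    if slope > t3:
        return "walking on flat ground"
    if slope > t4:
        return "downslope"
    return "downstairs"


def recognize(trajectory, slope, A, B, thresholds):
    """Recognition is triggered only when the boundary condition is
    satisfied; otherwise no pattern decision is made."""
    if not crosses_ellipse(trajectory, A, B):
        return None
    return classify_motion_pattern(slope, *thresholds)
```

For example, with A = B = 1 (a unit circle) and hypothetical thresholds (30, 5, -5, -30), a swing-stage trajectory whose samples move from inside to outside the circle on ground sloping 10 units would be classified as upslope, while a trajectory that never crosses the boundary yields no decision.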
(Emphasis added: abstract idea, additional element)

Step 2A Prong 1

Representative claim 23 recites the following abstract ideas, which may be performed in the mind or by hand with the assistance of pen and paper:

"performing data processing on the motion data to recognize the motion pattern, the data processing comprising: determining a slope of the ground based on the absolute motion trajectory to ground of the limb extremity end with a trigger boundary condition" – may be performed by merely observing known or collected data and applying known mathematical processes to the data, and further drawing conclusions therefrom based on known or derived relationships [Applicant's Specification ¶0073]

"classifying a corresponding type of terrain from a plurality of types of terrain comprising flat ground, slope, and stairway based on the slope" – may be performed by merely drawing conclusions from known or collected data using known or derived relationships [Applicant's Specification ¶0047]

"determining that the absolute motion trajectory to ground passes through a predefined boundary trajectory in a sensor coordinate system" – may be performed by merely observing known or collected data and drawing mental conclusions therefrom [Applicant's Specification ¶¶0071-0072, Fig. 6]

"triggering, in response to the trigger boundary condition being satisfied, recognizing the motion pattern under the classified type of terrain, wherein the motion pattern comprises at least one of upslope, downslope, upstairs, downstairs, walking on flat ground, or turning" – may be performed by merely drawing conclusions from known or collected data using known or derived relationships [Applicant's Specification ¶0047]

"wherein the triggering boundary condition is satisfied when, in the sensor coordinate system, the absolute motion trajectory to ground during the swing stage passes through the predefined boundary trajectory, wherein the predefined boundary trajectory comprises at least one of a circle, a rectangle, or an ellipse in the sensor coordinate system, and the triggering boundary condition is satisfied when the absolute motion trajectory to ground during the swing stage passes through the circle, the rectangle or the ellipse in the sensor coordinate system, wherein the sensor coordinate system is a two-dimensional coordinate system" – may be performed by merely observing known or collected data and drawing mental conclusions therefrom; the Examiner notes that the predefined boundary trajectory comprising at least one of a circle, a rectangle, or an ellipse in the sensor coordinate system merely further limits how the mental conclusions may be drawn

"wherein after the recognizing is triggered based on the trigger boundary condition, performing, recognizing the motion pattern under the classified type of terrain by: determining that the motion pattern is upslope in response to that the slope of the ground is less than a first slope threshold and greater than a second slope threshold" – may be performed by merely drawing conclusions from known or collected data using known or derived relationships [Applicant's Specification ¶¶0080-0081]

"wherein after the recognizing is triggered based on the trigger boundary condition, performing, recognizing the motion pattern under the classified type of terrain by: … determining that the motion pattern is walking on flat ground in response to that the slope of the ground is less than the second slope threshold and greater than a third slope threshold" – may be performed by merely drawing conclusions from known or collected data using known or derived relationships [Applicant's Specification ¶¶0080-0081]

"wherein after the recognizing is triggered based on the trigger boundary condition, performing, recognizing the motion pattern under the classified type of terrain by: … determining that the motion pattern is downslope in response to that the slope of the ground is less than the third slope threshold and greater than a fourth slope threshold" – may be performed by merely drawing conclusions from known or collected data using known or derived relationships [Applicant's Specification ¶¶0080-0081]

"wherein the first slope threshold is a threshold value for distinguishing the upstairs motion pattern from the upslope motion pattern, the second slope threshold is a threshold value for distinguishing the upslope motion pattern from the flat ground walking motion pattern, and the third slope threshold is a threshold value for distinguishing the flat ground walking motion pattern from the downslope motion pattern, and the fourth slope threshold is a threshold value for distinguishing the downslope motion pattern from the downstairs motion pattern" – may be performed by merely drawing conclusions from known or collected data using known or derived relationships [Applicant's Specification ¶¶0080-0081]

"wherein the operations further comprise recognizing one or more different motion patterns by combining one or more of the absolute motion trajectory to ground, the absolute velocity to ground, and the absolute acceleration to ground with a rotation angle of a lower limb knee joint or ankle joint, and an electroencephalographic signal (EEG) of the subject" – may be performed by merely drawing conclusions from known or collected data using known or derived relationships [Applicant's Specification ¶0058]; the Examiner notes that the recitation of "a rotation angle of a lower limb knee joint or ankle joint, and an electroencephalographic signal (EEG)" is not a positive recitation of any data gathering step of measuring the rotation angle or EEG

If a claim, under BRI, covers performance of the limitations in the mind but for the mere recitation of extra-solution activity (and otherwise generic computer elements), then the claim falls within the "Mental Processes" grouping of abstract ideas. Accordingly, the claim recites an abstract idea under Step 2A Prong 1 of the Mayo framework as set forth in the 2019 PEG. No limitations are provided that would force the complexity of any of the identified evaluation steps to be non-performable by pen-and-paper practice.

Alternatively or additionally, these steps describe the concept of using implicit mathematical formula(s) [i.e., determining a slope, determination steps using the trigger boundary condition, determination steps using slope thresholds] to derive a conclusion based on input of data, which corresponds to concepts identified as abstract ideas by the courts [Diamond v. Diehr, 450 U.S. 175, 209 U.S.P.Q. 1 (1981); Parker v. Flook, 437 U.S. 584, 19 U.S.P.Q. 193 (1978); and In re Grams, 888 F.2d 835, 12 U.S.P.Q.2d 1824 (Fed. Cir. 1989)]. The concept of the recited limitations identified as mathematical concepts above is not meaningfully different than those mathematical concepts found by the courts to be abstract ideas.

The dependent claims merely include limitations that either further define the abstract idea [e.g.
limitations relating to the data gathered or particular steps which are entirely embodied in the mental process] and amount to no more than generally linking the use of the abstract idea to a particular technological environment or field of use, because they are merely incidental or token additions to the claims that do not alter or affect how the process steps are performed. Thus, these concepts are similar to concepts the courts have held to be abstract ideas: collecting, displaying, and manipulating data [Int. Ventures v. Cap One Financial]; collecting information, analyzing it, and displaying certain results of the collection and analysis [Electric Power Group]; and collection, storage, and recognition of data [Smart Systems Innovations].

Step 2A Prong 2

The judicial exception is not integrated into a practical application. Representative claim 23 recites only additional elements of extra-solution activity [generic computer functions, data gathering] without further sufficient detail that would tie the abstract portions of the claim into a specific practical application (2019 PEG p. 55 – the instant claim, for example, does not tie into a particular machine, a sufficiently particular form of data or signal collection via the claimed extra-solution activity, or a sufficiently particular form of display or computing architecture/structure). Dependent claim(s) 4-7, 10-15, 21, and 25 merely add detail to the abstract portions of the claim but do not otherwise encompass any additional elements which tie the claim(s) into a particular application/integration [the dependent claim(s) recite limitations that merely further limit the abstract idea(s) identified in representative claim 23]. Accordingly, the claim(s) are not integrated into a practical application under Step 2A Prong 2.

Step 2B

The claims do not include additional elements that are sufficient to amount to significantly more than the judicial exception. Independent claims 1 and 22-23 as individual wholes fail to amount to significantly more than the judicial exception at Step 2B. As discussed above with respect to integration of the abstract idea into a practical application, the additional elements of extra-solution activity [i.e., generic computer functions, data gathering] and generic computer elements cannot amount to significantly more than an abstract idea [MPEP § 2106.05(f)] and are further considered to merely implement an abstract idea on a generic computer [MPEP § 2106.05(d)(II) establishes computer-based elements which are considered to be well understood, routine, and conventional when recited at a high level of generality]. For the independent claim portions and dependent claims which provide additional elements of extra-solution data gathering, MPEP § 2106.05(g) establishes that mere data gathering for determining a result does not amount to significantly more. The extra-solution activity of processor steps [acquiring, storing signals, etc.], as presently recited, cannot provide an inventive concept which amounts to significantly more than the recited abstract idea. For the independent claims, as well as the dependent claims merely reciting generic computer elements and functions [processor and memory recited at a high level of generality and functions therein], MPEP § 2106.05(d)(II) establishes computer-based elements which are considered to be well understood, routine, and conventional when recited at a high level of generality. Accordingly, the generic computer elements and functions, as presently limited, cannot provide an inventive concept, since they fall under a generic structure and/or function that does not add a meaningful additional feature to the judicial exception(s) of the claim(s).
Claim(s) 1 and 22-23 recite "collecting, by a sensor configured to measure motion data of a limb extremity end of a subject, motion data of the limb extremity end of the subject during a swing stage of the extremity end in different motion patterns, wherein the motion data comprises an absolute motion trajectory to ground during the swing stage, an absolute velocity to ground, and an absolute acceleration to ground of the limb extremity end during the swing stage in the different motion patterns"; wherein claims 4, 21, and 25 further recite "wherein the sensor further comprises an inertial measurement unit fixed to the limb extremity end" [claims 4, 25] / "an angular velocity or an acceleration in the sensor coordinate system measured by an inertial measurement unit fixed to the limb extremity end" [claim 21]; and wherein claim 18 recites "wherein the sensor further includes an inertial measurement unit-combined depth camera mounted to lower legs, thighs, waists, or head of the subject, and wherein the inertial measurement unit-combined depth camera is configured to, measure one or more of the absolute motion trajectory to ground, the absolute velocity to ground, or the acceleration to ground, or measure topographic characteristics in the different motion patterns".

Such a sensor is considered well-understood, routine, and conventional, as known by at least:

Applicant's disclosure does not particularize the structure of the generically claimed sensor, and recites the sensor at a high level of generality [The sensor 2 may be, for example, an inertial measurement unit, an inertial measurement unit-combined laser displacement sensor, or an inertial measurement unit-combined depth camera (Applicant's Specification ¶0052), emphasis applied]. This lack of disclosure is acceptable under 35 U.S.C. 112(a) since this hardware performs non-specialized functions known by those of ordinary skill in the medical technology arts.
Thus, Applicant's specification essentially admits that this hardware is conventional and performs well understood, routine and conventional activities in the field of motion recognition. In other words, Applicant's specification demonstrates the well-understood, routine, conventional nature of the above-identified additional element because it describes such an additional element in a manner that indicates that the additional element is sufficiently well-known that the specification does not need to describe the particulars of such additional elements to satisfy 35 U.S.C. 112(a) [see Berkheimer memo from April 19, 2018, Page 3, (III)(A)(1), not attached]. Adding hardware that performs "well understood, routine, conventional activit[ies] previously known to the industry" will not make claims patent-eligible [TLI Communications].

Strausser (US-20150045703-A1, previously presented) [Inertial measurement units (IMUs) could be coupled to the leg support 212. An inertial measurement unit is generally composed of an accelerometer and a gyroscope and sometimes a magnetometer as well; in many modern sensors these devices are MEMS (micro-electromechanical systems) that have measurement in all three orthogonal axes on one or more microchips. The behavior of IMUs is well understood in the art (IMUs being used for applications from missile guidance to robotics to cell phones to hobbyist toys); they typically provide measurement of angular orientation with respect to gravity, as well as measurement of angular velocity with respect to earth and linear acceleration, all in three axes (Strausser ¶0025)]

Claim(s) 1 and 22-23 recite "controlling, according to the motion pattern, a driver of an auxiliary device of the limb to generate a preset auxiliary torque to enable the limb to complete a required preparation during the swing stage, wherein the limb comprises at least one of a lower limb, a lower limb prosthesis,
a lower limb orthosis, or a lower limb exoskeleton of a human body". Such a driver of an auxiliary device of the limb is considered well-understood, routine, and conventional, as known by at least:

Seo (US-20200085666-A1, previously presented) [In operation 630, the walking assistance apparatus 500 may control an assistance torque based on the predicted gait phase. For example, the walking assistance apparatus 500 may calculate an assistance torque corresponding to the gait process (Seo ¶0094); The walking assistance apparatus 500 may further perform operation 640. In operation 640, the walking assistance apparatus 600 may control the driver 550 to output the assistance torque (Seo ¶0095)]

Mooney (US-20200016020-A1, previously presented) [a lower limb exoskeleton can measure gait parameters of the user while walking during zero-torque control. Important bio-mechanical parameters can include joint angles, velocities and accelerations, limb accelerations and angular velocities, and the timing of these parameters with respect to periods of the gait cycle (Mooney ¶0074); The exoskeleton controller may also continue to measure the user's gait parameters during the active mode to continuously adjust output of the actuators during use (Mooney ¶0075)]

Thorsteinsson (US-20080039756-A1, previously presented) [the knee actuator is an active device that applies a torque to the knee joint to cause a desired flexion of the orthotic frame at the knee joint (Thorsteinsson ¶0018); Information gathered by the sensor set is used for monitoring purposes and for control of active components of the KAFO.
The gathered information may be employed to determine or recognize certain aspects or phases of the gait cycle, and to drive active components of the mechanical orthotic frame to provide assistance at relevant times during the gait cycle (Thorsteinsson ¶0021)] Claim 19 recites “wherein, the sensor further comprises an infrared capture system mounted in an ambient environment of the subject, and an infrared capture marker point is mounted at the limb extremity end of the subject”. Such an infrared capture system and infrared capture marker point are considered well-understood, routine, and conventional, as known by at least: Park (US-20040119716-A1, previously presented) [According to kinds of sensors attached to an actor's joints, conventional motion capture techniques are divided into a magnetic type measuring positions using a variation amount of a magnetic field, a mechanical type directly measuring a bending of joints using a mechanical method, an optical type using images of passive (infrared rays) or active (LED, color) markers obtained by a camera, and an optical fiber type using a variation amount in transmission of light according to a bend degree of joints (Park ¶0005)] Examiner’s Note Regarding Particular Treatment or Prophylaxis: Claim(s) 1 and 22-23 recite subject matter regarding “controlling, according to the motion pattern, a driver of an auxiliary device of the limb to generate a preset auxiliary torque to enable the limb to complete a required preparation during the swing stage, wherein the limb comprises at least one of a lower limb, a lower limb prosthesis,
a lower limb orthosis, or a lower limb exoskeleton of a human body”, which the Examiner notes is not considered to be a particular treatment or prophylaxis, as none of the identified claims positively recite or include language that is considered to be a particular treatment or prophylaxis as an additional element to integrate the judicial exception into a practical application or allow the identified claims to amount to significantly more than the judicial exception [the Examiner notes that the identified limitation merely “generates” a value and does not positively recite using the generated value to perform any particular treatment or prophylaxis, as merely reciting that the generated value is “to enable…” a user to perform an action is not a positive recitation of implementing the generated value to perform any particular treatment or prophylaxis] [MPEP § 2106.04(d)(2)]. Accordingly, the claim(s) as a whole fail to amount to significantly more than the judicial exception under Step 2B. Claim Rejections - 35 USC § 103 The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made. The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows: 1. Determining the scope and contents of the prior art. 2. Ascertaining the differences between the prior art and the claims at issue. 3.
Resolving the level of ordinary skill in the pertinent art. 4. Considering objective evidence present in the application indicating obviousness or nonobviousness. This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention. Claim(s) 1, 7, 10-11, 21-23, and 25 is/are rejected under 35 U.S.C. 103 as being unpatentable over Herr (US-20100179668-A1) in view of Yuen (US-20120084054-A1, previously presented). Regarding claim 1, Herr teaches A method for recognizing a motion pattern of a limb, comprising: collecting, by a sensor, motion data of a limb extremity end of a subject during a swing stage of the extremity end in different motion patterns, wherein the motion data comprises an absolute motion trajectory to ground during the swing stage, an absolute velocity to ground, and an absolute acceleration to ground of the limb extremity end during the swing stage in the different motion patterns [The inertial measurement unit 204 includes a three-axis rate gyro for measuring angular rate and a three-axis accelerometer for measuring acceleration. Placing the inertial measurement unit on the lower leg member 220 collocates the measurement of angular rate and acceleration for all three axes of the lower leg member 220. 
The inertial measurement unit 204 provides a six-degree-of-freedom estimate of the lower leg member 220 pose, inertial (world frame referenced) orientation and ankle-joint 200 (center of rotation of the ankle-foot) location (Herr ¶0177); The inertial measurement unit 204 is used to calculate the orientation, .sub.ankle.sup.wO, position .sub.ankle.sup.w p, and velocity, .sub.ankle.sup.w v, of the lower-extremity prosthetic apparatus in a ground-referenced world frame (Herr ¶0179), wherein the position, velocity, and acceleration of the extremity end being measured relative to a ground-referenced world frame is considered to read on “absolute” motion trajectory, velocity, and acceleration to ground]; performing data processing on the motion data to recognize the motion pattern, the data processing comprising: determining a slope of the ground based on the absolute motion trajectory to ground of the limb extremity end with a trigger boundary condition [The stair ramp discriminator provides a real-time prediction of the terrain slope angle, .PHI.{circumflex over (()}t). If the discriminator detects a step, including level-ground, then .PHI.{circumflex over (()}t)=0. Otherwise, the slope angle is assumed (Herr ¶0219, see EQN. 31 following ¶0219 not presently reproduced)], classifying a corresponding type of terrain from a plurality of types of terrain comprising flat ground, slope, and stairway based on the slope [FIG. 6A shows the shank trajectories that correspond to five different activities, with additional ramp trajectories to distinguish between steep and shallow ramps. The system can use this information to figure out what activity is being performed by mapping the tracked trajectory onto a set of activities (Herr ¶0018); The trajectory of the ankle joint 600 in the y-z plane (referring to FIG. 6A) could be used in an alternative embodiment of the invention for stair-ramp discrimination (Herr ¶0218, Fig. 6A), wherein as depicted in Fig. 
6A, the discriminated types of terrain comprise flat ground, ramp (slope), and stairway], determining that the absolute motion trajectory to ground passes through a predefined boundary trajectory in a sensor coordinate system, and triggering, in response to the trigger boundary condition being satisfied, recognizing the motion pattern under the classified type of terrain, wherein the motion pattern comprises at least one of upslope, downslope, upstairs, downstairs, walking on flat ground, or turning, wherein the triggering boundary condition is satisfied when, in the sensor coordinate system, the absolute motion trajectory to ground during the swing stage passes through the predefined boundary trajectory [wherein the disclosure of Herr ¶¶0018, 0218 regarding the use of ankle trajectory (as depicted in Fig. 6A) to differentiate between each of upslope (up 5° or 10° ramp), downslope (down 5° or 10° ramp), upstairs, downstairs, and flat (level) ground is considered to define satisfying any kind of “trigger boundary condition” of a predefined boundary trajectory as the trajectory itself is used to recognize the motion pattern], wherein the predefined boundary trajectory comprises at least one of a circle, a rectangle, or an ellipse in the sensor coordinate system, and the triggering boundary condition is satisfied when the absolute motion trajectory to ground during the swing stage passes through the circle, the rectangle or the ellipse in the sensor coordinate system, wherein, the sensor coordinate system is a two-dimensional coordinate system [wherein the disclosure of Herr ¶¶0018, 0218 regarding the use of ankle trajectory (as depicted in Fig. 6A) to recognize the motion pattern may be considered to pass through any arbitrarily defined circle, rectangle, or ellipse (see Claim Interpretations above); and wherein the ankle trajectory is assessed two-dimensionally along the y-z plane (depicted in Fig. 
6A), the predefined boundary condition is considered to be passed in the sensor coordinate system]; and controlling, according to the motion pattern, a driver of an auxiliary device of the limb to generate a preset auxiliary torque to enable the limb to complete a required preparation during the swing stage, wherein the limb comprises at least one of a lower limb, a lower limb prosthesis, a lower limb orthosis, or a lower limb exoskeleton of a human body [In one embodiment of the invention, the discriminator methodology described above is used to control at least one of joint impedance, position or torque of a lower extremity prosthetic, orthotic, or exoskeleton apparatus worn by a wearer (e.g., the apparatus 1700 of FIG. 17A). The method involves estimating a velocity vector attack angle of the ankle joint of the apparatus throughout a late swing (e.g., the y-axis values of the data in FIG. 6C) (Herr ¶0220)]; wherein the method further comprises recognizing one or more different motion patterns by combining one or more of the absolute motion trajectory to ground, the absolute velocity to ground, and the absolute acceleration to ground with a rotation angle of a lower limb knee joint or ankle joint [Herr ¶¶0177, 0179, wherein the use of an IMU defined by an accelerometer and a gyroscope is considered to read on the combination of absolute motion trajectory/velocity/acceleration to ground with rotation angle of the ankle joint], and an electroencephalographic signal (EEG) of the subject. However, while Herr discloses and depicts that ankle trajectory is indicative of different motion patterns on certain types of terrain [Herr ¶¶0018, 0218, Fig. 
6A], Herr fails to explicitly disclose wherein after the recognizing is triggered based on the trigger boundary condition, performing recognizing the motion pattern under the classified type of terrain by: determining that the motion pattern is upslope in response to that the slope of the ground is less than a first slope threshold and greater than a second slope threshold; determining that the motion pattern is walking on flat ground in response to that the slope of the ground is less than the second slope threshold and greater than a third slope threshold; or determining that the motion pattern is downslope in response to that the slope of the ground is less than the third slope threshold and greater than a fourth slope threshold; wherein the first slope threshold is a threshold value for distinguishing the upstairs motion pattern from the upslope motion pattern, the second slope threshold is a threshold value for distinguishing the upslope motion pattern from the flat ground walking motion pattern, and the third slope threshold is a threshold value for distinguishing the flat ground walking motion pattern from the downslope motion pattern, and the fourth slope threshold is a threshold value for distinguishing the downslope motion pattern from the downstairs motion pattern. Yuen discloses systems for monitoring and classifying user motion [Yuen Abstract], wherein Yuen discloses determining that a motion pattern is one of upslope, flat ground, or downslope based on the determination that the slope of the ground being within certain thresholds [Notably, in one embodiment, if .DELTA.H-S exceeds a predetermined threshold, the processing circuitry may determine that the user is traversing stairs, in which case, specific stair estimation algorithms may be employed. With reference to FIG. 4R, the processing circuitry may employ an embodiment in which upstairs walking and running are given specific calorie burn algorithms based on .DELTA.H-S. 
Downstairs logic may be incorporated therein. Likewise, specific equations and/or logic may be employed for different grade hills, both upwards and downwards in accordance with the preceding linear equations, or alternate nonlinear equations and means (e.g., lookup tables, polynomials, transcendentals, interpolations, neural nets, maximum likelihood estimates, expected value estimates, etc.) (Yuen ¶0124), wherein the Examiner notes that ¶0124 of Yuen discloses applying the thresholds as disclosed to downstairs logic, wherein since Yuen describes thresholds for differentiating between flat ground, upstairs, and upslope, the thresholds as applied to downstairs logic are considered to comprise thresholds for differentiating between flat ground, downstairs, and downslope; As intimated above, data which is representative of the altitude and/or changes in altitude and data which is representative of the motion of the user may also be used to determine and/or classify other activity-related metrics such as, for example, user steps, distance and pace (FIG. 
4E)...Notably, other activity-related metrics may be determined by the processing circuitry, including, for example, (i) in the context of running/walking on level or substantially level ground, number of steps, also broken down as walking or running, distance traveled and/or pace (ii) in the context of running/walking on stairs, hills or ground having a grade of greater than about 3%, number of stair and/or hill steps, which may be categorized or broken down, correlated or organized/arranged according to, for example, the speed, pace and/or activity state of the user (for example, as walking, jogging or running), number of flights of stairs, ascent/descent distance on stairs and/or hills, pace, ascent/descent on elevators and/or escalators, surface grade, and/or number of calories expended by walking/jogging/running on stairs and/or hills as well as quantify/compare the additional calories burnt from stairs/hills over level ground (Yuen ¶0126); In one embodiment, the processing circuitry may evaluate the output of the altitude sensor to determine, calculate and/or estimate the activity state of the user by evaluating the altitude sensor data based on algorithms or processes based on the flowchart of FIG. 4K. With reference to FIG. 4K, in one embodiment, the processing circuitry determines the type of activity by evaluating the change in altitude of the user on a change in height or altitude per step basis (".DELTA.H-S") or the use of an elevator by a sustained rate of height change per time period (for example, per second) (".DELTA.H-t") in the absence of steps. The change in height or altitude per step and change in height or altitude per second are evaluated against a plurality of thresholds and/or ranges to determine whether the user is, for example, moving (for example, running or walking) on level ground, on an escalator or in an elevator, traversing stairs and/or traversing a hill or the like.
In one embodiment, Threshold 1, Threshold 2, Threshold 3 and Threshold 4 have the relationship Threshold 1>Threshold 2>Threshold 3>Threshold 4 wherein the process seeks to detect and identify the causes of increases in user altitude. In other embodiments, the flow may be modified to detect and classify decreases or both increases and decreases in user altitude. Thus, in these embodiments, the processing circuitry employs data from the motion sensor to assess the user state based on data from the altitude sensor (Yuen ¶0129, Figures 4K-L), wherein the Examiner notes that traversing a hill may be considered to read on the claimed motion pattern of upslope, wherein in light of the Examiner’s interpretation of ¶0124 of Yuen applying the disclosed thresholds to downstairs logic, traversing down a hill may be considered to read on the claimed motion pattern of downslope; the inventions may have functionality that determines the elevation change and/or slope between two points through, for instance, the use of GPS with an altimeter (Yuen ¶0164)], wherein the thresholds distinguish between upstairs and upslope, upslope and flat ground, flat ground and downslope, and downslope and downstairs [see Yuen ¶0124, ¶0126, ¶0129, Figure 4K]. 
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the method of Herr to employ wherein after the recognizing is triggered based on the trigger boundary condition, performing recognizing the motion pattern under the classified type of terrain by: determining that the motion pattern is upslope in response to that the slope of the ground is less than a first slope threshold and greater than a second slope threshold; determining that the motion pattern is walking on flat ground in response to that the slope of the ground is less than the second slope threshold and greater than a third slope threshold; or determining that the motion pattern is downslope in response to that the slope of the ground is less than the third slope threshold and greater than a fourth slope threshold; wherein the first slope threshold is a threshold value for distinguishing the upstairs motion pattern from the upslope motion pattern, the second slope threshold is a threshold value for distinguishing the upslope motion pattern from the flat ground walking motion pattern, and the third slope threshold is a threshold value for distinguishing the flat ground walking motion pattern from the downslope motion pattern, and the fourth slope threshold is a threshold value for distinguishing the downslope motion pattern from the downstairs motion pattern, so as to allow for distinct differentiation between activity states of the user based on identified thresholds of changes in height per step, allowing for the determination of motion patterns of the user [Yuen ¶0129]. Regarding claim 7, Herr in view of Yuen teaches The method according to claim 1, wherein the collecting comprises: extracting the absolute motion trajectory to ground of the limb extremity end in a sagittal plane [Herr ¶0218, Fig. 
6A, wherein the y-z plane is considered to be a sagittal plane of the subject], and deriving terrain slopes corresponding to the different motion patterns from the absolute motion trajectory to ground in the sagittal plane to recognize the motion pattern being performed [Herr ¶¶0218-0219]. Regarding claim 10, Herr in view of Yuen teaches The method according to claim 1, wherein, the trigger boundary condition comprises one or more of a time threshold trigger, a displacement threshold to ground trigger in a forward direction or a direction vertical to ground [wherein Herr ¶0218 disclosing the use of the motion trajectory to recognize the motion pattern is considered to read on displacement to ground in a vertical and horizontal direction], or an acceleration threshold or angular velocity threshold trigger in the sensor coordinate system. Regarding claim 11, Herr in view of Yuen teaches The method according to claim 1, wherein, the trigger boundary condition comprises: one or more of the angular velocity or acceleration signals of the inertial measurement unit in the sensor coordinate system satisfy a preset trigger condition [wherein the motion trajectory being defined by the IMU measurements of acceleration and angular velocity (Herr ¶0177) is considered to read on the claimed limitation]. Regarding claim 21, Herr in view of Yuen teaches The method according to claim 1, further comprising: recognizing one or more different motion patterns by combining one or more of the absolute motion trajectory to ground, the absolute velocity to ground, and the absolute acceleration to ground with an angular velocity or an acceleration in the sensor coordinate system measured by an inertial measurement unit fixed at the limb extremity end [Herr ¶¶0177, 0179].
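The "trigger boundary condition" addressed for claims 10-11, like the circle/rectangle/ellipse boundary trajectory recited in claim 1, reduces to a point-in-region test on the sampled 2-D swing-stage trajectory in the sensor coordinate system. A minimal sketch assuming a circular boundary; the function name, coordinates, center, and radius are hypothetical values chosen for illustration, not taken from the application or Herr:

```python
import math

def trajectory_crosses_circle(trajectory, center, radius):
    """Return True if any sampled point of the swing-stage trajectory falls
    within the predefined circular boundary in the 2-D sensor frame."""
    cx, cy = center
    return any(math.hypot(x - cx, y - cy) <= radius for x, y in trajectory)

# Hypothetical ankle-end trajectory sampled in the sagittal (y-z) plane, in meters.
swing = [(0.0, 0.05), (0.2, 0.12), (0.4, 0.15), (0.6, 0.10)]
triggered = trajectory_crosses_circle(swing, center=(0.4, 0.15), radius=0.03)
```

When `triggered` is True, recognition of the motion pattern under the classified terrain type would proceed; a rectangle or ellipse boundary would substitute a corresponding containment test.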
Regarding claim 22, Herr teaches A non-transitory machine-readable medium storing instructions executable by a processor to perform operations, the operations comprising: collecting, by a sensor configured to measure motion data of a limb extremity end of a subject, motion data of the limb extremity end of the subject during a swing stage of the extremity end in different motion patterns, wherein the motion data comprises an absolute motion trajectory to ground during the swing stage, an absolute velocity to ground, and an absolute acceleration to ground of the limb extremity end during the swing stage in the different motion patterns [The inertial measurement unit 204 includes a three-axis rate gyro for measuring angular rate and a three-axis accelerometer for measuring acceleration. Placing the inertial measurement unit on the lower leg member 220 collocates the measurement of angular rate and acceleration for all three axes of the lower leg member 220. The inertial measurement unit 204 provides a six-degree-of-freedom estimate of the lower leg member 220 pose, inertial (world frame referenced) orientation and ankle-joint 200 (center of rotation of the ankle-foot) location (Herr ¶0177); The inertial measurement unit 204 is used to calculate the orientation, .sub.ankle.sup.wO, position .sub.ankle.sup.w p, and velocity, .sub.ankle.sup.w v, of the lower-extremity prosthetic apparatus in a ground-referenced world frame (Herr ¶0179), wherein the position, velocity, and acceleration of the extremity end being measured relative to a ground-referenced world frame is considered to read on “absolute” motion trajectory, velocity, and acceleration to ground]; performing data processing on the motion data to recognize the motion pattern, the data processing comprising: determining a slope of the ground based on the absolute motion trajectory to ground of the limb extremity end with a trigger boundary condition [The stair ramp discriminator provides a real-time prediction of 
the terrain slope angle, .PHI.{circumflex over (()}t). If the discriminator detects a step, including level-ground, then .PHI.{circumflex over (()}t)=0. Otherwise, the slope angle is assumed (Herr ¶0219, see EQN. 31 following ¶0219 not presently reproduced)], classifying a corresponding type of terrain from a plurality of types of terrain comprising flat ground, slope, and stairway based on the slope [FIG. 6A shows the shank trajectories that correspond to five different activities, with additional ramp trajectories to distinguish between steep and shallow ramps. The system can use this information to figure out what activity is being performed by mapping the tracked trajectory onto a set of activities (Herr ¶0018); The trajectory of the ankle joint 600 in the y-z plane (referring to FIG. 6A) could be used in an alternative embodiment of the invention for stair-ramp discrimination (Herr ¶0218, Fig. 6A), wherein as depicted in Fig. 6A, the discriminated types of terrain comprise flat ground, ramp (slope), and stairway], determining that the absolute motion trajectory to ground passes through a predefined boundary trajectory in a sensor coordinate system, and triggering, in response to the trigger boundary condition being satisfied, recognizing the motion pattern under the classified type of terrain, wherein the motion pattern comprises at least one of upslope, downslope, upstairs, downstairs, walking on flat ground, or turning, wherein the triggering boundary condition is satisfied when, in the sensor coordinate system, the absolute motion trajectory during the swing stage to ground passes through the predefined boundary trajectory [wherein the disclosure of Herr ¶¶0018, 0218 regarding the use of ankle trajectory (as depicted in Fig. 
6A) to differentiate between each of upslope (up 5° or 10° ramp), downslope (down 5° or 10° ramp), upstairs, downstairs, and flat (level) ground is considered to define satisfying any kind of “trigger boundary condition” of a predefined boundary trajectory as the trajectory itself is used to recognize the motion pattern], wherein the predefined boundary trajectory comprises at least one of a circle, a rectangle, or an ellipse in the sensor coordinate system, and the triggering boundary condition is satisfied when the absolute motion trajectory to ground during the swing stage passes through the circle, the rectangle or the ellipse in the sensor coordinate system, wherein the sensor coordinate system is a two-dimensional coordinate system [wherein the disclosure of Herr ¶¶0018, 0218 regarding the use of ankle trajectory (as depicted in Fig. 6A) to recognize the motion pattern may be considered to pass through any arbitrarily defined circle, rectangle, or ellipse (see Claim Interpretations above); and wherein the ankle trajectory is assessed two-dimensionally along the y-z plane (depicted in Fig. 6A), the predefined boundary condition is considered to be passed in the sensor coordinate system]; and controlling, according to the motion pattern, a driver of an auxiliary device of the limb to generate a preset auxiliary torque to enable the limb to complete a required preparation during the swing stage, wherein the limb comprises at least one of a lower limb, a lower limb prosthesis, a lower limb orthosis, or a lower limb exoskeleton of a human body [In one embodiment of the invention, the discriminator methodology described above is used to control at least one of joint impedance, position or torque of a lower extremity prosthetic, orthotic, or exoskeleton apparatus worn by a wearer (e.g., the apparatus 1700 of FIG. 17A).
The method involves estimating a velocity vector attack angle of the ankle joint of the apparatus throughout a late swing (e.g., the y-axis values of the data in FIG. 6C) (Herr ¶0220)]; wherein the operations further comprise recognizing one or more different motion patterns by combining one or more of the absolute motion trajectory to ground, the absolute velocity to ground, and the absolute acceleration to ground with a rotation angle of a lower limb knee joint or ankle joint [Herr ¶¶0177, 0179, wherein the use of an IMU defined by an accelerometer and a gyroscope is considered to read on the combination of absolute motion trajectory/velocity/acceleration to ground with rotation angle of the ankle joint], and an electroencephalographic signal (EEG) of the subject. However, while Herr discloses and depicts that ankle trajectory is indicative of different motion patterns on certain types of terrain [Herr ¶¶0018, 0218, Fig. 6A], Herr fails to explicitly disclose wherein after the recognizing is triggered based on the trigger boundary condition, performing recognizing the motion pattern under the classified type of terrain by: determining that the motion pattern is upslope in response to that the slope of the ground is less than a first slope threshold and greater than a second slope threshold; determining that the motion pattern is walking on flat ground in response to that the slope of the ground is less than the second slope threshold and greater than a third slope threshold; or determining that the motion pattern is downslope in response to that the slope of the ground is less than the third slope threshold and greater than a fourth slope threshold; wherein the first slope threshold is a threshold value for distinguishing the upstairs motion pattern from the upslope motion pattern, the second slope threshold is a threshold value for distinguishing the upslope motion pattern from the flat ground walking motion pattern, and the third slope threshold is a threshold 
value for distinguishing the flat ground walking motion pattern from the downslope motion pattern, and the fourth slope threshold is a threshold value for distinguishing the downslope motion pattern from the downstairs motion pattern. Yuen discloses systems for monitoring and classifying user motion [Yuen Abstract], wherein Yuen discloses determining that a motion pattern is one of upslope, flat ground, or downslope based on the determination that the slope of the ground being within certain thresholds [Notably, in one embodiment, if .DELTA.H-S exceeds a predetermined threshold, the processing circuitry may determine that the user is traversing stairs, in which case, specific stair estimation algorithms may be employed. With reference to FIG. 4R, the processing circuitry may employ an embodiment in which upstairs walking and running are given specific calorie burn algorithms based on .DELTA.H-S. Downstairs logic may be incorporated therein. Likewise, specific equations and/or logic may be employed for different grade hills, both upwards and downwards in accordance with the preceding linear equations, or alternate nonlinear equations and means (e.g., lookup tables, polynomials, transcendentals, interpolations, neural nets, maximum likelihood estimates, expected value estimates, etc.) (Yuen ¶0124), wherein the Examiner notes that ¶0124 of Yuen discloses applying the thresholds as disclosed to downstairs logic, wherein since Yuen describes thresholds for differentiating between flat ground, upstairs, and upslope, the thresholds as applied to downstairs logic are considered to comprise thresholds for differentiating between flat ground, downstairs, and downslope; As intimated above, data which is representative of the altitude and/or changes in altitude and data which is representative of the motion of the user may also be used to determine and/or classify other activity-related metrics such as, for example, user steps, distance and pace (FIG. 
4E)...Notably, other activity-related metrics may be determined by the processing circuitry, including, for example, (i) in the context of running/walking on level or substantially level ground, number of steps, also broken down as walking or running, distance traveled and/or pace (ii) in the context of running/walking on stairs, hills or ground having a grade of greater than about 3%, number of stair and/or hill steps, which may be categorized or broken down, correlated or organized/arranged according to, for example, the speed, pace and/or activity state of the user (for example, as walking, jogging or running), number of flights of stairs, ascent/descent distance on stairs and/or hills, pace, ascent/descent on elevators and/or escalators, surface grade, and/or number of calories expended by walking/jogging/running on stairs and/or hills as well as quantify/compare the additional calories burnt from stairs/hills over level ground (Yuen ¶0126); In one embodiment, the processing circuitry may evaluate the output of the altitude sensor to determine, calculate and/or estimate the activity state of the user by evaluating the altitude sensor data based on algorithms or processes based on the flowchart of FIG. 4K. With reference to FIG. 4K, in one embodiment, the processing circuitry determines the type of activity by evaluating the change in altitude of the user on a change in height or altitude per step basis (".DELTA.H-S") or the use of an elevator by a sustained rate of height change per time period (for example, per second) (".DELTA.H-t") in the absence of steps. The change in height or altitude per step and change in height or altitude per second are evaluated against a plurality of thresholds and/or ranges to determine whether the user is, for example, moving (for example, running or walking) on level ground, on an escalator or in an elevator, traversing stairs and/or traversing a hill or the like.
In one embodiment, Threshold 1, Threshold 2, Threshold 3 and Threshold 4 have the relationship Threshold 1>Threshold 2>Threshold 3>Threshold 4 wherein the process seeks to detect and identify the causes of increases in user altitude. In other embodiments, the flow may be modified to detect and classify decreases or both increases and decreases in user altitude. Thus, in these embodiments, the processing circuitry employs data from the motion sensor to assess the user state based on data from the altitude sensor (Yuen ¶0129, Figures 4K-L), wherein the Examiner notes that traversing a hill may be considered to read on the claimed motion pattern of upslope, wherein in light of the Examiner’s interpretation of ¶0124 of Yuen applying the disclosed thresholds to downstairs logic, traversing down a hill may be considered to read on the claimed motion pattern of downslope; the inventions may have functionality that determines the elevation change and/or slope between two points through, for instance, the use of GPS with an altimeter (Yuen ¶0164)], wherein the thresholds distinguish between upstairs and upslope, upslope and flat ground, flat ground and downslope, and downslope and downstairs [see Yuen ¶0124, ¶0126, ¶0129, Figure 4K]. 
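As an illustrative annotation only, the threshold cascade Yuen describes (Threshold 1 > Threshold 2 > Threshold 3 > Threshold 4 applied to the per-step altitude change ΔH-S, ¶0129, Fig. 4K) can be sketched as follows. The numeric threshold values and class labels below are assumptions, not values drawn from Yuen or from the record; Yuen requires only the ordering relationship among the thresholds.

```python
# Illustrative sketch of Yuen's per-step altitude-change cascade (¶0129, Fig. 4K).
# Numeric thresholds are hypothetical; Yuen requires only T1 > T2 > T3 > T4.
T1, T2, T3, T4 = 0.18, 0.08, 0.03, 0.01  # height change per step, meters (assumed)

def classify_step(delta_h_step):
    """Map a change in height per step (ΔH-S) to an activity state."""
    if delta_h_step > T1:
        return "stairs"            # large rise per step -> stair traversal
    if delta_h_step > T2:
        return "steep_hill"
    if delta_h_step > T3:
        return "shallow_hill"
    if delta_h_step > T4:
        return "level_ground"      # small but nonzero per-step change
    return "no_elevation_change"
```

Applying the same cascade to negative altitude changes would yield the downstairs/downslope logic that the Examiner reads into ¶0124.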
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the non-transitory machine-readable medium of Herr to employ wherein after the recognizing is triggered based on the trigger boundary condition, performing recognizing the motion pattern under the classified type of terrain by: determining that the motion pattern is upslope in response to that the slope of the ground is less than a first slope threshold and greater than a second slope threshold; determining that the motion pattern is walking on flat ground in response to that the slope of the ground is less than the second slope threshold and greater than a third slope threshold; or determining that the motion pattern is downslope in response to that the slope of the ground is less than the third slope threshold and greater than a fourth slope threshold; wherein the first slope threshold is a threshold value for distinguishing the upstairs motion pattern from the upslope motion pattern, the second slope threshold is a threshold value for distinguishing the upslope motion pattern from the flat ground walking motion pattern, and the third slope threshold is a threshold value for distinguishing the flat ground walking motion pattern from the downslope motion pattern, and the fourth slope threshold is a threshold value for distinguishing the downslope motion pattern from the downstairs motion pattern, so as to allow for distinct differentiation between activity states of the user based on identified thresholds of changes in height per step, allowing for the determination of motion patterns of the user [Yuen ¶0129]. Regarding claim 23, Herr teaches A data processing system comprising: a processor [the at least one pattern recognition technique is performed using a processor coupled to at least one sensor and one actuator coupled to a lower-extremity prosthetic, orthotic, or exoskeleton apparatus worn by a wearer. 
In some embodiments, the at least one pattern recognition technique is selected from the group techniques consisting of Bayesian pattern classification, neural nets, fuzzy logic or hierarchical temporal memory (Herr ¶0094)], and a memory coupled to the processor to store instructions executable by the processor to perform operations [wherein based on ¶0094, a processor capable of performing operations is considered to be coupled to any type of memory], the operations comprising: collecting, by a sensor configured to measure motion data of a limb extremity end of a subject, motion data of the limb extremity end of the subject during a swing stage of the extremity end in different motion patterns, wherein the motion data comprises an absolute motion trajectory to ground during the swing stage, an absolute velocity to ground, and an absolute acceleration to ground of the limb extremity end during the swing stage in the different motion patterns [The inertial measurement unit 204 includes a three-axis rate gyro for measuring angular rate and a three-axis accelerometer for measuring acceleration. Placing the inertial measurement unit on the lower leg member 220 collocates the measurement of angular rate and acceleration for all three axes of the lower leg member 220. 
The inertial measurement unit 204 provides a six-degree-of-freedom estimate of the lower leg member 220 pose, inertial (world frame referenced) orientation and ankle-joint 200 (center of rotation of the ankle-foot) location (Herr ¶0177); The inertial measurement unit 204 is used to calculate the orientation, ^w O_ankle, position, ^w p_ankle, and velocity, ^w v_ankle, of the lower-extremity prosthetic apparatus in a ground-referenced world frame (Herr ¶0179), wherein the position, velocity, and acceleration of the extremity end being measured relative to a ground-referenced world frame is considered to read on “absolute” motion trajectory, velocity, and acceleration to ground]; performing data processing on the motion data to recognize the motion pattern, the data processing comprising: determining a slope of the ground based on the absolute motion trajectory to ground of the limb extremity end with a trigger boundary condition [The stair ramp discriminator provides a real-time prediction of the terrain slope angle, Φ̂(t). If the discriminator detects a step, including level-ground, then Φ̂(t) = 0. Otherwise, the slope angle is assumed (Herr ¶0219, see EQN. 31 following ¶0219 not presently reproduced)], classifying a corresponding type of terrain from a plurality of types of terrain comprising flat ground, slope, and stairway based on the slope [FIG. 6A shows the shank trajectories that correspond to five different activities, with additional ramp trajectories to distinguish between steep and shallow ramps. The system can use this information to figure out what activity is being performed by mapping the tracked trajectory onto a set of activities (Herr ¶0018); The trajectory of the ankle joint 600 in the y-z plane (referring to FIG. 6A) could be used in an alternative embodiment of the invention for stair-ramp discrimination (Herr ¶0218, Fig. 6A), wherein as depicted in Fig. 
6A, the discriminated types of terrain comprise flat ground, ramp (slope), and stairway], determining that the absolute motion trajectory to ground passes through a predefined boundary trajectory in a sensor coordinate system, and triggering, in response to the trigger boundary condition being satisfied, recognizing the motion pattern under the classified type of terrain, wherein the motion pattern comprises at least one of upslope, downslope, upstairs, downstairs, walking on flat ground, or turning, wherein the triggering boundary condition is satisfied when, in the sensor coordinate system, the absolute motion trajectory to ground during the swing stage passes through the predefined boundary trajectory [wherein the disclosure of Herr ¶¶0018, 0218 regarding the use of ankle trajectory (as depicted in Fig. 6A) to differentiate between each of upslope (up 5° or 10° ramp), downslope (down 5° or 10° ramp), upstairs, downstairs, and flat (level) ground is considered to define satisfying any kind of “trigger boundary condition” of a predefined boundary trajectory as the trajectory itself is used to recognize the motion pattern], wherein the predefined boundary trajectory comprises at least one of a circle, a rectangle, or an ellipse in the sensor coordinate system, and the triggering boundary condition is satisfied when the absolute motion trajectory to ground during the swing stage passes through the circle, the rectangle or the ellipse in the sensor coordinate system, where the sensor coordinate system is a two-dimensional coordinate system [wherein the disclosure of Herr ¶¶0018, 0218 regarding the use of ankle trajectory (as depicted in Fig. 6A) to recognize the motion pattern may be considered to pass through any arbitrarily defined circle, rectangle, or ellipse (see Claim Interpretations above); and wherein the ankle trajectory is assessed two-dimensionally along the y-z plane (depicted in Fig. 
6A), the predefined boundary condition is considered to be passed in the sensor coordinate system]; and controlling, according to the motion pattern, a driver of an auxiliary device of the limb to generate a preset auxiliary torque to enable the limb to complete a required preparation during the swing stage, wherein the limb comprises at least one of a lower limb, a lower limb prosthesis, a lower limb orthosis, or a lower limb exoskeleton of a human body [In one embodiment of the invention, the discriminator methodology described above is used to control at least one of joint impedance, position or torque of a lower extremity prosthetic, orthotic, or exoskeleton apparatus worn by a wearer (e.g., the apparatus 1700 of FIG. 17A). The method involves estimating a velocity vector attack angle of the ankle joint of the apparatus throughout a late swing (e.g., the y-axis values of the data in FIG. 6C) (Herr ¶0220)]; wherein the operations further comprise recognizing one or more different motion patterns by combining one or more of the absolute motion trajectory to ground, the absolute velocity to ground, and the absolute acceleration to ground with a rotation angle of a lower limb knee joint or ankle joint [Herr ¶¶0177, 0179, wherein the use of an IMU defined by an accelerometer and a gyroscope is considered to read on the combination of absolute motion trajectory/velocity/acceleration to ground with rotation angle of the ankle joint], and an electroencephalographic signal (EEG) of the subject. However, while Herr discloses and depicts that ankle trajectory is indicative of different motion patterns on certain types of terrain [Herr ¶¶0018, 0218, Fig. 
6A], Herr fails to explicitly disclose wherein after the recognizing is triggered based on the trigger boundary condition, performing recognizing the motion pattern under the classified type of terrain by: determining that the motion pattern is upslope in response to that the slope of the ground is less than a first slope threshold and greater than a second slope threshold; determining that the motion pattern is walking on flat ground in response to that the slope of the ground is less than the second slope threshold and greater than a third slope threshold; or determining that the motion pattern is downslope in response to that the slope of the ground is less than the third slope threshold and greater than a fourth slope threshold; wherein the first slope threshold is a threshold value for distinguishing the upstairs motion pattern from the upslope motion pattern, the second slope threshold is a threshold value for distinguishing the upslope motion pattern from the flat ground walking motion pattern, and the third slope threshold is a threshold value for distinguishing the flat ground walking motion pattern from the downslope motion pattern, and the fourth slope threshold is a threshold value for distinguishing the downslope motion pattern from the downstairs motion pattern. Yuen discloses systems for monitoring and classifying user motion [Yuen Abstract], wherein Yuen discloses determining that a motion pattern is one of upslope, flat ground, or downslope based on a determination that the slope of the ground is within certain thresholds [Notably, in one embodiment, if ΔH-S exceeds a predetermined threshold, the processing circuitry may determine that the user is traversing stairs, in which case, specific stair estimation algorithms may be employed. With reference to FIG. 4R, the processing circuitry may employ an embodiment in which upstairs walking and running are given specific calorie burn algorithms based on ΔH-S. 
Downstairs logic may be incorporated therein. Likewise, specific equations and/or logic may be employed for different grade hills, both upwards and downwards in accordance with the preceding linear equations, or alternate nonlinear equations and means (e.g., lookup tables, polynomials, transcendentals, interpolations, neural nets, maximum likelihood estimates, expected value estimates, etc.) (Yuen ¶0124), wherein the Examiner notes that ¶0124 of Yuen discloses applying the thresholds as disclosed to downstairs logic, wherein since Yuen describes thresholds for differentiating between flat ground, upstairs, and upslope, the thresholds as applied to downstairs logic are considered to comprise thresholds for differentiating between flat ground, downstairs, and downslope; As intimated above, data which is representative of the altitude and/or changes in altitude and data which is representative of the motion of the user may also be used to determine and/or classify other activity-related metrics such as, for example, user steps, distance and pace (FIG. 
4E)...Notably, other activity-related metrics may be determined by the processing circuitry, including, for example, (i) in the context of running/walking on level or substantially level ground, number of steps, also broken down as walking or running, distance traveled and/or pace (ii) in the context of running/walking on stairs, hills or ground having a grade of greater than about 3%, number of stair and/or hill steps, which may be categorized or broken down, correlated or organized/arranged according to, for example, the speed, pace and/or activity state of the user (for example, as walking, jogging or running), number of flights of stairs, ascent/descent distance on stairs and/or hills, pace, ascent/descent on elevators and/or escalators, surface grade, and/or number of calories expended by walking/jogging/running on stairs and/or hills as well as quantify/compare the additional calories burnt from stairs/hills over level ground (Yuen ¶0126); In one embodiment, the processing circuitry may evaluate the output of the altitude sensor to determine, calculate and/or estimate the activity state of the user by evaluating the altitude sensor data based on algorithms or processes based on the flowchart of FIG. 4K. With reference to FIG. 4K, in one embodiment, the processing circuitry determines the type of activity by evaluating the change in altitude of the user on a change in height or altitude per step basis ("ΔH-S") or the use of an elevator by a sustained rate of height change per time period (for example, per second) ("ΔH-t") in the absence of steps. The change in height or altitude per step and change in height or altitude per second are evaluated against a plurality of thresholds and/or ranges to determine whether the user is, for example, moving (for example, running or walking) on level ground, on an escalator or in an elevator, traversing stairs and/or traversing a hill or the like. 
In one embodiment, Threshold 1, Threshold 2, Threshold 3 and Threshold 4 have the relationship Threshold 1>Threshold 2>Threshold 3>Threshold 4 wherein the process seeks to detect and identify the causes of increases in user altitude. In other embodiments, the flow may be modified to detect and classify decreases or both increases and decreases in user altitude. Thus, in these embodiments, the processing circuitry employs data from the motion sensor to assess the user state based on data from the altitude sensor (Yuen ¶0129, Figures 4K-L), wherein the Examiner notes that traversing a hill may be considered to read on the claimed motion pattern of upslope, wherein in light of the Examiner’s interpretation of ¶0124 of Yuen applying the disclosed thresholds to downstairs logic, traversing down a hill may be considered to read on the claimed motion pattern of downslope; the inventions may have functionality that determines the elevation change and/or slope between two points through, for instance, the use of GPS with an altimeter (Yuen ¶0164)], wherein the thresholds distinguish between upstairs and upslope, upslope and flat ground, flat ground and downslope, and downslope and downstairs [see Yuen ¶0124, ¶0126, ¶0129, Figure 4K]. 
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the system of Herr to employ wherein after the recognizing is triggered based on the trigger boundary condition, performing recognizing the motion pattern under the classified type of terrain by: determining that the motion pattern is upslope in response to that the slope of the ground is less than a first slope threshold and greater than a second slope threshold; determining that the motion pattern is walking on flat ground in response to that the slope of the ground is less than the second slope threshold and greater than a third slope threshold; or determining that the motion pattern is downslope in response to that the slope of the ground is less than the third slope threshold and greater than a fourth slope threshold; wherein the first slope threshold is a threshold value for distinguishing the upstairs motion pattern from the upslope motion pattern, the second slope threshold is a threshold value for distinguishing the upslope motion pattern from the flat ground walking motion pattern, and the third slope threshold is a threshold value for distinguishing the flat ground walking motion pattern from the downslope motion pattern, and the fourth slope threshold is a threshold value for distinguishing the downslope motion pattern from the downstairs motion pattern, so as to allow for distinct differentiation between activity states of the user based on identified thresholds of changes in height per step, allowing for the determination of motion patterns of the user [Yuen ¶0129]. 
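As an illustrative annotation, the claimed four-slope-threshold decision recited above can be sketched as a simple comparison chain. The degree values below are hypothetical placeholders chosen only to satisfy the claimed ordering (first > second > third > fourth slope threshold); the claims do not recite numeric values.

```python
# Sketch of the claimed slope-threshold decision; numeric thresholds are
# hypothetical and chosen only so that FIRST > SECOND > THIRD > FOURTH.
FIRST, SECOND, THIRD, FOURTH = 20.0, 3.0, -3.0, -20.0  # slope, degrees (assumed)

def recognize_motion_pattern(slope_deg):
    """Classify the motion pattern from the determined slope of the ground."""
    if slope_deg >= FIRST:
        return "upstairs"      # at or beyond the upslope/upstairs boundary
    if slope_deg > SECOND:
        return "upslope"       # first threshold > slope > second threshold
    if slope_deg > THIRD:
        return "flat_ground"   # second threshold > slope > third threshold
    if slope_deg > FOURTH:
        return "downslope"     # third threshold > slope > fourth threshold
    return "downstairs"
```

The sketch mirrors how the rejection maps Yuen's ordered thresholds onto the claimed first through fourth slope thresholds.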
Regarding claim 25, Herr in view of Yuen teaches The method according to claim 1, wherein the sensor further comprises an inertial measurement unit fixed to the limb extremity end [Herr ¶¶0177, 0179], and the method further comprises: detecting a standing stage and the swinging stage in a walking process of the subject by using an acceleration signal output from the inertial measurement unit mounted on the extremity end [Herr ¶¶0194, 0264, 0358]; and wherein determining that the lower limb of the subject is in the standing stage in response to that an absolute value of the acceleration signal of the inertial measurement unit obtained by measurement is close to a gravity acceleration for a period of time; or determining that the lower limb of the subject is in the swing stage in response to that the absolute value of the acceleration signal obtained by the measurement is greater than the gravity acceleration [Herr ¶¶0177, 0179, wherein any movement of the subject that would result in motion of the extremity end of the human lower limb is considered to be greater than the gravity acceleration, such that the swing stage would be determined]. Claim(s) 4 and 12-15 is/are rejected under 35 U.S.C. 103 as being unpatentable over Herr in view of Yuen, as applied to claim 1 above, in further view of Nishizawa (JP-4277048-B2, previously presented and translation previously attached). Regarding claim 4, Herr in view of Yuen teaches The method according to claim 1, wherein, the sensor further comprises an inertial measurement unit fixed to the limb extremity end [Herr ¶¶0177, 0179]. However, Herr in view of Yuen fails to explicitly disclose wherein the method further comprises: obtaining one or more of absolute velocity to ground, and the absolute acceleration to ground, through a coordinate transformation and an integration of angular velocity and acceleration data of the inertial measurement unit, which are obtained in the sensor coordinate system. 
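The claim 4 limitation just quoted, obtaining absolute velocity and acceleration to ground through a coordinate transformation and an integration of IMU data, can be illustrated with a minimal sketch. The small-angle orientation update and the gravity convention below are simplifying assumptions made for illustration; this is not the method of Herr, Yuen, or Nishizawa.

```python
import numpy as np

def integrate_imu(acc_body, omega_body, dt, g=np.array([0.0, 0.0, -9.81])):
    """Minimal strapdown sketch: integrate gyro rates into an orientation,
    rotate body-frame acceleration into the ground frame, remove gravity,
    then integrate to absolute velocity and position (motion trajectory).
    The small-angle rotation update is an assumption for brevity."""
    R = np.eye(3)                 # body-to-ground rotation (coordinate transform)
    v = np.zeros(3)               # absolute velocity to ground
    p = np.zeros(3)               # absolute position (motion trajectory to ground)
    traj = [p.copy()]
    for a_b, w_b in zip(acc_body, omega_body):
        wx, wy, wz = w_b
        skew = np.array([[0.0, -wz, wy], [wz, 0.0, -wx], [-wy, wx, 0.0]])
        R = R @ (np.eye(3) + skew * dt)     # small-angle orientation update
        a_world = R @ np.asarray(a_b) + g   # transform, then remove gravity
        v = v + a_world * dt
        p = p + v * dt
        traj.append(p.copy())
    return np.array(traj), v
```

For a stationary sensor reading the gravity reaction (0, 0, +9.81 m/s² in the body frame), the integrated velocity and trajectory remain at zero, which is the sanity check such a pipeline is usually validated against.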
Nishizawa discloses a system for measuring absolute acceleration, absolute velocity, and absolute position of a subject, wherein Nishizawa discloses obtaining one or more of the absolute velocity to ground, and the absolute acceleration to ground, through a coordinate transformation and an integration of angular velocity and acceleration data of the inertial measurement unit, which are obtained in the sensor coordinate system [the acceleration (A_Xn, A_Yn, A_Zn) of the ground coordinate system is calculated from the output data of the 3-axis acceleration sensor and the 3-axis angular velocity sensor, and it is preferable to calculate the position data of the object to be measured based on this ground-coordinate-system acceleration (Translated Nishizawa, Page 4, Paragraph 6); the embodiment of the present invention includes a motion capture 10 for detecting the position or orientation of an object to be measured by a six-axis sensor 16 provided with a triaxial acceleration sensor 16a that measures acceleration (G_xn, G_yn, G_zn) of the object to be measured and a three-axis angular velocity sensor 16b that measures angular velocity (ω_xn, ω_yn, ω_zn). The acceleration (A_Xn, A_Yn, A_Zn) of the reference coordinate system is calculated from the output data of the acceleration sensor 16a and the triaxial angular velocity sensor 16b based on the modified inverse skew matrix (R′(n)^−1) (Translated Nishizawa, Page 5, Paragraph 8)]. 
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the method of Herr in view of Yuen to employ obtaining one or more of the absolute velocity to ground, and the absolute acceleration to ground, through a coordinate transformation and an integration of angular velocity and acceleration data of the inertial measurement unit, which are obtained in the sensor coordinate system, and use a coordinate transformation and an integration of angular velocity and acceleration data of the inertial measurement unit to obtain one or more of the absolute motion trajectory to ground, the absolute velocity to ground, and the absolute acceleration to ground, so as to rapidly and accurately determine acceleration, speed, and inclination information of the subject relative to ground [By comprising in this way, the position of a to-be-measured object can be detected still more rapidly. Based on the acceleration of the earth coordinate system, not only the position data of the object to be measured but also the speed and inclination angle of the object to be measured can be calculated with high accuracy (Translated Nishizawa, Page 4, Paragraph 7)]. Regarding claim 12, Herr in view of Yuen and Nishizawa teaches The method according to claim 4, further comprising: detecting, based on a time window, the motion pattern of the subject in real time to recognize the motion pattern performed by the subject before a foot of the subject touches the ground [Herr ¶¶0218, 0220], wherein the motion pattern of the subject is recognized in response to one or more of the absolute velocity to ground, the absolute acceleration to ground or the absolute motion trajectory to ground matching, within the time window, a corresponding data of a particular motion pattern [Herr ¶0218, Fig. 6A]. 
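Claim 12's time-window recognition, discussed just above, can be illustrated as a nearest-template comparison over data observed in a sliding window. The window feature (mean vertical velocity), the template values, and the distance metric are illustrative assumptions, not anything disclosed in Herr or in the record.

```python
import numpy as np

# Hypothetical reference signatures: mean absolute-velocity-to-ground (vertical
# component) within the window for each motion pattern. Values are illustrative.
TEMPLATES = {"flat_ground": 0.05, "upstairs": 0.45, "downstairs": -0.40}

def match_pattern(vz_window):
    """Recognize, within a time window and before foot contact, the motion
    pattern whose stored signature best matches the observed data (claim 12
    sketch): nearest template by absolute difference of the window mean."""
    feature = float(np.mean(vz_window))
    return min(TEMPLATES, key=lambda name: abs(TEMPLATES[name] - feature))
```

A real implementation would compare full trajectory shapes rather than a single scalar feature, but the structure of the comparison is the same.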
Regarding claim 13, Herr in view of Yuen and Nishizawa teaches The method according to claim 4, wherein the collecting comprises: calculating a rotation angle or angular velocity of the limb extremity end relative to an initial sagittal plane or an initial coronal plane of the subject to recognize turning activity of the subject [Herr ¶¶0177, 0179, Fig. 2A, wherein as depicted in Herr Fig. 2A, the rotation angle of the ankle is calculated relative to an initial coronal plane of the subject (x-z plane of Fig. 2A), wherein the rotation of the foot about the ankle is considered to define turning activity of the subject (turning of the ankle relative to the x-z plane)]. Regarding claim 14, Herr in view of Yuen and Nishizawa teaches The method according to claim 13, further comprising: obtaining the rotation angle or angular velocity of the limb extremity end relative to the initial sagittal plane or the initial coronal plane of the subject by converting output data of the inertial measurement unit fixed to the limb extremity end [Herr ¶¶0177, 0179, Fig. 2A], or recognizing the turning activity of the subject by detecting the rotation angle or angular velocity of other parts of the body of the subject relative to the initial sagittal plane or the initial coronal plane of the subject [Herr ¶¶0177, 0179, Fig. 2A, wherein Herr Fig. 2A further depicts the rotation of the foot relative to the shank and about the ankle, all relative to the x-z plane (coronal plane)]. Regarding claim 15, Herr in view of Yuen and Nishizawa teaches The method according to claim 14, wherein the other parts of the body comprise one or more of head, upper torso, arms, lower thighs, lower legs, and feet [Herr ¶¶0177, 0179, Fig. 2A]. Claim(s) 5-6 is/are rejected under 35 U.S.C. 103 as being unpatentable over Herr in view of Yuen and Nishizawa, as applied to claim 4 above, in further view of Ly (US-20170258374-A1, previously presented). 
Regarding claim 5, Herr in view of Yuen and Nishizawa teaches The method according to claim 4. However, while Herr discloses performing steps to correct or minimize the effect of drift of the IMU when the subject is in a standing stage [Herr ¶0193], Herr in view of Yuen and Nishizawa fails to explicitly disclose further comprising: resetting, in response to a human body being in a standing stage, a transformation matrix for the coordinate transformation, the absolute velocity to ground, and an absolute motion displacement to ground, to eliminate or reduce a cumulative drift or cumulative error of the inertial measurement unit. Ly discloses systems for monitoring user movement using machine learning, wherein Ly discloses recalibrating a kinematic data collection system in response to the user standing [In one preferred operating state, activation of a calibration input triggers the collection of kinematic data used in determining a target (i.e., a reference) posture sample. For example, the user can direct the system on what is considered good posture by standing with good posture and then calibrating the system to recognize this posture by activating a calibration input and holding the posture for a minimum duration (Ly ¶0031)]. It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the method of Herr in view of Yuen and Nishizawa to employ resetting, in response to a human body being in a standing stage, a transformation matrix for the coordinate transformation, the absolute velocity to ground, and an absolute motion displacement to ground, to eliminate or reduce a cumulative drift or cumulative error of the inertial measurement unit, so as to prevent error in the data by recalibrating the system to a known reference, wherein in light of the current combination of Herr in view of Yuen and Nishizawa, the modification by Ly would incorporate resetting the transformation matrix. 
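The claim 25 stance test (acceleration magnitude close to the gravity acceleration for a period of time) and the claim 5 reset just discussed can be sketched together: detect a standing stage, then zero the transformation matrix, the absolute velocity to ground, and the absolute displacement to discard accumulated drift. The state layout and the tolerance value are assumptions made for illustration.

```python
import numpy as np

G = 9.81   # gravity acceleration, m/s^2
TOL = 0.4  # tolerance for "close to the gravity acceleration" (assumed)

def is_standing(accel_norms):
    """Standing-stage test sketch: |a| stays near g over the whole window."""
    return all(abs(a - G) < TOL for a in accel_norms)

def step_with_reset(R, v, disp, accel_norms):
    """Claim 5 sketch: on a detected standing stage, reset the coordinate-
    transformation matrix, the absolute velocity to ground, and the absolute
    motion displacement, eliminating cumulative IMU drift. The (R, v, disp)
    state layout is an illustrative assumption."""
    if is_standing(accel_norms):
        return np.eye(3), np.zeros(3), np.zeros(3)
    return R, v, disp
```

This is the same idea as a zero-velocity update (ZUPT) in pedestrian inertial navigation: a known stationary reference is used to re-anchor the integrated state.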
Regarding claim 6, Herr in view of Yuen, Nishizawa, and Ly teaches The method according to claim 5, further comprising: detecting the standing stage of the subject, by the inertial measurement unit fixed at the limb extremity end [Once the inertial measurement unit offsets have been calculated and corrected (zeroed), the foot-slope (β) (alternatively referred to as heel height) is determined as illustrated in, for example, FIG. 3. From the illustration it is easy to see that when the wearer is standing with her foot flat on the ground that β = −(θ + γ). By averaging over a period of about a tenth of a second an accurate estimate of β can be determined (Herr ¶0194); Examples of intrinsic sensors include… measurement of the angular rate and acceleration of the link (e.g., using, for example, an inertial measurement unit) (Herr ¶0264); Sitting, standing up and sitting down behavioral context is identified by the intrinsic sensors of the prosthetic apparatus (Herr ¶0358)] or a load cell mounted on a foot of the subject. Claim(s) 18 is/are rejected under 35 U.S.C. 103 as being unpatentable over Herr in view of Yuen, as applied to claim 1 above, in further view of Elazary (US-20180005446-A1). Regarding claim 18, Herr in view of Yuen teaches The method according to claim 1, wherein the sensor further includes an inertial measurement unit mounted to lower legs, thighs, waists, or head of the subject [Herr ¶¶0177, 0179, Fig. 2A], and wherein the inertial measurement unit is configured to, measure one or more of the absolute motion trajectory to ground, the absolute velocity to ground, or the acceleration to ground [Herr ¶¶0177, 0179, 0218, Figs. 2A, 6A], or measure topographic characteristics in the different motion patterns [this limitation is considered optional]. However, Herr fails to explicitly disclose wherein the inertial measurement unit is an inertial measurement unit-combined depth camera. 
Elazary discloses systems and methods for monitoring a user’s environment, wherein Elazary discloses the combined use of an inertial sensor and depth camera [The augmented reality devices utilized in the implementation of some embodiments include at least one camera, one or more sensors, a power source, wireless network connectivity, a processor, and a semi-transparent surface or screen layering images or information over a presentation of the real-world that appears before or around the device. In some embodiments, the device sensors include inertial sensors, depth cameras, radio beacon receivers, laser scanners, and range finders as some examples (Elazary ¶0024)]. It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the method of Herr in view of Yuen to employ an inertial measurement unit-combined depth camera, so as to provide additional contextual information regarding the user’s environment. Claim(s) 19 is/are rejected under 35 U.S.C. 103 as being unpatentable over Herr in view of Yuen, as applied to claim 1 above, in further view of Cortelyou (US-20150338196-A1, previously presented). Regarding claim 19, Herr in view of Yuen teaches The method according to claim 1. However, Herr in view of Yuen fails to explicitly disclose wherein, the sensor further comprises an infrared capture system mounted in an ambient environment of the subject, and an infrared capture marker point is mounted at the limb extremity end of the subject, and wherein the method further comprises: analyzing one or more of the absolute motion trajectory to ground, the absolute velocity to ground, and the absolute acceleration to ground of the infrared capture marker point to recognize the motion pattern of the subject. 
Cortelyou discloses systems for monitoring user movement [Cortelyou Abstract], wherein Cortelyou discloses a sensor that comprises an infrared capture system mounted in an ambient environment of the subject, and an infrared capture marker point mounted at the limb extremity end of the subject [The retro-reflective marker 24 may reflect a majority of the electromagnetic radiation (e.g., infrared, ultraviolet, visible wavelengths, or radio waves and so forth) incident from the electromagnetic radiation beam 28 back toward the detector 16 within a relatively well-defined cone having a central axis with substantially the same angle as the angle of incidence. This reflection facilitates identification of a location of the retro-reflective marker 24 by the system 10 (Cortelyou ¶0040)], and analyzing one or more of the absolute motion trajectory to ground, the absolute velocity to ground, and the absolute acceleration to ground of the infrared capture marker point to recognize the motion pattern of the subject [For instance, the retro-reflective marker 24 may be applied as a strip of retro-reflective tape applied to an armband, headband, shirt, personal identification feature, or other article… The tracking system 10 may interpret this signal 72 to track the position or path of the person 70 (or object 32) moving about a designated area (i.e., track the person or object in space and time). Again, depending on the number of detectors 16 utilized, the control unit 18 may determine vector magnitude, orientation, and sense of the person and/or object's movement based on the retro-reflected electromagnetic radiation received (Cortelyou ¶0063); The survey equipment 140 may, accordingly, identify a position of these markers 24 relative to a position of a certain environmental feature, such as the ground (Cortelyou ¶0106); Once a particular pattern of retro-reflection has been detected, a determination may be made by the control unit 18 as to whether the pattern correlates to a stored pattern identified by the control unit 18 (Cortelyou ¶0051)].

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the method of Herr in view of Yuen to employ an infrared capture system mounted in an ambient environment of the subject, with an infrared capture marker point mounted at the limb extremity end of the subject, and to analyze one or more of the absolute motion trajectory to ground, the absolute velocity to ground, and the absolute acceleration to ground of the infrared capture marker point to recognize the motion pattern of the subject, as this would amount to merely applying a known technique [the infrared capture system of Cortelyou] to a known device (method, or product) ready for improvement to yield predictable results [analyzing one or more of the absolute motion trajectory to ground, the absolute velocity to ground, and the absolute acceleration to ground of the sensor with the similar expected result of recognizing the motion pattern of the subject] [MPEP § 2143(I)(B)].

Response to Arguments

Applicant’s arguments, see Applicant’s Remarks p. 13, filed 19 November 2025, with respect to the previously presented Drawing Objections have been fully considered and are persuasive. The previously presented Drawing Objections have been withdrawn.
Applicant's arguments, see Applicant’s Remarks p. 13-15, with respect to the previously applied § 101 rejections have been fully considered but they are not persuasive.

The Applicant asserts that the amended limitation(s) regarding the “trigger boundary condition” is/are not an abstract idea, wherein the Applicant argues that the amended limitation requires: (1) real-time capture of the absolute motion trajectory to ground during the limb’s swing stage, which the Applicant argues is a physical, measurable parameter, and not a mathematical concept; (2) comparison of the trajectory to predefined geometric shapes [circle, rectangle, ellipse] in a sensor coordinate system [which the Applicant notes is “tied” to the hardware sensor] fixed to the limb extremity end; and (3) automatic triggering of pattern recognition only when the physical trajectory intersects these geometric boundaries. As such, the Applicant asserts that the “geometric boundary-triggered recognition” is a specific, non-trivial technical design that cannot be performed “by pen and paper”, as it depends on real-time sensor data and sensor hardware to enforce precise spatial thresholds, which the Applicant contends is far beyond abstract mathematical calculation.

However, the Examiner disagrees with the Applicant’s arguments. In response to applicant’s argument that the references fail to show certain features of the invention, it is noted that the features upon which applicant relies (i.e., “real-time capture of absolute motion trajectory to ground during the limb’s swing stage”) are not recited in the rejected claim(s). Although the claims are interpreted in light of the specification, limitations from the specification are not read into the claims. See In re Van Geuns, 988 F.2d 1181, 26 USPQ2d 1057 (Fed. Cir. 1993).
Furthermore, the Examiner notes that (1) the capture of the absolute motion trajectory to ground during the limb’s swing stage was analyzed at Step 2A Prong 1 and was not identified as an abstract idea, and was further analyzed as an additional element at Step 2A Prong 2 and Step 2B directed towards the extra-solution activity of data gathering.

Moreover, (2) the comparison of the trajectory to predefined geometric shapes [circle, rectangle, ellipse] in a sensor coordinate system [which the Applicant notes is “tied” to the hardware sensor] fixed to the limb extremity end [considered to refer to the step of “determining that the absolute motion trajectory to ground passes through a predefined boundary trajectory in a sensor coordinate system”] was analyzed at Step 2A Prong 1 as being directed towards an abstract idea, as the comparison/determination may be performed in the mind, or by hand with pen and paper, by observing known or collected data and drawing mental conclusions therefrom, based on the Applicant’s Specification ¶¶0071-0072 and Fig. 6 describing the comparison/determination as data analysis relative to a known or derived threshold. The mere assertion that the sensor coordinate system is “tied” to a hardware sensor is not considered persuasive, as the sensor coordinate system defined by the hardware sensor merely limits the type or amount of data, which merely limits the abstract idea.

Finally, (3) the automatic triggering of pattern recognition only when the physical trajectory intersects these geometric boundaries was analyzed at Step 2A Prong 1 as being directed towards an abstract idea, as the pattern recognition being “triggered” upon meeting a certain threshold merely limits the abstract idea of “pattern recognition”. As such, the argued limitations are considered to be directed towards subject matter that is either a judicial exception of an abstract idea or additional elements that are merely well-understood, routine, and conventional.
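For concreteness, the boundary-crossing determination at issue can be sketched in a few lines of Python. This is purely an illustration of the claimed concept, not the Applicant’s implementation: the semi-axis values `a` and `b` are hypothetical placeholders (the Specification’s constants A and B are not defined with specificity), and the trajectory samples are invented.

```python
# Illustrative sketch of a geometric trigger boundary condition: pattern
# recognition fires only when the limb's absolute motion trajectory to
# ground (a sequence of 2-D points in the sensor coordinate system)
# crosses a predefined ellipse. The semi-axes a and b are hypothetical
# stand-ins for the specification's undefined constants A and B.

def inside_ellipse(y: float, z: float, a: float, b: float) -> bool:
    """True if (y, z) lies on or inside the ellipse (y/a)^2 + (z/b)^2 = 1."""
    return (y / a) ** 2 + (z / b) ** 2 <= 1.0

def trigger_fired(trajectory, a: float = 0.3, b: float = 0.1) -> bool:
    """Trigger when consecutive samples straddle the ellipse boundary,
    i.e. the trajectory passes through it during the swing stage."""
    states = [inside_ellipse(y, z, a, b) for y, z in trajectory]
    return any(s1 != s2 for s1, s2 in zip(states, states[1:]))

# A swing-stage trajectory that starts outside the ellipse and ends inside it:
swing = [(0.5, 0.2), (0.4, 0.15), (0.2, 0.05), (0.1, 0.0)]
print(trigger_fired(swing))  # True: the trajectory crosses the boundary
```

As the sketch shows, the check reduces to a per-point membership test followed by a scan for a state change, which is consistent with the Examiner’s characterization of the step as data analysis relative to a derived threshold.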
The Applicant also asserts that the “preset auxiliary torque control” integrates the idea into a practical application, wherein the Applicant argues that the limitation regarding the driver of the auxiliary device is not “extra-solution activity” and is instead the invention’s practical purpose that directly addresses a technical problem in biomechanical assistance [Applicant cites ¶0004 and ¶0070, wherein the Applicant uses ¶0070 to exemplify that the driver generates torque to extend the ankle during the swing stage, such that this preparation reduces impact when the foot touches the ground, which is a tangible physical improvement].

However, the Examiner disagrees with the Applicant’s argument, as the Examiner notes that the specification should be evaluated to determine if the disclosure provides sufficient details such that one of ordinary skill in the art would recognize the claimed invention as providing an improvement [The specification need not explicitly set forth the improvement, but it must describe the invention such that the improvement would be apparent to one of ordinary skill in the art (see MPEP § 2106.04(d)(1))]. The cited portions of the Applicant’s Specification fail to provide the argued “practical purpose that directly addresses a technical problem”, as ¶0004 of the Specification merely gives background information regarding the state of the art and is not an explicit disclosure of the Applicant’s invention, and ¶0070 provides NO DISCLOSURE regarding the driver or the generation of torque to extend the ankle during the swing stage [the Examiner notes that there is NO DISCLOSURE of the argued driver or generation of torque beyond the Background section of the specification, such that IT WOULD NOT be apparent to one of ordinary skill in the art that the specification describes the argued improvement/solution to address a technical problem].
The Examiner also notes that the subject matter directed towards controlling a driver of an auxiliary device to generate a preset torque was identified at Step 2B as being well-understood, routine, and conventional. The “improvements” analysis in Step 2A determines whether the claim pertains to an improvement to the functioning of a computer or to another technology without reference to what is well-understood, routine, conventional activity [MPEP § 2106.04(d)(1)]. As such, since the argued solution to a technical problem is embodied in limitations directed towards well-understood, routine, and conventional subject matter, the argued solution is not considered to be an additional element that provides an improvement in technology.

The Examiner further notes that the argued subject matter, while directed towards an additional element, does not provide a particular treatment or prophylaxis at Step 2A Prong 2 or Step 2B, as the identified limitation merely “generates” a value and does not positively recite using the generated value to perform any particular treatment or prophylaxis; merely reciting that the generated value is “to enable…” a user to perform an action is not a positive recitation of implementing the generated value to perform any particular treatment or prophylaxis.

The Applicant further asserts that the combination of the features argued above amounts to significantly more than an abstract idea, as the combination of features is a specific technical solution to a real-world problem, resulting in tangible improvements: reduced impact forces, more natural motion assistance, and enhanced usability of lower limb devices [Applicant cites ¶¶0070-0079].
However, for reasons similar to those noted above in the Examiner’s responses regarding the argued improvements/solutions to a technical problem, the combination of features as argued is not considered to amount to significantly more than an abstract idea.

Applicant's arguments, see Applicant’s Remarks p. 15-16, with respect to the previously applied claim rejections under § 112(a) have been fully considered but they are not persuasive. The Applicant asserts that the amendments to claims 1 and 22-23 to define the scope of “limb” and “auxiliary device” align with the explicit disclosure in the Specification, wherein the Applicant notes that ¶0004 confirms that the invention is directed to “lower limb auxiliary devices” that control a driver to generate a preset torque for motion assistance and ¶0070 specifically identifies “lower limb, prosthesis, orthosis, or exoskeleton” as the target devices for motion pattern recognition and torque control [emphasis applied by Examiner].

However, the Examiner notes that the Applicant has taken the Examiner’s note regarding the scope of the identified genus and species out of context, as the Examiner reproduces the Examiner’s note herein: “the Examiner notes that the recited ‘lower limb auxiliary device’ is considered to define a genus, wherein a prosthesis, orthosis, or exoskeleton are each considered to define species of the identified genus, such that while the Applicant’s Specification may provide limited support for prostheses, orthoses, and exoskeletons [Applicant’s citation of ¶0070 of the Specification], the Specification DOES NOT provide explicit support for any lower limb auxiliary device, which is considered to be broader in scope than prostheses, orthoses, and exoskeletons”.
The Examiner’s analysis was not to identify that claims 1/22/23 merely need to further limit the claimed limb, but instead to identify that, while there is disclosure of motion pattern recognition of a limb, prosthesis, orthosis, or exoskeleton as explicitly disclosed in ¶0070 [In order to be able to recognize the motion pattern of the lower limb, prosthesis, orthosis or exoskeleton before the next foot contacting the ground, thereby enabling the lower limb, prosthesis, orthosis or exoskeleton to complete the required preparation during the swing stage, a predetermined triggering condition may be used to trigger the pattern recognition decision of the classifier or the pattern recognizer (¶0070)], ¶0070 FAILS TO PROVIDE ANY DISCLOSURE regarding torque control. Furthermore, the Examiner notes that the only recitation of a “driver” or generation of a preset torque is in the Background section of the Specification [see ¶0004], which is considered to only define the current state of the art and not to describe or provide written description support for the Applicant’s invention.

Applicant’s arguments, see Applicant’s Remarks p. 17, with respect to the previously applied claim rejections under § 112(b) have been fully considered and are persuasive. The § 112(b) rejection of claim 21 has been withdrawn.

Applicant’s arguments, see Applicant’s Remarks p. 17-20, with respect to the rejection(s) of claim(s) 1, 22-23, and those dependent therefrom under § 103 have been fully considered and are persuasive. Therefore, the rejection has been withdrawn. However, upon further consideration, a new ground(s) of rejection is made in view of Herr (US-20100179668-A1) in view of Yuen (US-20120084054-A1, previously presented).
The Applicant asserts that the amended limitations of claims 1 and 22-23 regarding “wherein the triggering boundary condition is satisfied when the absolute motion trajectory to ground during the swing stage passes through the circle, the rectangle or the ellipse in the sensor coordinate system, wherein the sensor coordinate system is a two-dimensional coordinate system” fail to be disclosed or suggested by the cited prior art, as the Applicant argues that the thresholds as disclosed by Yuen are not equivalent to the amended geometric trigger limitation. The Applicant notes that the terrain classification system of Yuen relies on “quantitative, one-dimensional numerical analysis of altitude changes” and never references, uses, or suggests two-dimensional geometric shapes (circles, rectangles, or ellipses) or spatial trajectory comparisons; the Applicant cites ¶¶0124, 0129 of Yuen as using altitude change per step/second as single-value numerical parameters to differentiate terrain types; and the Applicant notes that Yuen never captures or analyzes the absolute motion trajectory to ground during the swing stage, which the Applicant asserts is a two-dimensional spatial parameter that describes the shape and position of the limb’s movement.

Applicant’s arguments with respect to claim(s) 1 have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.
Herr is now applied to teach the amended limitation “wherein the triggering boundary condition is satisfied when the absolute motion trajectory to ground during the swing stage passes through the circle, the rectangle or the ellipse in the sensor coordinate system, wherein the sensor coordinate system is a two-dimensional coordinate system”. The Examiner notes that, as noted in the Claim Interpretations section above, the Applicant has failed to specifically define the constants A and B used in the elliptical boundary condition [Applicant’s Specification ¶¶0071-0072, Fig. 6; a similar lack of specificity defining the circle and rectangle of ¶¶0074-0075 of the Applicant’s Specification is noted]. As such, since Herr discloses using the absolute motion trajectory to ground in a y-z plane to recognize motion patterns [Herr ¶¶0177, 0179, 0218, Fig. 6A], the motion trajectory may be considered to have “passed through” any arbitrarily defined circle/rectangle/ellipse, as the motion trajectory itself allows for the recognition of the motion pattern.

The Examiner notes that Yuen is still considered to be applicable regarding the recognition of the motion pattern under the classified type of terrain, which is understood to define a separate step from triggering the boundary condition. While the thresholds on change in height per step disclosed by Yuen account for a change in altitude but not specifically a change in horizontal displacement [step length], the thresholds of Yuen are still considered to define step-normalized slope thresholds to differentiate between up/downstairs, up/downslope, and level ground.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to SEVERO ANTONIO P LOPEZ whose telephone number is (571)272-7378. The examiner can normally be reached M-F 9-6 EST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Charles Marmor II, can be reached at (571) 272-4730. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/SEVERO ANTONIO P LOPEZ/
Examiner, Art Unit 3791

Prosecution Timeline

Nov 04, 2020: Application Filed
Feb 16, 2023: Non-Final Rejection — §101, §103, §112
May 17, 2023: Response Filed
Jun 23, 2023: Final Rejection — §101, §103, §112
Aug 29, 2023: Response after Non-Final Action
Sep 11, 2023: Response after Non-Final Action
Sep 28, 2023: Request for Continued Examination
Oct 06, 2023: Response after Non-Final Action
Jan 04, 2024: Non-Final Rejection — §101, §103, §112
Apr 03, 2024: Response Filed
Jun 06, 2024: Final Rejection — §101, §103, §112
Aug 13, 2024: Examiner Interview Summary
Aug 13, 2024: Applicant Interview (Telephonic)
Sep 06, 2024: Request for Continued Examination
Oct 01, 2024: Response after Non-Final Action
Nov 14, 2024: Non-Final Rejection — §101, §103, §112
Feb 07, 2025: Response Filed
Mar 04, 2025: Non-Final Rejection — §101, §103, §112
Jun 17, 2025: Response Filed
Sep 09, 2025: Final Rejection — §101, §103, §112
Nov 19, 2025: Request for Continued Examination
Dec 03, 2025: Response after Non-Final Action
Jan 05, 2026: Non-Final Rejection — §101, §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12575781
PORTABLE AND WEARABLE ELECTROMYOGRAPHIC BIOFEEDBACK FOR SPINAL CORD INJURY TO ENHANCE NEUROPLASTICITY
2y 5m to grant Granted Mar 17, 2026
Patent 12549134
NON-CONTACT SENSING NODE, SYSTEMS AND METHODS OF REMOTE SENSING
2y 5m to grant Granted Feb 10, 2026
Patent 12543972
BIOMECHANICAL MEASUREMENT DEVICES AND USES THEREOF FOR PHENOTYPE-GUIDED MOVEMENT ASSESSMENT, INTERVENTION, AND ACTIVE ASSISTANCE DEVICE CONTROL
2y 5m to grant Granted Feb 10, 2026
Patent 12419554
PRECISE ARTERIAL BLOOD SAMPLING DEVICE
2y 5m to grant Granted Sep 23, 2025
Patent 12408901
INTRAUTERINE TISSUE COLLECTION INSTRUMENT
2y 5m to grant Granted Sep 09, 2025
Based on the 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 8-9
Grant Probability: 32%
With Interview (+33.4%): 65%
Median Time to Grant: 3y 6m
PTA Risk: High
Based on 149 resolved cases by this examiner. Grant probability derived from career allow rate.
