Prosecution Insights
Last updated: April 19, 2026
Application No. 18/686,483

METHOD AND SYSTEM FOR MONITORING BODY MOVEMENTS

Non-Final OA — §101, §102, §103, §112
Filed: Feb 26, 2024
Examiner: LOPEZ, SEVERO ANTON P
Art Unit: 3791
Tech Center: 3700 — Mechanical Engineering & Manufacturing
Assignee: Soter Analytics Pty Ltd.
OA Round: 1 (Non-Final)
Grant Probability: 32% (At Risk)
OA Rounds: 1-2
To Grant: 3y 6m
With Interview: 65%

Examiner Intelligence

Career Allow Rate: 32% — grants only 32% of cases (47 granted / 149 resolved), -38.5% vs TC avg
Interview Lift: +33.4% — strong lift in allow rate for resolved cases with interview
Typical Timeline: 3y 6m avg prosecution; 86 currently pending
Career History: 235 total applications across all art units
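The headline numbers in this panel are simple arithmetic over the career counts. A minimal sketch reproduces them, assuming the reported interview lift is expressed in percentage points (an assumption — the with/without-interview split counts are not shown):

```python
# Recompute the examiner panel figures from the raw counts above.
granted, resolved = 47, 149

allow_rate = granted / resolved  # career allow rate
print(f"Career allow rate: {allow_rate:.1%}")  # 31.5%, displayed rounded as 32%

# Reported lift for resolved cases that included an examiner interview,
# treated here as +33.4 percentage points (assumption).
interview_lift = 0.334
print(f"Allow rate with interview: {allow_rate + interview_lift:.1%}")  # ~64.9%
```

The recomputed with-interview rate (~64.9%) lines up with the 65% "With Interview" figure in the prediction panel, which suggests the lift is indeed additive in percentage points.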

Statute-Specific Performance

§101: 14.4% (-25.6% vs TC avg)
§102: 16.5% (-23.5% vs TC avg)
§103: 37.1% (-2.9% vs TC avg)
§112: 27.6% (-12.4% vs TC avg)
Tech Center averages are estimates • Based on career data from 149 resolved cases
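The per-statute deltas are consistent with a single Tech Center baseline. A short sketch (numbers taken from the panel above) backs out the implied estimate:

```python
# Examiner allow rate by statute and reported delta vs the Tech Center
# average, in percentage points (from the panel above).
by_statute = {
    "101": (14.4, -25.6),
    "102": (16.5, -23.5),
    "103": (37.1, -2.9),
    "112": (27.6, -12.4),
}
for statute, (rate, delta) in by_statute.items():
    tc_avg = rate - delta  # implied Tech Center average estimate
    print(f"§{statute}: examiner {rate}% vs TC average ≈ {tc_avg:.1f}%")
```

All four statutes back out the same ~40.0% estimate, suggesting one Tech Center baseline underlies every comparison in this panel.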

Office Action

Rejections under §101, §102, §103, and §112
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Drawings

The drawings are objected to as failing to comply with 37 CFR 1.84(p)(5) because they do not include the following reference sign(s) mentioned in the description: “96” in ¶123. The drawings are objected to as failing to comply with 37 CFR 1.84(p)(5) because they include the following reference character(s) not mentioned in the description: “+” in Fig. 7; “116” in Fig. 8; “124” in Fig. 8.

Corrected drawing sheets in compliance with 37 CFR 1.121(d), or amendment to the specification to add the reference character(s) in the description in compliance with 37 CFR 1.121(b), are required in reply to the Office action to avoid abandonment of the application. Any amended replacement drawing sheet should include all of the figures appearing on the immediate prior version of the sheet, even if only one figure is being amended. Each drawing sheet submitted after the filing date of an application must be labeled in the top margin as either “Replacement Sheet” or “New Sheet” pursuant to 37 CFR 1.121(d). If the changes are not accepted by the examiner, the applicant will be notified and informed of any required corrective action in the next Office action. The objection to the drawings will not be held in abeyance.

Specification

The disclosure is objected to because of the following informalities: The amendment filed 26 February 2024 is objected to under 35 U.S.C. 132(a) because it introduces new matter into the disclosure [“This application is a National Phase of PCT Patent Application No. PCT/AU2022/051039 having International filing date of August 25, 2022, which claims the benefit of priority of Australian Patent Application No. 2021902777 filed on August 25, 2021. The contents of the above applications are all incorporated by reference as if fully set forth herein in their entirety” (emphasis applied)]. 35 U.S.C. 132(a) states that no amendment shall introduce new matter into the disclosure of the invention.

The added material which is not supported by the original disclosure is as follows: incorporation(s) by reference to foreign priority document(s) when added by amendment at the time of entry to the national stage is/are considered new matter [An incorporation by reference statement added after an application’s filing date is not effective because no new matter can be added to an application after its filing date (see 35 U.S.C. 132(a)) (MPEP § 608.01(p)(I)(B)); An international application designating the U.S. has two stages (international and national) with the filing date being the same in both stages. Often the date of entry into the national stage is confused with the filing date. It should be borne in mind that the filing date of the international stage application is also the filing date for the national stage application (MPEP § 1893.03(b))]. Applicant is required to cancel the new matter in the reply to this Office Action.

The Examiner suggests amending ¶168 of the Specification to read “when the threshold is exceeded a hazard alert is generated [[120]] 124” to maintain consistency with Applicant’s Fig. 8. Appropriate correction is required.

Claim Objections

Claim(s) 1-3, 6, 9, 12-30, 32, and 36 is/are objected to because of the following informalities:

The Examiner suggests amending claim 1 to positively recite the functions performed by the processor, in order to prevent the steps the processor is used for from being interpreted as an intended use of the processor. The Examiner notes that claim 2 positively recites a function performed by the processor using the language “the processor is configured to…” [lines 1-2]. The Examiner notes a similar suggestion for claim 55.
Lines 6-18 of claim 1 should be tabbed to the right once to indicate functions of the processor. Claim 2 should read “[[the]] a type of body part” [line 2]. Claim 9 should read “a change in [[the]] an angle of movement” [lines 4-5]. Claim 9 should read “a rate of change of [[the]] an angle of orientation” [line 6]. Claim 16 should read “[[the]] an arm of the user” [line 2].

The Examiner suggests providing an amendment to recite “a user” prior to claims 16-18, as the Examiner notes that while it is understood that reference to a body and body parts implies the body and body parts belong to a user [reciting a user in reference to the body is not specifically indefinite, but could pose antecedence issues], “a user” is not recited prior to reference of “the user” in claims 16-18.

Claim 24 should read “determining the number and amount of intensities of movement from the plurality of movements” [lines 2-3]. Claim 25 should read “determining an average amount of time spent moving in each movement of the plurality of movements and an average of the amount of time spent recovering from each movement of the plurality of movements” [lines 2-4]. The Examiner notes that each dependent claim of claims 2-3, 6, 9, and 12-30 should read “[[A]] The system…” [line 1 in each dependent claim]. Claim 32 should read “[[the]] a risk of injury of the body part” [line 6]. Claim 36 should read “[[the]] a risk of injury of the body part” [line 6]. Appropriate correction is required.

Claim Interpretation

Examiner Notes: currently, NO limitation invokes interpretation under § 112(f).

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claim(s) 9, 19, and those dependent therefrom is/are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.

Claim 9 recites the limitation “wherein determining of the false positive comprises analysing one or more of:… an orientation of the body part” [lines 3-4, 5], which is considered indefinite, as it is not clear whether the analyzed orientation of the body part is meant to define a different/separate instance of an orientation of the body part from the orientation of the body part as defined in claim 1 [line 6] or not. For examination purposes, the Examiner has interpreted either identified interpretation to be applicable in light of any prior art applied under § 102 or § 103.

Claim 19 recites the limitation “determine whether a user is sitting, and to determine arm movement when the user is sitting” [lines 2-3], which is considered indefinite, as it is not clear whether the recited user is meant to refer to the previously defined body [a user is considered to be defined by a body] of claim 1 [line 1] or define any new or separate user [the Examiner notes that “a user” has not been previously defined in claims 1, 15, 16, or 19; see similar claim objection for claims 16-18]. For examination purposes, the Examiner has interpreted the referred to user to be the same user that defines the body as recited in claim 1 in any prior art applied under § 102 or § 103.
Claim 30 recites the limitation “determine a risk of continued performance of the movement of the body part” [lines 2-3], which is considered indefinite, as it is not clear whether this limitation is meant to be interpreted to refer to a “risk” of whether the movement of the body part may be indicative of continued performance of the movement of the body part, or whether the determination is of a risk of injury by continued performance of the movement of the body part [A risk of injury by continued performance of the movement of the body part may be determined from the determined state, type of movement and repetition of movement (Applicant’s Specification ¶195)]. For examination purposes, the Examiner has interpreted that the determination is of a risk of injury by continued performance of the movement of the body part.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows:

Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claim(s) 1-3, 6, 9, 12-30, 32, 36, 41, 44, and 55 is/are rejected under 35 U.S.C. 101 because the claimed invention is directed to a judicial exception without significantly more. Each claim has been analyzed to determine whether it is directed to any judicial exceptions.
Representative claim(s) 1 [representing all independent claims] recite(s):

A system for monitoring body movement, comprising:
an inertial motion sensing unit arranged to measure movement of a body part, the sensing unit arranged to iteratively collect data representing movement of the body part over time;
a processor for:
determining an orientation of the body part based on the data;
tracking movement of the body part based on the data;
maintaining a record of a state of movement of the body part;
determining whether the body part has changed the state of movement and updating the maintained record of the state of movement when the body part has changed the state of movement;
classifying the movement of the body part according to the orientation of the body part, the current state of movement of the body part and the data during the tracked movement of the body part, wherein the classified movement comprises whether the body is: sitting, or standing, or is transitioning between sitting and standing, or transitioning between standing and sitting;
analysing the classified movement and the data to determine risk of injury of the body part.
(Emphasis added: abstract idea, additional element)

Step 2A Prong 1

Representative claim(s) 1 recites the following abstract ideas, which may be performed in the mind or by hand with the assistance of pen and paper:

“determining an orientation of the body part based on the data” – may be performed by merely observing at least a limited amount of known or previously collected data and drawing mental conclusions therefrom [Applicant’s Specification ¶¶150-151]

“tracking movement of the body part based on the data” – may be performed by merely observing at least a limited amount of known or previously collected data and drawing mental conclusions therefrom [Applicant’s Specification ¶¶150-151]

“maintaining a record of a state of movement of the body part” – may be performed by merely retaining at least a limited amount of known or previously derived movements on paper or in the mind [Applicant’s Specification ¶¶150-151]

“determining whether the body part has changed the state of movement and updating the maintained record of the state of movement when the body part has changed the state of movement” – may be performed by merely observing at least a limited amount of known or previously collected data and drawing mental conclusions therefrom; and further retaining at least a limited amount of known or previously derived movements on paper or in the mind [Applicant’s Specification ¶¶150-151]

“classifying the movement of the body part according to the orientation of the body part, the current state of movement of the body part and the data during the tracked movement of the body part, wherein the classified movement comprises whether the body is: sitting, or standing, or is transitioning between sitting and standing, or transitioning between standing and sitting” – may be performed by merely observing at least a limited amount of known or previously collected data and drawing mental conclusions therefrom based on known or previously derived relationships [Applicant’s Specification ¶¶153-154]

“analysing the classified movement and the data to determine risk of injury of the body part” – may be performed by merely observing at least a limited amount of known or previously collected data and drawing mental conclusions therefrom [Applicant’s Specification ¶159]

If a claim, under BRI, covers performance of the limitations in the mind but for the mere recitation of extra-solution activity (and otherwise generic computer elements), then the claim falls within the “Mental Processes” grouping of abstract ideas. Accordingly, the claim recites an abstract idea under Step 2A Prong 1 of the Mayo framework as set forth in the 2019 PEG. No limitations are provided that would force the complexity of any of the identified evaluation steps to be non-performable by pen-and-paper practice.

Alternatively or additionally, these steps describe the concept of using implicit mathematical formula(s) [i.e., “analysing the classified movement and the data to determine risk of injury of the body part”] to derive a conclusion based on input of data, which corresponds to concepts identified as abstract ideas by the courts [Diamond v. Diehr, 450 U.S. 175, 209 U.S.P.Q. 1 (1981); Parker v. Flook, 437 U.S. 584, 19 U.S.P.Q. 193 (1978); and In re Grams, 888 F.2d 835, 12 U.S.P.Q.2d 1824 (Fed. Cir. 1989)]. The concept of the recited limitations identified as mathematical concepts above is not meaningfully different than those mathematical concepts found by the courts to be abstract ideas.

The dependent claims merely include limitations that either further define the abstract idea [e.g. limitations relating to the data gathered or particular steps which are entirely embodied in the mental process] or amount to no more than generally linking the use of the abstract idea to a particular technological environment or field of use, because they are merely incidental or token additions to the claims that do not alter or affect how the process steps are performed.
Thus, these concepts are similar to concepts identified by the courts as abstract ideas: collecting, displaying, and manipulating data [Int. Ventures v. Cap One Financial]; collecting information, analyzing it, and displaying certain results of the collection and analysis [Electric Power Group]; collection, storage, and recognition of data [Smart Systems Innovations].

Step 2A Prong 2

The judicial exception is not integrated into a practical application. Representative claim 1 only recites additional elements of extra-solution activity [generic computer function, data gathering] without further sufficient detail that would tie the abstract portions of the claim into a specific practical application (2019 PEG p. 55 – the instant claim, for example, does not tie into a particular machine, a sufficiently particular form of data or signal collection – via the claimed extra-solution activity – or a sufficiently particular form of display or computing architecture/structure).

Dependent claim(s) 2-3, 6, 9, 12-14, 19-30 merely add detail to the abstract portions of the claim but do not otherwise encompass any additional elements which tie the claim(s) into a particular application/integration [the dependent claim(s) recite generic ‘units’ or ‘steps’ which encompass mere computer instructions to carry out an otherwise wholly abstract idea]. Dependent claim(s) 15-18 encounter substantially the same issues as the independent claim(s) from which they depend in that they encompass further generic extra-solution activity [generic data gathering] and/or generic computer elements [storage, memory per se]. Accordingly, the claim(s) are not integrated into a practical application under Step 2A Prong 2.

Step 2B

The claims do not include additional elements that are sufficient to amount to significantly more than the judicial exception.
Independent claims 1, 32, 36, 41, 44, and 55 as individual wholes fail to amount to significantly more than the judicial exception at Step 2B. As discussed above with respect to integration of the abstract idea into a practical application, the additional elements of extra-solution activity [i.e., generic computer function, data gathering] and generic computer elements cannot amount to significantly more than an abstract idea [MPEP § 2106.05(f)] and are further considered to merely implement an abstract idea on a generic computer [MPEP § 2106.05(d)(II) establishes computer-based elements which are considered to be well understood, routine, and conventional when recited at a high level of generality].

For the independent claim portions and dependent claims which provide additional elements of extra-solution data gathering, MPEP § 2106.05(g) establishes that mere data gathering for determining a result does not amount to significantly more. The extra-solution activity of processor steps [acquiring, storing signals, etc.], as presently recited, cannot provide an inventive concept which amounts to significantly more than the recited abstract idea.

For the independent claims as well as the dependent claims merely reciting generic computer elements and functions [processor recited at a high level of generality and generic functions therein], MPEP § 2106.05(d)(II) establishes computer-based elements which are considered to be well understood, routine, and conventional when recited at a high level of generality. Accordingly, the generic computer elements and functions thereof, as presently limited, cannot provide an inventive concept since they fall under a generic structure and/or function that does not add a meaningful additional feature to the judicial exception(s) of the claim(s).
Claims 1, 32, 36, 41, 44, and 55 recite “an inertial motion sensing unit arranged to measure movement of a body part, the sensing unit arranged to iteratively collect data representing movement of the body part over time” [claims 1, 32, 36, 41], “measuring movement of a body part by iteratively collect data representing movement of the body part over time using a sensor unit” [claim 44], and “a housing having an attachment device for attaching the housing to a head of a user only; an accelerometer mounted in the housing; a gyroscope mounted in the housing; an atmospheric pressure sensor mounted in the housing and arranged to measure external atmospheric pressure” [claim 55]; wherein claims 15-18 further limit the amount or location of the inertial motion sensing unit of claim 1.

Such a sensing unit is considered well-understood, routine, and conventional, as known by at least:

Applicant’s disclosure is not particular regarding the particular structure of the generically claimed sensing unit, and recites the sensing unit at a high level of generality [The sensor pack 30 (sensor unit) in this example comprises an inertial sensor and an atmospheric sensor (barometer) 30. Preferably the inertial sensor comprises one or more of an accelerometer and a gyroscope for determining movement of the body part of a user (Applicant’s Specification ¶111); In an embodiment, the sensor pack 30 comprises a multi-axis accelerometer, capable of detecting magnitude and direction of acceleration in order to determine movements of the respective body part of the user that the device 22, 23, 23', 23" is attached to. The sensor pack 30 also comprises a gyroscope capable of detecting angular velocity of the respective body part of the user that the device 22, 23, 23', 23" is attached to. Preferably the sensor pack 30 further comprises an atmospheric pressure sensor (high sensitivity barometer) configured to measure changes in altitude of the device 22, 23, 23', 23".
Additional sensors such as magnetometers, or further accelerometers, or gyroscopes may be included in the sensor pack 30 (Applicant’s Specification ¶133); The wearable item 12 shown in Figure 1A is a vest, however in an alternative, the wearable item is a clip-on tag that attaches to the user or their clothing… The wearable item is for attaching the monitoring device 22 to the user and could take on other forms, such as for example, a strap (Applicant’s Specification ¶110)].

This lack of disclosure is acceptable under 35 U.S.C. 112(a) since this hardware performs non-specialized functions known by those of ordinary skill in the medical technology arts. Thus, Applicant's specification essentially admits that this hardware is conventional and performs well understood, routine and conventional activities in the field of motion sensing. In other words, Applicant’s specification demonstrates the well-understood, routine, conventional nature of the above-identified additional element because it describes such an additional element in a manner that indicates that the additional element is sufficiently well-known that the specification does not need to describe the particulars of such additional elements to satisfy 35 U.S.C. 112(a) [see Berkheimer memo from April 19, 2018, Page 3, (III)(A)(1), not attached]. Adding hardware that performs “well understood, routine, conventional activit[ies] previously known to the industry” will not make claims patent-eligible [TLI Communications].

Strausser (US-20150045703-A1) [Inertial measurement units (IMUs) could be coupled to the leg support 212. An inertial measurement unit is generally composed of an accelerometer and a gyroscope and sometimes a magnetometer as well; in many modern sensors these devices are MEMS (Micro electromechanical systems) that have measurement in all three orthogonal axes on one or more microchips.
The behavior of IMUs is well understood in the art (IMUs being used for applications from missile guidance to robotics to cell phones to hobbyist toys); they typically provide measurement of angular orientation with respect to gravity, as well as measurement of angular velocity with respect to earth and linear acceleration, all in three axes (Strausser ¶0025)]

Mohrman (US-20170189752-A1) [Sensing devices (or “sensor platforms”) of the present disclosure can include multiple sensors, such as inertial measurement units (IMUs, such as accelerometers (e.g., one-axis, two-axis or three-axis accelerometers), gyroscopes, and magnetometers), temperature sensors, inertial sensors, force sensors, pressure sensors, Global Positioning System (GPS) receivers, and flex sensors, as well as local digital and analog signal processing hardware, storage device(s), energy source(s), and wireless transceivers integrated into apparel and/or wearable accessories relevant to bipedal motion, such as shoes, insoles, socks, leg bands, arm bands, chest straps, wrist bands/bracelets, and/or the like. Some of the aforementioned sensors, such as accelerometers, gyroscopes and magnetometers, can function as orientation sensors (Mohrman ¶0034)]

Mochizuki (US-20200388190-A1) [Therefore, it is considered that, at a maximum, six portions of the waist, the head, the both hands, and the both feet, if necessary, additionally, the fingers, are sufficient as portion where the wearable sensors 101 are worn (Mochizuki ¶0090); Further, a wearing method of the wearable sensor 101 is not particularly limited.
For example, a band, a belt, a supporter, a tape, a clip, or the like, are used (Mochizuki ¶0091); The wearable sensor 101 includes a high dynamic range (HDR) acceleration sensor 201, a low dynamic range (LDR) acceleration sensor 202, a high dynamic range (HDR) gyro sensor 203, a low dynamic range (LDR) gyro sensor 204, a geomagnetic sensor 205, a strain sensor 206, an atmospheric pressure sensor 207 (Mochizuki ¶0111)]

Examiner’s Note Regarding Particular Treatment or Prophylaxis: Claim(s) 1, 32, 36, and 44 recite subject matter regarding “determine the risk of injury of the body part”, which the Examiner notes is not considered to be a particular treatment or prophylaxis, as none of the identified claims positively recite or include language that is considered to be a particular treatment or prophylaxis as an additional element to integrate the judicial exception into a practical application or allow the identified claims to amount to significantly more than the judicial exception [MPEP § 2106.04(d)(2)].

Accordingly, the claim(s) as whole(s) fail to amount to significantly more than the judicial exception under Step 2B.

Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

Claim(s) 1, 6, 14-17, 20-21, 23-24, 28-30, 36, and 41 is/are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Petterson (US-20170296129-A1, cited by Applicant).
Regarding claim 1, Petterson teaches A system for monitoring body movement, comprising:

an inertial motion sensing unit arranged to measure movement of a body part, the sensing unit arranged to iteratively collect data representing movement of the body part over time [the wearable sensor 112 includes an inertial measurement unit (“IMU”) sensor, the wearable sensor 112 records three-dimensional motions of the worker during the day, starting with measurements directly from the three integrated sensors of the IMU… In some embodiments, the IMU takes readings from an accelerometer, gyroscope, and magnetometer, each of which measurements has an x, y, and z component. In some embodiments, sensor fusion techniques are applied to filter and integrate the nine-component sensor measurements to calculate the orientation of the single wearable sensor 112 mounted to the worker. In some embodiments, the orientation that is calculated in this manner is described by three angles: yaw, pitch, and roll (herein collectively “YPR”) (Petterson ¶0050, Fig. 2)];

a processor [the analysis may be performed by a processor of the wearable sensor 112 itself. In alternative embodiments, the analysis may be performed by another computing device (e.g., one or more of user computing devices 104) or a remote server. In the case of analyses performed by a remote server, the results may be further transmitted to one or more of user computing devices 104 (Petterson ¶0114)] for:

determining an orientation of the body part based on the data [In operation 306, the wearable sensor 112 measures movements of the worker over any range of motion. For example, in the case where the wearable sensor 112 is mounted to the user's chest, the position of the worker's back and the angle of the back with respect to the ground as a function of time. Such measurements may be taken at discrete time intervals or continuously. The measured worker movement data is saved to the data storage device 106 locally or remotely for subsequent analysis (Petterson ¶0111)];

tracking movement of the body part based on the data [Petterson ¶0111];

maintaining a record of a state of movement of the body part [Petterson ¶0111];

determining whether the body part has changed the state of movement and updating the maintained record of the state of movement when the body part has changed the state of movement [Petterson ¶0111, wherein monitoring and storing measured movement is considered to read on the claimed limitation];

classifying the movement of the body part according to the orientation of the body part, the current state of movement of the body part and the data during the tracked movement of the body part, wherein the classified movement comprises whether the body is: sitting, or standing, or is transitioning between sitting and standing, or transitioning between standing and sitting [the measurements taken by the wearable sensor 112 may be used to determine if the worker is performing one or more movements including, but not limited to, walking, running, jumping, squatting, standing upright, twisting their torso, pivoting around one foot, reaching above their head, and riding in a vehicle (Petterson ¶0112)];

analysing the classified movement and the data to determine risk of injury of the body part [In operation 310, the worker's measured movements are analyzed. As discussed above, in certain embodiments, the analysis may be performed by a processor of the wearable sensor 112 itself. In alternative embodiments, the analysis may be performed by another computing device (e.g., one or more of user computing devices 104) or a remote server (Petterson ¶0114); The data analysis may quantify risk and quality of worker movements. These characterizations may be based upon one or more of industry standards, ergonomist recommendations, and combinations thereof (Petterson ¶0120)].
Regarding claim 6, Petterson teaches A system according to claim 1, wherein determining whether the body part has the changed state of movement comprises determining whether the body part has started moving, or has finished moving [In some embodiments, a lift may be detected, for example, when the sagittal flexion begins at a local minimum of 5 degrees, goes to a local maximum of 60 degrees, and returns to a local minimum of 10 degrees; in this example, the peak of 60 degrees exceeds the MPH of 30 degrees and the difference between the local maximum and the local minimum (i.e., the 50 degree difference between the 60 degree peak and the 10 degree local minimum) exceeds the MPP of 40 degrees (Petterson ¶0115)], and the direction of movement relative to the determined orientation of the body part [Petterson ¶0050].

Regarding claim 14, Petterson teaches A system according to claim 1, wherein the processor is configured to analyse the data to determine when a lift is occurring and to determine a lifting technique used when a lift is occurring, and the processor is configured to analyse the data to determine arm movement when a lift is occurring [As discussed above, the wearable sensor 112 is securely mounted at a desired location on the body, such as the worker's chest or wrist (Petterson ¶0104); operation 310 includes detection of the frequency of lifts by a worker who is wearing the wearable sensor 112 (Petterson ¶0115); Heavy, Frequent or Awkward Lifting: Lifting object weighing more than 75 pounds once per day or more than 55 pounds more than 10 times per day. Lifting objects weighing more than 10 pounds if done more than twice per minute, more than 2 hours total per day. Lifting objects weighing more than 25 pounds above the shoulders, below the knees or at arm's length more than 25 times per day (Petterson p. 15, Table 1)].
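The lift-detection heuristic quoted from Petterson ¶0115 (sagittal flexion rising from a local minimum to a peak and returning, gated by a minimum peak height, MPH, and a minimum peak prominence, MPP) can be sketched roughly as follows; the function name and the peak-finding details are illustrative assumptions, not Petterson's implementation:

```python
def detect_lift(flexion_deg, mph=30.0, mpp=40.0):
    """Flag lifts in a sagittal-flexion trace (degrees per sample).

    Illustrative reading of Petterson's heuristic: a sample is a lift
    peak when it is a local maximum that exceeds the minimum peak
    height (MPH) and whose drop to the following local minimum
    exceeds the minimum peak prominence (MPP).
    """
    lifts = []
    for i in range(1, len(flexion_deg) - 1):
        prev, cur, nxt = flexion_deg[i - 1], flexion_deg[i], flexion_deg[i + 1]
        if prev < cur > nxt:  # local maximum
            following_min = min(flexion_deg[i:])
            if cur > mph and cur - following_min > mpp:
                lifts.append(i)
    return lifts

# Worked example from the Office Action: flexion 5 deg -> 60 deg -> 10 deg.
trace = [5, 20, 40, 60, 40, 20, 10]
print(detect_lift(trace))  # [3] -- peak 60 > MPH 30 and 60 - 10 = 50 > MPP 40
```

A real implementation would need smoothing and windowed minima rather than a global minimum over the remainder of the trace, but the sketch shows why the quoted 5/60/10-degree example satisfies both thresholds.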
Regarding claim 15, Petterson teaches A system according to claim 1, wherein the data is obtained from a single inertial motion sensing unit [FIG. 10 is a photograph of an exemplary wearable sensor 112, as engaged with a strap 206 and worn by a user in the position described above (Petterson ¶0102, Fig. 10); wherein as depicted in Petterson Fig. 10, there is only one wearable sensor 112; In operation 302, the wearable sensor(s) 112 are mounted to the worker (Petterson ¶0102), wherein reference to “sensor(s)” is considered to imply the use of only one sensor].

Regarding claim 16, Petterson teaches A system according to claim 15, wherein the sensing unit is only located on the arm of the user [the wearable sensor 112 is securely mounted at a desired location on the body, such as the worker's chest or wrist. In further embodiments, the wearable sensor 112 may be mounted to the worker's back, torso, hip, or ankle (Petterson ¶0104)].

Regarding claim 17, Petterson teaches A system according to claim 15, wherein the sensing unit is only located on the back of the user [Petterson ¶0104].

Regarding claim 20, Petterson teaches A system according to claim 1, wherein the processor is configured to determine an intensity of force used in a movement of the body part [Petterson ¶¶0050, 0052, 0111, wherein an amplitude of inertial data is considered to read on “intensity of force of movement”].

Regarding claim 21, Petterson teaches A system according to claim 1, wherein the processor is configured to determine a state of the body part, wherein the state is one or more of: moving, lifting, bending, twisting, twisting while bending and stationary [Petterson ¶0112].

Regarding claim 23, Petterson teaches A system according to claim 1, wherein the processor is configured to determine a work pattern from a plurality of movements [In some embodiments, operation 310 includes detection of the frequency of lifts by a worker who is wearing the wearable sensor 112.
The frequency of lifting is a major component of determining one's risk of lower back injury. Lifting may typically involve forward bending. In some embodiments, a lift is identified by identifying a peak in a worker's forward sagittal flexion motion. In some embodiments, when a peak in a worker's forward sagittal flexion motion occurs, a lift is identified (Petterson ¶0115)]. Regarding claim 24, Petterson teaches A system according to claim 23, wherein determining the work pattern comprises determining the number and amount of intensities of movement [In some embodiments, the determining, for the person during the activity, (a) the lift rate (Petterson ¶0042)]. Regarding claim 28, Petterson teaches A system according to claim 1, wherein the processor is configured to classify movements as one or more of the following: arm elevated more than 90 degrees [the measurements taken by the wearable sensor 112 may be used to determine if the worker is performing one or more movements including, but not limited to,… reaching above their head (Petterson ¶0112); Caution Zone Recommendations Awkward Posture Working with the hand(s) above the head, or the elbow(s) above the shoulders more than 2 hours total per day (Petterson p. 15, Table 1), wherein identifying a hazard of hand(s)/elbow(s) above head/shoulders is considered to read on classifying a movement as an arm elevated more than 90 degrees]; arm elevated more than 90 degrees for a period in excess of 30 seconds; arm elevated more than 90 degrees more than 2 times a minute; arm elevated more than 90 degrees for more than 20% of working time; and hazardous pulling and pushing. Regarding claim 29, Petterson teaches A system according to claim 1, wherein the processor is configured to classify movements as one or more of the following: twisting of back more than 30 degrees while bending more than 50 degrees; bending more than 90 degrees [Examples of measured data are illustrated in FIGS. 4A-4B. FIG. 
4A is a plot of number of bends (normalized to 8 hours) as a function of angle. A complementary representation, illustrated in FIG. 4B, plots cumulative time (in seconds, normalized to 8 hours) as a function of angle. It may be observed that motions within the range of 40 degrees-50 degrees are frequent and held for a brief time, while motions within the range of 60 degrees-70 degrees occur less frequently but are held for a longer time. Motions within the range of 80 degrees-130 degrees occur less frequently and are held for a brief time. From this, it may be inferred that bends occurring frequently and for long times represent the position a worker adopts when carrying an object, while bends occurring frequently or infrequently for short times represent transitions while an object is being lifted (Petterson ¶0113, Figs. 4A-B)]; bending more than 60 degrees for at least 20 seconds; at least 2 hazardous movements in 2 minutes; and jerky movement. Regarding claim 30, Petterson teaches A system according to claim 1, wherein the processor is configured to determine a risk of continued performance of the movement of the body part [see corresponding § 112(b) rejection and interpretation above; the activity sensing system 102 provides interventions other than in real time. In some embodiments, information is collated and presented to a website (for example, in real-time, in near-real-time, or at a predetermined availability schedule) where a customer's user (e.g., a worker or a manager) is permitted to review the analyzed data (Petterson ¶0143); FIG. 12I shows an exemplary display providing historical tracking of safety scores before and after an intervention. In the display of FIG. 12I, baseline safety scores and safety scores resulting from the intervention are shown (Petterson ¶0155)]. 
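The two views in Petterson Figs. 4A-4B quoted above (number of bends per angle range, and cumulative time per angle range) amount to a pair of histograms over bend events. A brief sketch, assuming a hypothetical event representation of (peak angle, duration) tuples rather than Petterson's actual data format:

```python
from collections import defaultdict

def bend_histograms(bends, bin_width=10):
    """Bin bend events into angle buckets, mirroring Petterson
    Figs. 4A-4B: count of bends per angle range (Fig. 4A) and
    cumulative time in seconds per angle range (Fig. 4B).

    `bends` is an assumed list of (peak_angle_degrees,
    duration_seconds) tuples, introduced here for illustration.
    """
    counts = defaultdict(int)
    seconds = defaultdict(float)
    for angle, duration in bends:
        bucket = int(angle // bin_width) * bin_width  # e.g. 47 -> 40
        counts[bucket] += 1
        seconds[bucket] += duration
    return dict(counts), dict(seconds)
```

Comparing the two outputs per bucket is what supports the quoted inference: buckets with many short-duration bends suggest lifting transitions, while buckets with long cumulative time suggest carrying postures.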
Regarding claim 36, Petterson teaches A system for monitoring body movement, comprising: an inertial motion sensing unit arranged to measure movement of a body part, the sensing unit arranged to iteratively collect data representing movement of the body part over time [the wearable sensor 112 includes an inertial measurement unit (“IMU”) sensor, the wearable sensor 112 records three-dimensional motions of the worker during the day, starting with measurements directly from the three integrated sensors of the IMU… In some embodiments, the IMU takes readings from an accelerometer, gyroscope, and magnetometer, each of which measurements has an x, y, and z component. In some embodiments, sensor fusion techniques are applied to filter and integrate the nine-component sensor measurements to calculate the orientation of the single wearable sensor 112 mounted to the worker. In some embodiments, the orientation that is calculated in this manner is described by three angles: yaw, pitch, and roll (herein collectively “YPR”) (Petterson ¶0050, Fig. 2)]; a processor receiving and analysing the data to determine a work pattern and to determine the risk of injury of the body part and/or to track the movement of the body part, wherein the work pattern comprises counting risks in a movement [In operation 310, the worker's measured movements are analyzed. As discussed above, in certain embodiments, the analysis may be performed by a processor of the wearable sensor 112 itself. In alternative embodiments, the analysis may be performed by another computing device (e.g., one or more of user computing devices 104) or a remote server (Petterson ¶0114); The data analysis may quantify risk and quality of worker movements. These characterizations may be based upon one or more of industry standards, ergonomist recommendations, and combinations thereof (Petterson ¶0120), wherein quantifying risks of movements is considered to read on counting risks of a movement]. 
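Petterson ¶0050, quoted above, describes fusing nine-component IMU data (accelerometer, gyroscope, magnetometer) into yaw, pitch, and roll. As a much-reduced static-case sketch, an illustrative assumption rather than Petterson's fusion method, orientation can be estimated from a single accelerometer and magnetometer reading; real fusion would also filter and integrate the gyroscope data, which is omitted here:

```python
import math

def orientation_from_imu(accel, mag):
    """Illustrative yaw/pitch/roll (degrees) from one accelerometer
    and one magnetometer reading, each an (x, y, z) tuple.

    Static-case simplification: pitch and roll come from the gravity
    vector (sensor at rest), and yaw from the horizontal magnetic
    field without tilt compensation.
    """
    ax, ay, az = accel
    pitch = math.atan2(-ax, math.hypot(ay, az))
    roll = math.atan2(ay, az)
    mx, my, _ = mag
    yaw = math.atan2(-my, mx)
    return (math.degrees(yaw), math.degrees(pitch), math.degrees(roll))
```

For a sensor lying flat (gravity entirely on the z axis, magnetic field along x), all three angles come out as zero, matching the neutral-posture calibration pose Petterson describes.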
Regarding claim 41, Petterson teaches A system for monitoring body movement, comprising: an inertial motion unit sensor arranged to measure movement of a body part, the sensor arranged to iteratively collect data representing movement of the body part over time [the wearable sensor 112 includes an inertial measurement unit (“IMU”) sensor, the wearable sensor 112 records three-dimensional motions of the worker during the day, starting with measurements directly from the three integrated sensors of the IMU… In some embodiments, the IMU takes readings from an accelerometer, gyroscope, and magnetometer, each of which measurements has an x, y, and z component. In some embodiments, sensor fusion techniques are applied to filter and integrate the nine-component sensor measurements to calculate the orientation of the single wearable sensor 112 mounted to the worker. In some embodiments, the orientation that is calculated in this manner is described by three angles: yaw, pitch, and roll (herein collectively “YPR”) (Petterson ¶0050, Fig. 2)]; a processor for receiving and analysing the data [the analysis may be performed by a processor of the wearable sensor 112 itself. In alternative embodiments, the analysis may be performed by another computing device (e.g., one or more of user computing devices 104) or a remote server. In the case of analyses performed by a remote server, the results may be further transmitted to one or more of user computing devices 104 (Petterson ¶0114)], the processor configured to: determine an orientation of the body part [In operation 306, the wearable sensor 112 measures movements of the worker over any range of motion. For example, in the case where the wearable sensor 112 is mounted to the user's chest, the position of the worker's back and the angle of the back with respect to the ground as a function of time. Such measurements may be taken at discrete time intervals or continuously. 
The measured worker movement data is saved to the data storage device 106 locally or remotely for subsequent analysis (Petterson ¶0111)]; maintain a record of a state of movement of the body part [Petterson ¶0111]; determine whether the body part has a changed state of movement and to update the maintained record of the state of movement when the body part has the changed state of movement [Petterson ¶0111, wherein monitoring and storing measured movement is considered to read on the claimed limitation]; classify the movement of the body part according to the orientation of the body part, the current state of movement of the body part and the data during the current state of movement of the body part [the measurements taken by the wearable sensor 112 may be used to determine if the worker is performing one or more movements including, but not limited to, walking, running, jumping, squatting, standing upright, twisting their torso, pivoting around one foot, reaching above their head, and riding in a vehicle (Petterson ¶0112)]. Claim(s) 55 is/are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Mochizuki (US-20200388190-A1). Regarding claim 55, Mochizuki teaches A sensor device for monitoring body part movements, comprising: a housing having an attachment device for attaching the housing to a head of a user only [For example, in A in FIG. 2, the wearable sensor 101-1 is worn on the head of the user, and detects motion of the head of the user… That is, motion, center of gravity, an attitude, or the like, of the body trunk of the user are detected by the wearable sensor 101-1 or the wearable sensor 101-6 (Mochizuki ¶0078, Fig. 
2)]; an accelerometer mounted in the housing [The wearable sensor 101 includes a high dynamic range (HDR) acceleration sensor 201, a low dynamic range (LDR) acceleration sensor 202, a high dynamic range (HDR) gyro sensor 203, a low dynamic range (LDR) gyro sensor 204, a geomagnetic sensor 205, a strain sensor 206, an atmospheric pressure sensor 207 (Mochizuki ¶0111)]; a gyroscope mounted in the housing [Mochizuki ¶0111]; an atmospheric pressure sensor mounted in the housing and arranged to measure external atmospheric pressure [Mochizuki ¶0111]; a processor for collecting data of measurements from the accelerometer, and determining movement of body parts of the user including the head and back of the user [The sensor data acquiring unit 209 acquires the HDR acceleration data, the LDR acceleration data, the HDR angular velocity data and the LDR angular velocity data respectively from the HDR acceleration sensor 201, the LDR acceleration sensor 202, the HDR gyro sensor 203, and the LDR gyro sensor 204, adds the acquired time to the acquired data, and supplies the data to the merge processing unit 210. Further, the sensor data acquiring unit 209 acquires the geomagnetic data from the geomagnetic sensor 205, adds the acquired time to the acquired data and supplies the data to the attitude detecting unit 211. Still further, the sensor data acquiring unit 209 acquires the strain data and the atmospheric pressure data respectively from the strain sensor 206 and the atmospheric pressure sensor 207, adds the acquired time to the acquired data, and supplies the data to the correcting unit 212 (Mochizuki ¶0122); Mochizuki ¶0078, Fig. 2].

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.

Claim(s) 2-3 is/are rejected under 35 U.S.C. 103 as being unpatentable over Petterson, as applied to claim 1 above, in view of Chowdhary (US-20120303271-A1). 
Regarding claim 2, Petterson teaches A system according to claim 1, wherein the processor is configured to determine a type of movement the body part is making from data collected during tracking of the movement of the body part and a body part type [Petterson ¶¶0111-0112]. However, while Petterson discloses that the same inertial motion sensing unit is arranged to be positionable on different body parts [As discussed above, the wearable sensor 112 is securely mounted at a desired location on the body, such as the worker's chest or wrist. In further embodiments, the wearable sensor 112 may be mounted to the worker's back, torso, hip, or ankle (Petterson ¶0104)] and that calibration and data analysis are performed with the inertial motion sensing unit at a prearranged location [In operation 304, the wearable sensor 112 is calibrated. For example, the worker presses the calibration button 212, while standing upright and still (i.e., in a neutral posture), to begin the calibration process (Petterson ¶0105)], Petterson fails to explicitly disclose wherein the processor is configured to determine the type of body part. Chowdhary discloses systems for monitoring user movement, wherein Chowdhary discloses using inertial measurements from an IMU to determine the type of body part the IMU is attached to [GPS device 102 according to aspects of the invention also includes sensors such as accelerometers, pressure sensors, gyroscopes and the like (collectively, inertial measurement unit or IMU) (Chowdhary ¶0024); Control module 402 receives tracking LPC data for motion mode classification, and may be further assisted by a time-domain classifier, such as, acceleration norm data. The output of the control module is context information, such as whether the user carrying the PND is stationary, walking, fast walking, or jogging. 
Once a non-stationary mode is detected, the position module 404 analyzes data from the various sensors, such as a compass (magnetic sensor for orientation data), accelerometer, gyroscope, and roll, pitch and yaw indicative sensors. The output of the position module 404 is the precise location of the PND, i.e. whether it is near the head, in a shirt pocket, near the waist (possibly in a holster), in a trouser pocket, in a swinging hand (i.e. the user is moving), in a backpack etc. (Chowdhary ¶0058, Fig. 4); Various time domain classifiers, such as, amplitude of pitch angle variation, number of zero crossings in norm of acceleration, number of peaks in acceleration data, and amplitude of yaw angle variation are utilized to determine the location of module on the body (Chowdhary ¶0059)]. It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the system of Petterson to employ wherein the processor is configured to determine the type of body part, so as to provide contextual information regarding the position of the inertial motion sensing unit and allow the analysis to accurately account for the body part type. Regarding claim 3, Petterson in view of Chowdhary teaches A system according to claim 2, wherein the determined type of movement is classified as being implemented in either a 'correct' or an 'incorrect' manner [Examples of measured data are illustrated in FIGS. 4A-4B. FIG. 4A is a plot of number of bends (normalized to 8 hours) as a function of angle. A complementary representation, illustrated in FIG. 4B, plots cumulative time (in seconds, normalized to 8 hours) as a function of angle. It may be observed that motions within the range of 40 degrees-50 degrees are frequent and held for a brief time, while motions within the range of 60 degrees-70 degrees occur less frequently but are held for a longer time. 
Motions within the range of 80 degrees-130 degrees occur less frequently and are held for a brief time. From this, it may be inferred that bends occurring frequently and for long times represent the position a worker adopts when carrying an object, while bends occurring frequently or infrequently for short times represent transitions while an object is being lifted. In general, a worker exhibiting good lifting technique will spend higher amounts of time at lower angles, while a worker exhibiting bad lifting technique will show a higher amount of time at higher angles (Petterson ¶0113)]. Claim(s) 9, 12-13, 22, and 26-27 is/are rejected under 35 U.S.C. 103 as being unpatentable over Petterson, as applied to claim 1 above, in view of Elhawary (US-20170347965-A1, cited by Applicant). Regarding claim 9, Petterson teaches A system according to claim 1. However, while Petterson discloses performing determinations to confirm whether a determined state change is accurately classified [for example, if a person bends such that the sagittal angle begins at a local minimum of 50 degrees, goes to a local maximum of 60 degrees, and returns to a local minimum of 50 degrees, no lift is detected, because, although the peak sagittal flexion of 60 degrees exceeds the MPH of 30 degrees, the prominence (i.e., the 10 degree difference between the 60 degree peak and the 50 degree local minimum) does not exceed the 40 degrees MPP (Petterson ¶0115)], Petterson fails to explicitly disclose wherein the analysing comprises determining whether a determined state change is a false positive, wherein determining of the false positive comprises analysing one or more of: a duration of a detected movement, a change in the angle of movement, an orientation of the body part, a rate of change of the angle of movement, a rate of change of the angle of orientation. 
Elhawary discloses systems for monitoring user movement and determining risk of injury to the user, wherein Elhawary discloses that during analysis of user movement, the analysis comprises determining whether a state change in movement of the user is a false positive, wherein determining of the false positive comprises analysing one or more of: a duration of a detected movement, a change in the angle of movement, an orientation of the body part, a rate of change of the angle of movement, a rate of change of the angle of orientation [Optionally, the method may then evaluate (450) a portion of the signals from the time period immediately before lift and immediately following the lift. This may be used, for example, to eliminate false positives prior to incorporating such results into statistics being reported. For example, when a worker bends over to lift something outside the scope of his task, such as a worker bending down to lift a pen from the floor and place it in his pocket. In such an example, the initial back bending angle and lowering of the wrist, as measured by wrist height, would indicate a lifting event. However, since the wrist would then align with hip of the worker and the back of the worker would straighten, this would not be considered a lifting event. Accordingly, the portion of the signal immediately following the lift may then clarify that the lift detected would constitute a false positive for the purpose of statistics being gathered (Elhawary ¶0043); The height of sensors is typically extracted from a barometer, or other types of altimeters. Data from these sensors tend to drift. Accordingly, the drift may be corrected by coupling the sensor data with acceleration data in the gravity direction in a Kalman filter. This may also be done by way of a low pass filter for certain types of altimeters. Further, the height detector may be calibrated by setting the height to a known value upon the initiation of a lift. 
For example, the height of a back sensor may be set to a fixed value at the beginning of each lift, regardless of whether the worker is, for example, standing on a stool (Elhawary ¶0064)]. It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the system of Petterson to employ wherein the analysing comprises determining whether a determined state change is a false positive, wherein determining of the false positive comprises analysing one or more of: a duration of a detected movement, a change in the angle of movement, an orientation of the body part, a rate of change of the angle of movement, a rate of change of the angle of orientation, so as to prevent improper analysis of movements for the purposes of determining risk of injury. Regarding claim 12, Petterson in view of Elhawary teaches A system according to claim 9, wherein classifying comprises determining a selected one of a plurality of predetermined tilting or movement techniques performed by the body part [Petterson ¶0112]. Regarding claim 13, Petterson in view of Elhawary teaches A system according to claim 12, wherein determining the technique comprises analysing the change in the angle of movement, and/or change in the orientation of the body part and/or determining whether there is an atmospheric pressure change [Petterson ¶¶0111-0112]. Regarding claim 22, Petterson teaches A system according to claim 1. However, Petterson fails to explicitly disclose wherein the processor is configured to determine that during movement of the body part, the movement is jerky. Elhawary discloses systems for monitoring user movement and determining risk of injury to the user, wherein Elhawary discloses determining whether movement of a body part of the user is jerky [The various signals evaluated upon identifying a lifting motion may then be used to detect acceleration in the vertical direction in the world frame of reference. 
Accordingly, when a worker begins a lifting process, the wrist based accelerometer may immediately detect a jerking motion as the height sensor begins to rise from its lowest position. The velocity of the rising motion may then be used as a proxy for effort applied in lifting, which in turn may be used as a proxy for determining the weight of an object lifted. Such an approach may determine both the weight of the object being lifted or, if the weight of the object is known, the fatigue of the worker lifting the object. Either approach will allow the system to determine an effective weight of the object from the perspective of the worker. Including the fatigue of the worker lifting the object in this way may further incorporate a fatigue component in evaluating risk to the worker (Elhawary ¶0079)]. It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the system of Petterson to employ wherein the processor is configured to determine that during movement of the body part, the movement is jerky, as jerky movement is considered to be indicative of certain actions that involve risk of injury. 
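Elhawary ¶0079, quoted above, describes jerk detection only qualitatively (a sudden change in acceleration at the start of a lift). One simple way to illustrate the idea, with the single-axis trace and the threshold value both introduced here as assumptions rather than Elhawary's method, is to threshold the finite-difference jerk, i.e. the rate of change of acceleration:

```python
def is_jerky(accels, dt, threshold):
    """Flag 'jerky' movement in a sampled acceleration trace.

    `accels` is a sequence of acceleration magnitudes sampled every
    `dt` seconds; movement is flagged as jerky when the
    finite-difference jerk |da/dt| between any two consecutive
    samples exceeds `threshold`. Both the representation and the
    threshold are illustrative assumptions.
    """
    for a0, a1 in zip(accels, accels[1:]):
        if abs(a1 - a0) / dt > threshold:
            return True
    return False
```

A trace that ramps smoothly stays below the threshold, while a sudden step in acceleration, like the onset of a lift described in the quotation, trips it.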
Regarding claim 26, Petterson teaches A system according to claim 1, wherein the processor is configured to determine velocity of movement [For example, in the case where the wearable sensor 112 is mounted to the user's chest, the position of the worker's back and the angle of the back with respect to the ground as a function of time (Petterson ¶0111)], hazards per hour [The risk factors indicated in Table 2 include lift rate (i.e., the number of lifting movements made per hour) (Petterson ¶0123)], frequency of arm elevation [the measurements taken by the wearable sensor 112 may be used to determine if the worker is performing one or more movements including, but not limited to,… reaching above their head (Petterson ¶0112); Caution Zone Recommendations Awkward Posture Working with the hand(s) above the head, or the elbow(s) above the shoulders more than 2 hours total per day (Petterson p. 15, Table 1)], proportion of time arm elevated [Examples of measured data are illustrated in FIGS. 4A-4B. FIG. 4A is a plot of number of bends (normalized to 8 hours) as a function of angle. A complementary representation, illustrated in FIG. 4B, plots cumulative time (in seconds, normalized to 8 hours) as a function of angle (Petterson ¶0113, Figs. 4A-B), wherein in the context of measuring arm movements, the bends may be considered applicable to levels of arm elevation], intensity of force of movement over time [Petterson ¶¶0050, 0052, 0111, wherein an amplitude of inertial data is considered to read on “intensity of force of movement”], amount of time spent moving, and amount of time spent recovering [Petterson ¶0113, Figs. 4A-B]. However, Petterson fails to explicitly disclose wherein the processor is configured to determine a shoulder limb position angle, jerkiness of movement. 
Elhawary discloses systems for monitoring user movement and determining risk of injury to the user, wherein Elhawary discloses determining a shoulder limb position angle using derived arm elevation angles [Arm elevation angles may further be used to detect lifts above the shoulder, for example (Elhawary ¶0037)] and whether movement of a body part of the user is jerky [Elhawary ¶0079]. It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the system of Petterson to employ wherein the processor is configured to determine a shoulder limb position angle, jerkiness of movement, as this modification would amount to mere application of a known technique to a known device (method, or product) ready for improvement to yield predictable results [MPEP § 2143(I)(D)] and as jerky movement is considered to be indicative of certain actions that involve risk of injury. Regarding claim 27, Petterson teaches A system according to claim 1, wherein the processor is configured to determine a back position [For example, in the case where the wearable sensor 112 is mounted to the user's chest, the position of the worker's back and the angle of the back with respect to the ground as a function of time (Petterson ¶0111)], velocity of movement [Petterson ¶0111], twisting while bending angle [In some embodiments, parameters that are relevant to the ergonomics of the worker's motions, such as sagittal position, twist position, and lateral position. 
In some embodiments, a geometric calculation is performed on the set of YPR values for the body to determine the sagittal, twist, and lateral positions (Petterson ¶0052); Petterson ¶0111], bending angle [Petterson ¶0111], hazards per hour [The risk factors indicated in Table 2 include lift rate (i.e., the number of lifting movements made per hour) (Petterson ¶0123)], frequency of bending, twisting, static posture and intensity movement [Petterson ¶¶0050, 0052, 0111], proportion of time bending twisting [Petterson ¶¶0050, 0052, 0111], and intensity of force of movement twisting while bending [Petterson ¶¶0050, 0052, 0111, wherein an amplitude of inertial data is considered to read on “intensity of force of movement”]. However, Petterson fails to explicitly disclose wherein the processor is configured to determine jerkiness of movement. Elhawary discloses systems for monitoring user movement and determining risk of injury to the user, wherein Elhawary discloses determining whether movement of a body part of the user is jerky [Elhawary ¶0079]. It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the system of Petterson to employ wherein the processor is configured to determine jerkiness of movement, as jerky movement is considered to be indicative of certain actions that involve risk of injury. Claim(s) 18 is/are rejected under 35 U.S.C. 103 as being unpatentable over Petterson, as applied to claim 15, in view of Mochizuki (US-20200388190-A1). Regarding claim 18, Petterson teaches A system according to claim 15. However, Petterson fails to explicitly disclose wherein the sensing unit is only located on the head of the user. 
Mochizuki discloses systems for monitoring user movement, wherein Mochizuki discloses a single inertial motion sensing unit arranged to measure movement of a body part, wherein the sensing unit is located on a head of a user and is configured to measure movement of body parts of the user including the head and back of the user [For example, in A in FIG. 2, the wearable sensor 101-1 is worn on the head of the user, and detects motion of the head of the user… That is, motion, center of gravity, an attitude, or the like, of the body trunk of the user are detected by the wearable sensor 101-1 or the wearable sensor 101-6 (Mochizuki ¶0078, Fig. 2); The wearable sensor 101 includes a high dynamic range (HDR) acceleration sensor 201, a low dynamic range (LDR) acceleration sensor 202, a high dynamic range (HDR) gyro sensor 203, a low dynamic range (LDR) gyro sensor 204, a geomagnetic sensor 205,… an atmospheric pressure sensor 207 (Mochizuki ¶0111)]. It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the system of Petterson to employ wherein the sensing unit is only located on the head of the user, as this modification would amount to mere simple substitution of one known element for another with similar expected results [position of inertial motion sensing unit for monitoring movement of body parts] [MPEP § 2143(I)(B)]. Claim(s) 19 is/are rejected under 35 U.S.C. 103 as being unpatentable over Petterson, as applied to claim 16 above, in view of Raghuram (US-20160058372-A1). Regarding claim 19, Petterson teaches A system according to claim 16. However, while Petterson discloses determining arm movement of the user [Petterson ¶0104], Petterson fails to explicitly disclose wherein the processor is configured to analyse the data to determine whether a user is sitting, and to determine arm movement when the user is sitting. 
Raghuram discloses systems for monitoring user movement, wherein Raghuram discloses determining whether a user is sitting based on measurements from a single inertial motion sensing unit located on the arm of the user [the fitness tracking device 100 may be a wearable device, such as a watch configured to be worn around an individual's wrist (Raghuram ¶0053); the fitness tracking device 100 may also include the motion sensing module 220. The motion sensing module 220 may include one or more motion sensors, such as an accelerometer or a gyroscope (Raghuram ¶0058); performing posture detection (e.g., detecting whether the user is sitting or standing) (Raghuram ¶0254); At block 1550 (i.e., user was determined to be sitting), intensity may be estimated based on motion data from blocks 1510 and 1520. For example, the motion data may indicate how much the user is “fidgeting” (e.g., incidental movement, swaying, etc.), so an amount of fidgeting may be detected at block 1550, and the amount of fidgeting may be passed to block 1560. Incidental movement or fidgeting may be relatively low while sitting (e.g., typing, turning a steering wheel, etc.) (Raghuram ¶0258)]. It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the system of Petterson to employ wherein the processor is configured to analyse the data to determine whether a user is sitting, and to determine arm movement when the user is sitting, so as to provide additional contextual information regarding an activity state of the user, and as this modification would amount to mere application of a known technique to a known device (method, or product) ready for improvement to yield predictable results [MPEP § 2143(I)(D)]. Claim(s) 25 is/are rejected under 35 U.S.C. 
103 as being unpatentable over Petterson, as applied to claim 23, in view of Elhawary (US-20170347965-A1, cited by Applicant) and Muller III (US-20200379567-A1), hereinafter Muller. Regarding claim 25, Petterson teaches A system according to claim 23. However, while Petterson discloses continuous monitoring over time, and monitoring time when the user is performing movements and time when the user is not performing movements/recovering from movement [For example, in the case where the wearable sensor 112 is mounted to the user's chest, the position of the worker's back and the angle of the back with respect to the ground as a function of time. Such measurements may be taken at discrete time intervals or continuously (Petterson ¶0113); the measurements taken by the wearable sensor 112 may be used to determine if the worker is performing one or more movements including, but not limited to, walking, running, jumping, squatting, standing upright, twisting their torso, pivoting around one foot, reaching above their head, and riding in a vehicle (Petterson ¶0114); Examples of measured data are illustrated in FIGS. 4A-4B. FIG. 4A is a plot of number of bends (normalized to 8 hours) as a function of angle. A complementary representation, illustrated in FIG. 4B, plots cumulative time (in seconds, normalized to 8 hours) as a function of angle (Petterson ¶0115, Figs. 4A-B)], as well as monitoring user risk of injury over months [Risk scores calculated from the measured worker movements may be further displayed to the worker, as illustrated in FIGS. 6A-6C. FIG. 6A illustrates one example of how the analysis of measured worker's movements can be plotted over time and broken out into durations of time in various risk levels, in the case the various risk levels are differentiated by color (Petterson ¶0134, Fig. 
6A)], Petterson fails to explicitly disclose wherein determining the work pattern comprises determining an average amount of time spent moving in each movement and an average of the amount of time spent recovering from each movement. Elhawary discloses systems for monitoring user movement and determining risk of injury to the user, wherein Elhawary discloses monitoring average movement over time during active and non-active states of the user as an assessment of fatigue and risk of injury [Fatigue may be further evaluated by monitoring average acceleration rates of the wrist and back of the worker over time, including during non-lifting activities, such as inventory checking or manufacturing processes. By detecting reductions in acceleration rates over time, such a method may then identify fatigue and determine potential and kinetic energies expected by a workers body (Elhawary ¶0103); For example, depending on the values for the variables underlying the risk metric, the platform may recommend… allowing for longer recovery periods between lifts (Elhawary ¶0098)]. Elhawary further discloses monitoring periods comprising days, weeks, or months [In addition to recommendations, the a platform implementing the method may generate actionable visualizations by summarizing metrics recorded over the course of an evaluation period, or over an extended period of time, by providing charts indicating high risk times of days, weeks, or months, so that specific risks may be identified and addressed (Elhawary ¶0099)]. 
Muller discloses systems for monitoring user movements, wherein Muller discloses an average time spent performing a movement as a known statistic for assessing the movement [the storage device 306 can store not only computer instructions related to providing guidance for certain movements or exercises, but also can store user data related to the performance of these movements or exercises, such as statistics related to user execution of the movement or exercise, including repetition count, average time related to repetitions and total exercise time, as well as user error(s) in the execution of the movement or exercise (Muller ¶0078)]. It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the system of Petterson to employ wherein determining the work pattern comprises determining an average amount of time spent moving in each movement and an average of the amount of time spent recovering from each movement, so as to assess user fatigue and risk of injury over periods of weeks or months.

Claim(s) 32 and 44 is/are rejected under 35 U.S.C. 103 as being unpatentable over Petterson (US-20170296129-A1, cited by Applicant) in view of Chowdhary (US-20120303271-A1). Regarding claim 32, Petterson teaches A system for monitoring body movement, comprising: an inertial motion sensing unit arranged to measure movement of a body part, the sensing unit arranged to iteratively collect data representing movement of the body part over time [the wearable sensor 112 includes an inertial measurement unit (“IMU”) sensor, the wearable sensor 112 records three-dimensional motions of the worker during the day, starting with measurements directly from the three integrated sensors of the IMU… In some embodiments, the IMU takes readings from an accelerometer, gyroscope, and magnetometer, each of which measurements has an x, y, and z component.
In some embodiments, sensor fusion techniques are applied to filter and integrate the nine-component sensor measurements to calculate the orientation of the single wearable sensor 112 mounted to the worker. In some embodiments, the orientation that is calculated in this manner is described by three angles: yaw, pitch, and roll (herein collectively “YPR”) (Petterson ¶0050, Fig. 2)]; a processor receiving and analysing the data to determine the risk of injury of the body part and/or to track the movement of the body part [In operation 306, the wearable sensor 112 measures movements of the worker over any range of motion. For example, in the case where the wearable sensor 112 is mounted to the user's chest, the position of the worker's back and the angle of the back with respect to the ground as a function of time. Such measurements may be taken at discrete time intervals or continuously. The measured worker movement data is saved to the data storage device 106 locally or remotely for subsequent analysis (Petterson ¶0111); In operation 310, the worker's measured movements are analyzed. As discussed above, in certain embodiments, the analysis may be performed by a processor of the wearable sensor 112 itself. In alternative embodiments, the analysis may be performed by another computing device (e.g., one or more of user computing devices 104) or a remote server (Petterson ¶0114); The data analysis may quantify risk and quality of worker movements. These characterizations may be based upon one or more of industry standards, ergonomist recommendations, and combinations thereof (Petterson ¶0120)]. However, while Petterson discloses that the same inertial motion sensing unit is arranged to be positionable on different body parts [As discussed above, the wearable sensor 112 is securely mounted at a desired location on the body, such as the worker's chest or wrist. 
In further embodiments, the wearable sensor 112 may be mounted to the worker's back, torso, hip, or ankle (Petterson ¶0104)] and that a calibration and data analysis is performed with the inertial motion sensing unit at a prearranged location [In operation 304, the wearable sensor 112 is calibrated. For example, the worker presses the calibration button 212, while standing upright and still (i.e., in a neutral posture), to begin the calibration process (Petterson ¶0105)], Petterson fails to explicitly disclose wherein the processor analyzes the data to determine which body part the inertial motion sensing unit is attached to. Chowdhary discloses systems for monitoring user movement, wherein Chowdhary discloses using inertial measurements from an IMU to determine the type of body part the IMU is attached to [GPS device 102 according to aspects of the invention also includes sensors such as accelerometers, pressure sensors, gyroscopes and the like (collectively, inertial measurement unit or IMU) (Chowdhary ¶0024); Control module 402 receives tracking LPC data for motion mode classification, and may be further assisted by a time-domain classifier, such as, acceleration norm data. The output of the control module is context information, such as whether the user carrying the PND is stationary, walking, fast walking, or jogging. Once a non-stationary mode is detected, the position module 404 analyzes data from the various sensors, such as a compass (magnetic sensor for orientation data), accelerometer, gyroscope, and roll, pitch and yaw indicative sensors. The output of the position module 404 is the precise location of the PND, i.e. whether it is near the head, in a shirt pocket, near the waist (possibly in a holster), in a trouser pocket, in a swinging hand (i.e. the user is moving), in a backpack etc. (Chowdhary ¶0058, Fig. 
4); Various time domain classifiers, such as, amplitude of pitch angle variation, number of zero crossings in norm of acceleration, number of peaks in acceleration data, and amplitude of yaw angle variation are utilized to determine the location of module on the body (Chowdhary ¶0059)]. It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the system of Petterson to employ wherein the processor analyzes the data to determine which body part the inertial motion sensing unit is attached to, so as to provide contextual information regarding the position of the inertial motion sensing unit to allow for body type accurate analysis. Regarding claim 44, Petterson teaches A method of monitoring body movement, comprising: measuring movement of a body part by iteratively collect data representing movement of the body part over time using a sensor unit [the wearable sensor 112 includes an inertial measurement unit (“IMU”) sensor, the wearable sensor 112 records three-dimensional motions of the worker during the day, starting with measurements directly from the three integrated sensors of the IMU… In some embodiments, the IMU takes readings from an accelerometer, gyroscope, and magnetometer, each of which measurements has an x, y, and z component. In some embodiments, sensor fusion techniques are applied to filter and integrate the nine-component sensor measurements to calculate the orientation of the single wearable sensor 112 mounted to the worker. In some embodiments, the orientation that is calculated in this manner is described by three angles: yaw, pitch, and roll (herein collectively “YPR”) (Petterson ¶0050, Fig. 2)]; analysing the data to determine risk of injury of the body part or to track the movement of the body part [In operation 306, the wearable sensor 112 measures movements of the worker over any range of motion. 
For example, in the case where the wearable sensor 112 is mounted to the user's chest, the position of the worker's back and the angle of the back with respect to the ground as a function of time. Such measurements may be taken at discrete time intervals or continuously. The measured worker movement data is saved to the data storage device 106 locally or remotely for subsequent analysis (Petterson ¶0111); In operation 310, the worker's measured movements are analyzed. As discussed above, in certain embodiments, the analysis may be performed by a processor of the wearable sensor 112 itself. In alternative embodiments, the analysis may be performed by another computing device (e.g., one or more of user computing devices 104) or a remote server (Petterson ¶0114); The data analysis may quantify risk and quality of worker movements. These characterizations may be based upon one or more of industry standards, ergonomist recommendations, and combinations thereof (Petterson ¶0120)]. However, while Petterson discloses that the same inertial motion sensing unit is arranged to be positionable on different body parts [As discussed above, the wearable sensor 112 is securely mounted at a desired location on the body, such as the worker's chest or wrist. In further embodiments, the wearable sensor 112 may be mounted to the worker's back, torso, hip, or ankle (Petterson ¶0104)] and that a calibration and data analysis is performed with the inertial motion sensing unit at a prearranged location [In operation 304, the wearable sensor 112 is calibrated. For example, the worker presses the calibration button 212, while standing upright and still (i.e., in a neutral posture), to begin the calibration process (Petterson ¶0105)], Petterson fails to explicitly disclose determining which body part the sensor unit is attached to based on the collected data. 
Chowdhary discloses systems for monitoring user movement, wherein Chowdhary discloses using inertial measurements from an IMU to determine the type of body part the IMU is attached to [GPS device 102 according to aspects of the invention also includes sensors such as accelerometers, pressure sensors, gyroscopes and the like (collectively, inertial measurement unit or IMU) (Chowdhary ¶0024); Control module 402 receives tracking LPC data for motion mode classification, and may be further assisted by a time-domain classifier, such as, acceleration norm data. The output of the control module is context information, such as whether the user carrying the PND is stationary, walking, fast walking, or jogging. Once a non-stationary mode is detected, the position module 404 analyzes data from the various sensors, such as a compass (magnetic sensor for orientation data), accelerometer, gyroscope, and roll, pitch and yaw indicative sensors. The output of the position module 404 is the precise location of the PND, i.e. whether it is near the head, in a shirt pocket, near the waist (possibly in a holster), in a trouser pocket, in a swinging hand (i.e. the user is moving), in a backpack etc. (Chowdhary ¶0058, Fig. 4); Various time domain classifiers, such as, amplitude of pitch angle variation, number of zero crossings in norm of acceleration, number of peaks in acceleration data, and amplitude of yaw angle variation are utilized to determine the location of module on the body (Chowdhary ¶0059)]. It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the method of Petterson to employ determining which body part the sensor unit is attached to based on the collected data, so as to provide contextual information regarding the position of the inertial motion sensing unit to allow for body type accurate analysis. 
Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to SEVERO ANTONIO P LOPEZ whose telephone number is (571)272-7378. The examiner can normally be reached M-F 9-6 EST. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Charles Marmor II can be reached at (571) 272-4730. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/SEVERO ANTONIO P LOPEZ/Examiner, Art Unit 3791
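For readers tracing the claim 32/44 combination above, the rejection leans on Chowdhary's time-domain classification (¶0059): pitch/yaw variation amplitude, zero crossings, and peak counts of the acceleration norm are used to infer where on the body the sensor sits. The sketch below is illustrative only; the features follow Chowdhary's list, but the thresholds and the two-location rule set are our assumptions, not anything disclosed in Chowdhary or Petterson.

```python
import numpy as np

def zero_crossings(x):
    """Count sign changes in a 1-D signal."""
    return int(np.sum(np.signbit(x[:-1]) != np.signbit(x[1:])))

def body_location_features(accel_norm, pitch_deg, yaw_deg):
    """Time-domain features of the kind Chowdhary ¶0059 lists:
    amplitude of pitch/yaw angle variation, zero crossings and
    peak count of the (gravity-removed) acceleration norm."""
    dynamic = accel_norm - np.mean(accel_norm)  # crude gravity removal
    # Local maxima above 0.5 m/s^2 (threshold is an assumption)
    peaks = int(np.sum((dynamic[1:-1] > dynamic[:-2]) &
                       (dynamic[1:-1] > dynamic[2:]) &
                       (dynamic[1:-1] > 0.5)))
    return {
        "pitch_amp": float(np.ptp(pitch_deg)),
        "yaw_amp": float(np.ptp(yaw_deg)),
        "zc": zero_crossings(dynamic),
        "peaks": peaks,
    }

def classify_location(f):
    """Toy rule set (thresholds are illustrative, not Chowdhary's):
    a swinging hand shows large pitch swings and frequent zero
    crossings; a waist mount moves with the body trunk."""
    if f["pitch_amp"] > 60 and f["zc"] > 20:
        return "swinging hand"
    if f["pitch_amp"] < 15 and f["peaks"] > 5:
        return "waist"
    return "unknown"
```

In a real system the rules would be replaced by a trained classifier, but the feature set is the part the Office Action actually relies on.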

Prosecution Timeline

Feb 26, 2024
Application Filed
Feb 10, 2026
Non-Final Rejection — §101, §102, §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12575781
PORTABLE AND WEARABLE ELECTROMYOGRAPHIC BIOFEEDBACK FOR SPINAL CORD INJURY TO ENHANCE NEUROPLASTICITY
2y 5m to grant Granted Mar 17, 2026
Patent 12549134
NON-CONTACT SENSING NODE, SYSTEMS AND METHODS OF REMOTE SENSING
2y 5m to grant Granted Feb 10, 2026
Patent 12543972
BIOMECHANICAL MEASUREMENT DEVICES AND USES THEREOF FOR PHENOTYPE-GUIDED MOVEMENT ASSESSMENT, INTERVENTION, AND ACTIVE ASSISTANCE DEVICE CONTROL
2y 5m to grant Granted Feb 10, 2026
Patent 12419554
PRECISE ARTERIAL BLOOD SAMPLING DEVICE
2y 5m to grant Granted Sep 23, 2025
Patent 12408901
INTRAUTERINE TISSUE COLLECTION INSTRUMENT
2y 5m to grant Granted Sep 09, 2025
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

1-2
Expected OA Rounds
32%
Grant Probability
65%
With Interview (+33.4%)
3y 6m
Median Time to Grant
Low
PTA Risk
Based on 149 resolved cases by this examiner. Grant probability derived from career allow rate.
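The arithmetic behind these projections appears straightforward to reproduce: the 32% baseline is the examiner's career allow rate (47 granted of 149 resolved), and the 65% figure adds the observed +33.4% interview lift. A minimal sketch follows; the additive-lift model and the 100% cap are our assumptions, not the tool's disclosed method.

```python
def grant_projection(allowed, resolved, interview_lift=0.334):
    """Baseline grant probability from the examiner's career allow
    rate, plus an additive interview lift (capped at 100%)."""
    base = allowed / resolved
    return base, min(base + interview_lift, 1.0)

base, with_interview = grant_projection(47, 149)
print(f"{base:.0%} baseline, {with_interview:.0%} with interview")
# → 32% baseline, 65% with interview
```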
