Detailed Action
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
Priority
Receipt is acknowledged of certified copies of papers required by 37 CFR 1.55.
Claim Objections
Claims 5-6 are objected to because of the following informalities: these claims contain limitations with elements that are not written out in their entirety as well as limitations that are missing proper punctuation to connect elements.
Claims 5 and 6 are each objected to because of the following informality: the limitation “the instructions to” recited in line 2 of each claim should end in a colon in order to connect the limitation to the elements that follow it within the claim. This correction would recite “the instructions to:”.
Appropriate correction is required.
Claim Rejections - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claim 2 is rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.
Regarding Claim 2, it is unclear what the limitations “obtained in verification” recited in lines 4-5 and “measured in verification” recited in line 7 of the claim mean. It is unclear what is being verified. Is the parameter being verified? Is the knee flexion angle being verified? Is the subject being verified? Is the gait itself being verified? What does verification entail, and what is it being measured against? The terminology within these limitations is indefinite because it is unclear how “verification” should be interpreted.
Claims not explicitly rejected above are rejected due to their dependence on the above claims.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1-10 are rejected under 35 U.S.C. 101 because the claimed invention is directed to non-statutory subject matter. The claim(s) as a whole, considering all claim elements both individually and in combination, do not amount to significantly more than an abstract idea. A streamlined analysis of Claim 9 follows.
STEP 1
Regarding Claim 9, the claim recites a series of steps or acts, including acquiring feature amount data including a feature amount to be used for estimating a parameter regarding a knee flexion angle of a user; inputting the parameter regarding the knee flexion angle to a machine learning model; and displaying information according to the parameter regarding the knee flexion angle output on a screen of a mobile terminal used by the user. Thus, the claim is directed to a process, which is one of the statutory categories of invention.
STEP 2A, PRONG ONE
The claim is then analyzed to determine whether it is directed to any judicial exception. The steps of estimating a parameter regarding a knee flexion angle of a user and displaying information according to the parameter regarding the knee flexion angle output on a screen of a mobile terminal used by the user set forth a judicial exception. These steps describe a concept performed in the human mind (including an observation, evaluation, judgment, opinion). Thus, the claim is drawn to a Mental Process, which is an Abstract Idea.
STEP 2A, PRONG TWO
Next, the claim as a whole is analyzed to determine whether the claim recites additional elements that integrate the judicial exception into a practical application. The claim fails to recite an additional element or a combination of additional elements that applies, relies on, or uses the judicial exception in a manner that imposes a meaningful limitation on the judicial exception. Claim 9 fails to recite any application of estimating a parameter regarding a knee flexion angle of a user and displaying information according to the parameter regarding the knee flexion angle output on a screen of a mobile terminal used by the user in a manner that imposes a meaningful limitation on the Abstract Idea. The Abstract Idea alone does not provide an improvement to the technological field; the method does not effect a particular treatment or a particular change based on an estimated parameter regarding a knee flexion angle of a user or on displaying information according to the parameter regarding the knee flexion angle output; nor does the method use a particular machine to perform the Abstract Idea.
STEP 2B
Next, the claim as a whole is analyzed to determine whether any element, or combination of elements, is sufficient to ensure that the claim amounts to significantly more than the exception. Besides the Abstract Idea, Claim 9 recites the additional steps of acquiring feature amount data and inputting the parameter regarding the knee flexion angle to a machine learning model. The acquiring and inputting steps are recited at a high level of generality such that they amount to insignificant pre-solution activity, e.g., a mere data gathering step necessary to perform the Abstract Idea. When recited at this high level of generality, there is no meaningful limitation, such as a particular or unconventional step, that distinguishes them from the well-understood, routine, and conventional data gathering activity engaged in by medical professionals prior to Applicant's invention. Furthermore, it is well established that the mere physical or tangible nature of additional elements such as the acquiring and inputting steps does not automatically confer eligibility on a claim directed to an abstract idea (see, e.g., Alice Corp. v. CLS Bank Int'l, 134 S. Ct. 2347, 2358-59 (2014)).
Consideration of the additional elements as a combination also adds no other meaningful limitations to the exception not already present when the elements are considered separately. Unlike the eligible claim in Diehr in which the elements limiting the exception are individually conventional, but taken together act in concert to improve a technical field, the claim here does not provide an improvement to the technical field. Even when viewed as a combination, the additional elements fail to transform the exception into a patent-eligible application of that exception. Thus, the claim as a whole does not amount to significantly more than the exception itself. The claim is therefore drawn to non-statutory subject matter.
Regarding Claim 1, the claim recites a series of components, including a memory configured to store instructions, and a processor configured to execute instructions to acquire feature amount data to be used for estimating a parameter regarding a knee flexion angle of a user, and display information related to a parameter regarding the knee flexion angle. Thus, the claim is directed to a machine, which is one of the statutory categories of invention. The steps of estimating an output and outputting information set forth a judicial exception. These steps describe a concept performed in the human mind (including an observation, evaluation, judgment, opinion). Thus, the claim is drawn to a Mental Process, which is an Abstract Idea. Additionally, the device recited in the claim is a generic device comprising generic components configured to perform the abstract idea. The recited “index value estimation device” is a generic device configured to perform acquiring feature amount data and estimating an output obtained by inputting data as mere pre-solution data gathering, and outputting information related to an estimated index value as mere post-solution data gathering; and the “memory”, the “processor”, and the “machine learning model” are generic computer components and programs configured to perform storing an estimation model that outputs data, storing instructions, and executing instructions, as well as to perform the Abstract Idea. According to section 2106.05(f) of the MPEP, merely using a computer as a tool to perform an abstract idea does not integrate the Abstract Idea into a practical application.
Regarding Claim 10, the claim recites a non-transitory recording medium as a component configured to store a program that causes a computer to acquire feature amount data to be used for estimating a parameter regarding a knee flexion angle of a user, input the parameter regarding the knee flexion angle to a machine learning model, and display information related to the parameter regarding the knee flexion angle. Thus, the claim is directed to a machine, which is one of the statutory categories of invention. The steps of estimating an output and outputting information set forth a judicial exception. These steps describe a concept performed in the human mind (including an observation, evaluation, judgment, opinion). Thus, the claim is drawn to a Mental Process, which is an Abstract Idea. Additionally, the claim recites a series of steps or acts, including acquiring feature amount data including a feature amount to be used for estimating a parameter regarding a knee flexion angle of a user; inputting the parameter regarding the knee flexion angle to a machine learning model; and displaying information according to the parameter regarding the knee flexion angle output. Thus, the claim is also directed to a process, which is one of the statutory categories of invention. The device recited in the claim is a generic device comprising generic components configured to perform the abstract idea. The recited “non-transitory program recording medium” and “machine learning model” are generic computer devices with generic computer programs that are configured to perform acquiring feature amount data; inputting the parameter into a machine learning model; and displaying information regarding the parameter as mere post-solution data gathering. The generic computer and its program are configured to perform the above as well as the Abstract Idea.
According to section 2106.05(f) of the MPEP, merely using a computer as a tool to perform an abstract idea does not integrate the Abstract Idea into a practical application.
Dependent Claims 2-8 fail to add something more to the abstract independent claims, as they generally recite steps pertaining to data gathering and processing. Regarding Claims 2-7, a processor and a machine learning model are recited at such a high level of generality that they amount to a generic computer and a generic computer program. The step of estimating an output is mere pre-solution data gathering. The step of displaying recommendation information is mere post-solution data gathering. Regarding Claim 8, a “data acquisition device” configured to measure spatial acceleration and spatial angular velocity is mere pre-solution data gathering.
The acquiring, inputting, estimating, and displaying steps recited in the independent claims, Claims 1 and 9-10, maintain a high level of generality even when considered in combination with the dependent claims.
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claims 1-2, 5, and 7-10 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Marcus et al. '779 (U.S. Patent Application 20220051779 – cited by applicant).
Regarding Claim 1, Marcus et al. '779 discloses an index estimation device (Paragraph [0043] - the pose module 221 may estimate one or more internal forces of the lower-extremity musculoskeletal system, and in some cases may combine those force estimate(s) into a single numerical value that estimates knee health of the wearer in real-time; Paragraph [0053] - the numerical value that estimates knee health of the wearer may be a knee index) comprising:
a memory storing instructions (Paragraph [0123] - The computer readable storage medium can be a tangible device that can retain and store instructions for use by an instruction execution device); and
a processor connected to the memory and configured to execute the instructions (Paragraph [0122] - The present invention may be a system, a method, and/or a computer program product. The computer program product may include a computer readable storage medium (or media) having computer readable program instructions thereon for causing a processor to carry out aspects of the present invention) to:
acquire feature amount data including a feature amount to be used for estimating a parameter regarding a knee flexion angle of a user, the feature amount being extracted from sensor data related to a motion of a foot of the user (Paragraph [0041] – entire paragraph; Paragraph [0043] - the pose module 221 may estimate one or more internal forces of the lower-extremity musculoskeletal system, and in some cases may combine those force estimate(s) into a single numerical value that estimates knee health of the wearer in real-time; Paragraph [0053] - the numerical value that estimates knee health of the wearer may be a knee index);
input the parameter regarding the knee flexion angle to a machine learning model that output the parameter regarding the knee flexion angle in response to input of the feature amount data (Paragraph [0027] - Compact implementations of the goniometers generally can measure motion in 2 degrees of freedom, but may not perform as well at measuring small motions such as the rotation about the long axis of the knee or the anterior draw (fore-aft motion of the knee); Paragraph [0038] - In some embodiments, sensor signals 230 from the one or more sensors 214 are applied to a model to determine the values. In some embodiments, the model comprises a machine learning model 224, which may be trained using sensor data from the wearer and/or one or more other wearers. The values may be represented in any suitable forms, such as pressure set points, dimensions or sizes of the one or more actuatable components 216 (e.g., height values), and so forth); and
display information according to the parameter regarding the knee flexion angle output from the machine learning model in response to the input of the feature amount data on a screen of a mobile terminal used by the user (Paragraph [0038] - In some embodiments, the mobile computing device 125 comprises a component geometry module 222 that is configured similarly to the component geometry module 210; Paragraph [0040] - In some embodiments, the memory 220 comprises a pose module 221 that is configured to determine and/or predict poses for one or more body parts of the wearer using the sensor signals 230. As defined herein, a “pose” represents a disposition of one or more body parts of the wearer in two-dimensional or three-dimensional space. In some embodiments, each pose includes position information and orientation information for the one or more body parts; Paragraph [0056] - In one example, the target poses may be displayed on the mobile computing device 125 and the wearer may graphically adjust the target poses (e.g., rotating a graphical representation of the leg to adjust an angle of the target poses)).
Regarding Claim 2, Marcus et al. '779 discloses wherein the machine learning model is generated by a machine learning using training data having, as an explanatory variable, a feature amount used for estimating the parameter regarding the knee flexion angle extracted from the sensor data obtained in verification regarding a gait of each of a plurality of subjects, and having, as an objective variable, a measured value of the parameter regarding the knee flexion angle actually measured in verification regarding a gait of each of the plurality of subjects (Paragraph [0038] - In some embodiments, the model comprises a machine learning model 224, which may be trained using sensor data from the wearer and/or one or more other wearers. The values may be represented in any suitable forms, such as pressure set points, dimensions or sizes of the one or more actuatable components 216 (e.g., height values), and so forth; Paragraph [0114] - In some embodiments, the parameters for the gait controller 805 may be updated by an overall system controller. Thus, the parameters for the gait controller 805 may be determined locally or remotely. For example, a cloud-based analytics system may identify a desired modification to the external geometry of the body-wearable actuatable components for different poses, during gait cycles, and so forth. The modification may then be translated to an adjustment of the target parameters of the gait controller 805. Alternatively, the target parameters may be tuned locally based on the control of the foot-wearable apparatus; Paragraph [0116] - At block 1015, the different phases of a gait cycle are detected using the one or more sensor signals; Paragraph [0131] - receive data from body-worn actuatable components of sensor devices, and generate values for a desired external geometry using a machine-learning algorithm).
Regarding Claim 5, Marcus et al. '779 discloses displaying recommendation information according to the estimation result of the parameter regarding the knee flexion angle on the screen of the mobile terminal (Paragraph [0056] - In one example, the target poses may be displayed on the mobile computing device 125 and the wearer may graphically adjust the target poses (e.g., rotating a graphical representation of the leg to adjust an angle of the target poses). In another example, the wearer may specify the adjustment to be made (e.g., increase inflation by 10%). In another example, the wearer may indicate a location of pain or discomfort, and the pose module 221 determines adjustment(s) to target pose(s) that are executed using the component geometry module 222).
Regarding Claim 7, Marcus et al. '779 discloses that the output is estimated by machine learning (Paragraph [0038] - In some embodiments, the model comprises a machine learning model 224, which may be trained using sensor data from the wearer and/or one or more other wearers), and
the information is used for decision making to address the knee state of the user (Paragraph [0131] - For example, the component geometry application could execute on a computing system in the cloud, receive data from body-worn actuatable components of sensor devices, and generate values for a desired external geometry using a machine-learning algorithm).
Regarding Claim 8, Marcus et al. '779 discloses a data acquisition device configured to measure a spatial acceleration and a spatial angular velocity, and generate the sensor data based on the spatial acceleration and the spatial angular velocity (Paragraph [0027] - Measurement devices used may include inertial measurement units (IMU) (measuring acceleration and rotation to estimate angle in space) and/or goniometers (measuring joint angle). The IMU typically senses some combination of acceleration (in up to six axes), gyroscopes (rotational velocity), magnetometer (heading), and barometers (altitude)).
Regarding Claim 9, Marcus et al. '779 discloses an estimation method executed by a computer (Paragraph [0035] - each module includes program code that is executable by the one or more computer processors 206; Paragraph [0043] - the pose module 221 may estimate one or more internal forces of the lower-extremity musculoskeletal system, and in some cases may combine those force estimate(s) into a single numerical value that estimates knee health of the wearer in real-time; Paragraph [0053] - the numerical value that estimates knee health of the wearer may be a knee index), the method comprising:
acquiring feature amount data including a feature amount to be used for estimating a parameter regarding a knee flexion angle of a user, the feature amount being extracted from sensor data related to a motion of a foot of the user (Paragraph [0041] – entire paragraph; Paragraph [0043] - the pose module 221 may estimate one or more internal forces of the lower-extremity musculoskeletal system, and in some cases may combine those force estimate(s) into a single numerical value that estimates knee health of the wearer in real-time; Paragraph [0053] - In some embodiments, the numerical value that estimates knee health of the wearer may be a knee index that characterizes articular cartilage loading at the tibio-femoral interface, represented by the resultant knee adduction moments (KAM) and knee flexion moments (KFM). The cumulative damage caused by these two loading conditions contribute to progression of knee osteoarthritis and impulse external moments (time integral over mid-stance phase) may be better predictors than peak external moments for disease progression);
inputting the parameter regarding the knee flexion angle to a machine learning model that output the parameter regarding the knee flexion angle in response to input of the feature amount data (Paragraph [0027] - Compact implementations of the goniometers generally can measure motion in 2 degrees of freedom, but may not perform as well at measuring small motions such as the rotation about the long axis of the knee or the anterior draw (fore-aft motion of the knee); Paragraph [0038] - In some embodiments, sensor signals 230 from the one or more sensors 214 are applied to a model to determine the values. In some embodiments, the model comprises a machine learning model 224, which may be trained using sensor data from the wearer and/or one or more other wearers. The values may be represented in any suitable forms, such as pressure set points, dimensions or sizes of the one or more actuatable components 216 (e.g., height values), and so forth); and
displaying information according to the parameter regarding the knee flexion angle output from the machine learning model in response to the input of the feature amount data on a screen of a mobile terminal used by the user (Paragraph [0038] - In some embodiments, the mobile computing device 125 comprises a component geometry module 222 that is configured similarly to the component geometry module 210; Paragraph [0040] - In some embodiments, the memory 220 comprises a pose module 221 that is configured to determine and/or predict poses for one or more body parts of the wearer using the sensor signals 230. As defined herein, a “pose” represents a disposition of one or more body parts of the wearer in two-dimensional or three-dimensional space. In some embodiments, each pose includes position information and orientation information for the one or more body parts; Paragraph [0056] - In one example, the target poses may be displayed on the mobile computing device 125 and the wearer may graphically adjust the target poses (e.g., rotating a graphical representation of the leg to adjust an angle of the target poses)).
Regarding Claim 10, Marcus et al. '779 discloses a non-transitory program recording medium (Paragraph [0035] - each module includes program code that is executable by the one or more computer processors 206; Paragraph [0043] - the pose module 221 may estimate one or more internal forces of the lower-extremity musculoskeletal system, and in some cases may combine those force estimate(s) into a single numerical value that estimates knee health of the wearer in real-time; Paragraph [0123] - A non-exhaustive list of more specific examples of the computer readable storage medium includes the following: a portable computer diskette, a hard disk, a random access memory (RAM)) causing a computer to perform the following:
acquiring feature amount data including a feature amount to be used for estimating a parameter regarding a knee flexion angle of a user, the feature amount being extracted from sensor data related to a motion of a foot of the user (Paragraph [0041] – entire paragraph; Paragraph [0043] - the pose module 221 may estimate one or more internal forces of the lower-extremity musculoskeletal system, and in some cases may combine those force estimate(s) into a single numerical value that estimates knee health of the wearer in real-time; Paragraph [0053] - In some embodiments, the numerical value that estimates knee health of the wearer may be a knee index that characterizes articular cartilage loading at the tibio-femoral interface, represented by the resultant knee adduction moments (KAM) and knee flexion moments (KFM). The cumulative damage caused by these two loading conditions contribute to progression of knee osteoarthritis and impulse external moments (time integral over mid-stance phase) may be better predictors than peak external moments for disease progression);
inputting the parameter regarding the knee flexion angle to a machine learning model that output the parameter regarding the knee flexion angle in response to input of the feature amount data (Paragraph [0027] - Compact implementations of the goniometers generally can measure motion in 2 degrees of freedom, but may not perform as well at measuring small motions such as the rotation about the long axis of the knee or the anterior draw (fore-aft motion of the knee); Paragraph [0038] - In some embodiments, sensor signals 230 from the one or more sensors 214 are applied to a model to determine the values. In some embodiments, the model comprises a machine learning model 224, which may be trained using sensor data from the wearer and/or one or more other wearers. The values may be represented in any suitable forms, such as pressure set points, dimensions or sizes of the one or more actuatable components 216 (e.g., height values), and so forth); and
displaying information according to the parameter regarding the knee flexion angle output from the machine learning model in response to the input of the feature amount data on a screen of a mobile terminal used by the user (Paragraph [0038] - In some embodiments, the mobile computing device 125 comprises a component geometry module 222 that is configured similarly to the component geometry module 210; Paragraph [0040] - In some embodiments, the memory 220 comprises a pose module 221 that is configured to determine and/or predict poses for one or more body parts of the wearer using the sensor signals 230. As defined herein, a “pose” represents a disposition of one or more body parts of the wearer in two-dimensional or three-dimensional space. In some embodiments, each pose includes position information and orientation information for the one or more body parts; Paragraph [0056] - In one example, the target poses may be displayed on the mobile computing device 125 and the wearer may graphically adjust the target poses (e.g., rotating a graphical representation of the leg to adjust an angle of the target poses)).
It is noted that Marcus et al. '779 discloses that elements from different embodiments may be combined with one another (Paragraph [0120] - However, the scope of the present disclosure is not limited to specific described embodiments. Instead, any combination of the features and elements described herein, whether related to different embodiments or not, is contemplated to implement and practice contemplated embodiments).
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 3 and 4 are rejected under 35 U.S.C. 103 as being unpatentable over Marcus et al. '779 (U.S. Patent Application 20220051779 – cited by applicant) as applied to Claim 1 above, in view of Zhang et al. '290 (CN Patent Application 112107290 – cited by applicant).
Regarding Claim 3, Marcus et al. '779 discloses an estimation model that estimates a parameter related to a knee state (Paragraph [0044] – entire paragraph; Paragraph [0045] - Combination of the sensor signals and pre-measured or estimated biomechanical data of the wearer may be used to calculate the reaction forces and moments acting at the knee), but fails to disclose that the machine learning model is configured to estimate the parameter regarding the knee flexion angle associated with two peaks appearing in time series data of the knee flexion angle for one gait cycle. Zhang et al. '290 teaches estimating a knee angle of a user by analyzing peaks obtained from a user's foot motion within a user's gait cycle (Page 10 Paragraph 11 - indicating four peak values in two gait periods. As is known in the art, peaks 71 and 75 indicate the heel landing, and the peaks 73 and 77 represent the toe-to-ground portions of the subject gait cycle. The segment 78a between the reference numerals 71 and 73 and the segment 78b between the reference numerals 75 and 77 are the standing periods of each gait cycle, and are periods useful for calculating KAM). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the device of Marcus et al. '779 to include a learning model that considers a relationship between two peaks in a user's gait cycle caused by the user's foot motion, in order to gather information regarding a knee's position at various time points within a gait and discard time periods that are not preferred, as taught by Zhang et al. '290 (Page 9 Paragraph 12 - As will be understood by those skilled in the art, KAM is present only between the leg and the ground contact (i.e., standing period). In order to eliminate the unnecessary swing period part (if it contains these parts, it may cause abnormal prediction and additional calculation burden), and also in order to provide clear and accurate feedback to the object, implementing the real-time segmentation algorithm).
Regarding Claim 4, Marcus et al. '779 discloses the device outlined in Claim 3 above as well as obtaining gait information of a user based on foot motion that involves identifying “push off” - also known as “toe off” - and “swing” moments (Paragraph [0092] – entire paragraph; Paragraph [0106] – entire paragraph; Figure 6), wherein the at least one processor is configured to execute the instructions to input the feature amount data acquired according to a gait of the user to the estimation model (Paragraph [0105] - For each of the phases, the computer processor(s) 206, 218 may access threshold values and/or set point parameters for the body-worn actuatable components, and in some cases may include transition set points that specify behavior of the body-worn actuatable components during transition between the phases. In some embodiments, the threshold values and/or set point parameters may be dynamically adjusted for subsequent gait cycles based on the sensor signals). Marcus et al. '779 fails to disclose wherein the machine learning model is configured to estimate the parameter regarding the knee flexion angle including a temporal relationship between a timing of a peak appearing in a swing phase of two peaks appearing in time series data of the knee flexion angle for one gait cycle and a timing of a toe off. Zhang et al. '290 teaches estimating a knee flexion angle of a user by analyzing peaks obtained from a user's foot motion within a user's gait cycle (Page 10 Paragraph 11 - indicating four peak values in two gait periods. As is known in the art, peaks 71 and 75 indicate the heel landing, and the peaks 73 and 77 represent the toe-to-ground portions of the subject gait cycle. The segment 78a between the reference numerals 71 and 73 and the segment 78b between the reference numerals 75 and 77 are the standing periods of each gait cycle, and are periods useful for calculating KAM).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the device of Marcus et al.’779 to include a learning model that considers a relationship between two peaks in a user’s gait cycle caused by the user’s foot motion in order to gather information regarding a knee’s position at various time points within a gait and discard time periods that are not preferred, as seen in Zhang et al.’290 (Page 9 Paragraph 12 - As will be understood by those skilled in the art, KAM is present only between the leg and the ground contact (i.e., standing period). In order to eliminate the unnecessary swing period part (if it contains these parts, it may cause abnormal prediction and additional calculation burden), and also in order to provide clear and accurate feedback to the object, implementing the real-time segmentation algorithm). Although this reference, Zhang et al.’290, considers data that is not in the “swing” phase, it would have been obvious to one of ordinary skill in the art to also consider data within the “swing” phase, as it would have been obvious to try by choosing from a finite number of identified, predictable solutions, with a reasonable expectation of success; see KSR Int'l Co. v. Teleflex Inc., 550 U.S. 398, 415-421, 82 USPQ2d 1385, 1395-97 (2007).
Claim 6 is rejected under 35 U.S.C. 103 as being unpatentable over Marcus et al.'779 (U.S. Patent Application 20220051779 – cited by applicant) as applied to Claim 5 above, in view of Tam et al.'175 (U.S. Patent Application 20160220175).
Regarding Claim 6, Marcus et al.'779 discloses the device outlined in Claim 5 above as well as identifying positioning of a user’s sole (Paragraph [0046] - a position of the IMU may be tracked in space to identify the level of pronation in each step or variations in terrain such as stairs or slopes) and identifying when a person has made a movement that could cause harm to a user (Paragraph [0097] - a failure to fully extend the leg or a misalignment at maximum extension may cause additional wear on the leg joints (e.g., hip, knee, and ankle). Additionally, a greater flexure of the knee during the push-off phase 625-3 causes the surrounding muscles to compensate, which may give rise to other musculoskeletal disorders). Marcus et al.’779 fails to disclose displaying recommendation information, including information regarding a hospital at which the user can seek medical advice according to the estimation result of the parameter regarding the knee flexion angle, on the screen of the mobile terminal. Tam et al.’175 teaches alerting a user on the screen of a mobile device and enabling the user to contact a physician or other appropriate party (Paragraph [0056] - If a patient is doing exercises incorrectly, as determined by the software algorithm, the patient may get an alert on the screen of a mobile device, computer, or any small display to direct them to correct the exercise; Paragraph [0057] - If the patient keeps snoozing their exercises, does not do them correctly, does not run the software or does not wear the device, the software may detect that the patient was not in compliance with the exercises that had been programmed into the software at initiation of the program. This may trigger an alert to the patient…The patient may also have the ability to reach out to the sponsor, physician staff, operator, physician or other appropriate party via text message including picture texting, email, videoconference or voice call directly from the mobile application or from a computer).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the device of Marcus et al.’779 to include providing contact information to a user when they are not achieving appropriate walking form, in order to obtain professional feedback that could improve that form, as seen in Tam et al.’175.
Double Patenting
The nonstatutory double patenting rejection is based on a judicially created doctrine grounded in public policy (a policy reflected in the statute) so as to prevent the unjustified or improper timewise extension of the “right to exclude” granted by a patent and to prevent possible harassment by multiple assignees. A nonstatutory double patenting rejection is appropriate where the conflicting claims are not identical, but at least one examined application claim is not patentably distinct from the reference claim(s) because the examined application claim is either anticipated by, or would have been obvious over, the reference claim(s). See, e.g., In re Berg, 140 F.3d 1428, 46 USPQ2d 1226 (Fed. Cir. 1998); In re Goodman, 11 F.3d 1046, 29 USPQ2d 2010 (Fed. Cir. 1993); In re Longi, 759 F.2d 887, 225 USPQ 645 (Fed. Cir. 1985); In re Van Ornum, 686 F.2d 937, 214 USPQ 761 (CCPA 1982); In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970); In re Thorington, 418 F.2d 528, 163 USPQ 644 (CCPA 1969).
A timely filed terminal disclaimer in compliance with 37 CFR 1.321(c) or 1.321(d) may be used to overcome an actual or provisional rejection based on nonstatutory double patenting provided the reference application or patent either is shown to be commonly owned with the examined application, or claims an invention made as a result of activities undertaken within the scope of a joint research agreement. See MPEP § 717.02 for applications subject to examination under the first inventor to file provisions of the AIA as explained in MPEP § 2159. See MPEP § 2146 et seq. for applications not subject to examination under the first inventor to file provisions of the AIA. A terminal disclaimer must be signed in compliance with 37 CFR 1.321(b).
The filing of a terminal disclaimer by itself is not a complete reply to a nonstatutory double patenting (NSDP) rejection. A complete reply requires that the terminal disclaimer be accompanied by a reply requesting reconsideration of the prior Office action. Even where the NSDP rejection is provisional, the reply must be complete. See MPEP § 804, subsection I.B.1. For a reply to a non-final Office action, see 37 CFR 1.111(a). For a reply to a final Office action, see 37 CFR 1.113(c). A request for reconsideration, while not provided for in 37 CFR 1.113(c), may be filed after final for consideration. See MPEP §§ 706.07(e) and 714.13.
The USPTO Internet website contains terminal disclaimer forms which may be used. Please visit www.uspto.gov/patent/patents-forms. The actual filing date of the application in which the form is filed determines what form (e.g., PTO/SB/25, PTO/SB/26, PTO/AIA/25, or PTO/AIA/26) should be used. A web-based eTerminal Disclaimer may be filled out completely online using web-screens. An eTerminal Disclaimer that meets all requirements is auto-processed and approved immediately upon submission. For more information about eTerminal Disclaimers, refer to www.uspto.gov/patents/apply/applying-online/eterminal-disclaimer.
Claim 1 is rejected on the ground of nonstatutory double patenting as being unpatentable over Claim 3 of co-pending application 18/202829 in view of Ohno et al.’290 (WO 2020/170290). For this rejection, the US translation of Ohno et al.’290, as seen in Ohno et al.’338 (US Pub No. 2022/0125338), will be referenced.
Although the claims at issue are not identical, they are not patentably distinct from each other because Claim 3 of the co-pending application is narrower in scope than the claims of the instant application, and encompasses the subject matter of the claim of the instant application, with the exception of inputting a parameter regarding the knee flexion angle to a machine learning model, and displaying information to a mobile terminal used by the user. Ohno et al.’338 teaches that a diagnostic model (which is analogous to the “estimation model” of Claim 3 of the co-pending application) can be generated by performing machine learning, and further teaches outputting diagnostic information on a screen of a mobile terminal so that it can be seen by a user (Paragraphs [0048] and [0050]). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the device recited in Claim 3 of the co-pending application such that its estimation model is generated by machine learning, as Ohno et al.’338 teaches that diagnostic models can be generated by machine learning. The modification to Claim 3 of the co-pending application would merely be combining prior art elements according to known methods to yield predictable results. Furthermore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the at least one processor of Claim 3 of the co-pending application such that it displays information related to the index value to a mobile terminal used by the user, as Ohno et al.’338 teaches that this would allow the determined information to be seen and analyzed.
Any reference meeting the limitations set forth in Claim 3 of the co-pending application in view of Ohno et al.’290 would also meet the requirements set forth in Claim 1 of the instant application.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to SARAH ANN WESTFALL whose telephone number is (571) 272-3845. The examiner can normally be reached Monday-Friday 7:30am-4:30pm EST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Jennifer Robertson, can be reached at (571) 272-5001. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/SARAH ANN WESTFALL/Examiner, Art Unit 3791
/ETSUB D BERHANU/Primary Examiner, Art Unit 3791