DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Election/Restrictions
Restriction to one of the following inventions is required under 35 U.S.C. 121:
I. Claims 1-2, 4 and 7-12, drawn to a computer-implemented method of running a machine learning model to classify kinematic data, classified in A61B5/7267.
II. Claims 15-21, drawn to an invasive sensor system, classified in A61B5/686.
III. Claims 24-27, drawn to a non-invasive sensor system, classified in A61B5/11.
The inventions are independent or distinct, each from the other because:
Inventions I and II are related as process and apparatus for its practice. The inventions are distinct if it can be shown that either: (1) the process as claimed can be practiced by another and materially different apparatus or by hand, or (2) the apparatus as claimed can be used to practice another and materially different process. (MPEP § 806.05(e)). In this case, the method of invention I could be practiced using an entirely different apparatus than the sensor system of invention II.
Inventions I and III are related as process and apparatus for its practice. The inventions are distinct if it can be shown that either: (1) the process as claimed can be practiced by another and materially different apparatus or by hand, or (2) the apparatus as claimed can be used to practice another and materially different process. (MPEP § 806.05(e)). In this case, the method of invention I could be practiced using an entirely different apparatus than the sensor system of invention III.
Inventions II and III are directed to related devices. The related inventions are distinct if: (1) the inventions as claimed are either not capable of use together or can have a materially different design, mode of operation, function, or effect; (2) the inventions do not overlap in scope, i.e., are mutually exclusive; and (3) the inventions as claimed are not obvious variants. See MPEP § 806.05(j). In the instant case, invention II as claimed is an invasive sensor system while invention III is a non-invasive sensor system. Furthermore, the inventions as claimed do not encompass overlapping subject matter, and there is nothing of record to show them to be obvious variants.
Restriction for examination purposes as indicated is proper because all the inventions listed in this action are independent or distinct for the reasons given above and there would be a serious search and/or examination burden if restriction were not required because one or more of the following reasons apply:
The inventions differ significantly in scope and require different areas of searching and different search strategies.
Applicant is advised that the reply to this requirement to be complete must include (i) an election of an invention to be examined even though the requirement may be traversed (37 CFR 1.143) and (ii) identification of the claims encompassing the elected invention.
The election of an invention may be made with or without traverse. To reserve a right to petition, the election must be made with traverse. If the reply does not distinctly and specifically point out supposed errors in the restriction requirement, the election shall be treated as an election without traverse. Traversal must be presented at the time of election in order to be considered timely. Failure to timely traverse the requirement will result in the loss of right to petition under 37 CFR 1.144. If claims are added after the election, applicant must indicate which of these claims are readable upon the elected invention.
Should applicant traverse on the ground that the inventions are not patentably distinct, applicant should submit evidence or identify such evidence now of record showing the inventions to be obvious variants or clearly admit on the record that this is the case. In either instance, if the examiner finds one of the inventions unpatentable over the prior art, the evidence or admission may be used in a rejection under 35 U.S.C. 103 or pre-AIA 35 U.S.C. 103(a) of the other invention.
During a telephone conversation with David Parker on March 12, 2026 a provisional election was made without traverse to prosecute the invention of group I, claims 1-2, 4 and 7-12. Affirmation of this election must be made by applicant in replying to this Office action. Claims 15-21 and 24-27 are withdrawn from further consideration by the examiner, 37 CFR 1.142(b), as being drawn to a non-elected invention.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1-2, 4 and 7-12 are rejected under 35 U.S.C. 101 because the claimed invention is directed to a judicial exception (i.e., a law of nature, a natural phenomenon, or an abstract idea) without significantly more.
Step 1 of the subject matter eligibility test (see MPEP 2106.03).
Claim 1 is directed to “a computer-implemented method for generating a patient movement classification model, wherein the computer-implemented method comprises, as implemented by a computing system comprising one or more computer processors” which describes one of the four statutory categories of patentable subject matter, i.e. a process.
Each of Claims 1-2, 4 and 7-12 has been analyzed to determine whether it is directed to any judicial exceptions.
Step 2A of the subject matter eligibility test (see MPEP 2106.04).
Prong One:
Claim 1 recites (“sets forth” or “describes”) the abstract idea of “mathematical concepts” (MPEP 2106.04(a)(2).I.), substantially as follows: “for individual records of the plurality of records: identifying one or more elements represented by the kinematic data;
determining one or more kinematic features based on the one or more elements; and
labeling the one or more kinematic features with a movement type of a plurality of movement types to generate one or more labeled kinematic features, wherein each movement type of the plurality of movement types is associated with movement of a body part; and
training a machine learning model using the labeled kinematic features to classify motion of a particular implant as a particular movement type.”
The above-recited steps are mathematical concepts, which are defined as mathematical relationships, mathematical formulas or equations, and mathematical calculations. The Specification teaches that kinematic data calculations may be used to compute the fiducial points to identify movement types and classify the movement. Spec. pages 21-23. Computing these values based on the feature points encompasses the use of mathematical equations, which has been recognized as an abstract idea (i.e., a mathematical concept). Patent Eligibility Guidance, 84 Fed. Reg. at 52. In sum, under Step 2A, Prong One, claim 1 recites a judicial exception, and the analysis proceeds to Step 2A, Prong Two.
Therefore, each of the above steps is grouped as a mathematical concept and hence constitutes an abstract idea.
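For illustration only (not part of the record, and not drawn from the application or the cited art), the kind of calculation encompassed by the recited steps, namely identifying fiducial points in a kinematic waveform and deriving features from them, can be sketched in a few lines of ordinary code; all names and values below are hypothetical:

```python
# Illustrative sketch only: computing simple kinematic features
# (fiducial peak locations, amplitude) from a sampled waveform.
# All names and values are hypothetical.

def find_peaks(samples):
    """Return indices of local maxima (fiducial points) in a 1-D signal."""
    return [i for i in range(1, len(samples) - 1)
            if samples[i - 1] < samples[i] >= samples[i + 1]]

def kinematic_features(samples):
    """Derive basic features from the waveform and its fiducial points."""
    peaks = find_peaks(samples)
    amplitude = max(samples) - min(samples)
    return {"peak_indices": peaks, "amplitude": amplitude}

waveform = [0.0, 0.5, 1.2, 0.8, 0.3, 0.9, 1.5, 0.7, 0.1]
features = kinematic_features(waveform)
```

The point of the sketch is that each step reduces to elementary arithmetic comparisons and subtractions, consistent with the characterization of the steps as mathematical calculations.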
Claim 1 recites (“sets forth” or “describes”) the abstract idea of “a mental process” (MPEP 2106.04(a)(2).III.), substantially as follows: “for individual records of the plurality of records: identifying one or more elements represented by the kinematic data;
determining one or more kinematic features based on the one or more elements; and
labeling the one or more kinematic features with a movement type of a plurality of movement types to generate one or more labeled kinematic features, wherein each movement type of the plurality of movement types is associated with movement of a body part; and
training a machine learning model using the labeled kinematic features to classify motion of a particular implant as a particular movement type.”
The above-recited steps can be practically performed in the human mind, with the aid of pen and paper, or with a generic computer used merely as a tool to perform the steps. If a person were to visually examine, i.e., perform an observation of, the waveform data, either in a printout or in an electronic format, he or she would be able to perform the calculations to obtain the features via pen and paper. He or she would further be able to obtain at least one feature value, for example an amplitude or a highest peak, via visual examination, and further to estimate the movement information. There is nothing recited in the claim to suggest an undue level of complexity in how the waveforms, the peaks, and the information are to be identified. Therefore, a person would be able to perform the identification of peaks mentally or with a generic computer.
Prong Two: Claim 1 does not include additional elements that integrate the mental process into a practical application.
This judicial exception is not integrated into a practical application. In particular, the claim recites (1) “obtaining a plurality of records from across a patient population, wherein a record of the plurality of records comprises kinematic data representing motion of an implant implanted in a patient of the patient population, and wherein the implant comprises a plurality of sensors configured to detect motion of the implant;”
(2) “training a machine learning model using the labeled kinematic features to classify motion of a particular implant as a particular movement type (as an output)”.
(3) “one or more computer processors; computer-implemented”.
The steps in (1) represent merely data gathering or pre-solution activities that are necessary for use of the recited judicial exception and are recited at a high level of generality with conventionally used tools (see Step 2B below for further details).
The step in (2) merely represents outputting of a result by a processor as a post-solution activity and is recited at a high level of generality.
The elements in (3) merely recite generic computer components used as tools to implement the abstract idea.
As a whole, the additional elements merely serve to gather and feed information to the abstract idea and to output a result based on the abstract idea, while generically implementing it on conventionally used tools. There is no practical application because the abstract idea is not applied, relied on, or used in a meaningful way. No improvement to the technology is evident, and the resulting classification is not outputted in any way such that a practical benefit is realized. Therefore, the additional elements, alone or in combination, do not integrate the abstract idea into a practical application.
Step 2B of the subject matter eligibility test (see MPEP 2106.05).
Claim 1 does not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above, the claim recites the additional steps of (1) “obtaining a plurality of records from across a patient population, wherein a record of the plurality of records comprises kinematic data representing motion of an implant implanted in a patient of the patient population, and wherein the implant comprises a plurality of sensors configured to detect motion of the implant;”
(2) “training a machine learning model using the labeled kinematic features to classify motion of a particular implant as a particular movement type (as an output)”.
(3) “one or more computer processors; computer-implemented”.
These steps represent mere data gathering, data outputting, or pre-, post-, or extra-solution activities that are necessary for use of the recited judicial exception and are recited at a high level of generality.
The patient information is obtained from a database. These additional limitations merely represent insignificant, conventional pre-solution activities that are well-understood, routine, and conventional (WURC); indeed, the sensors need not even be described in detail because they are WURC.
The recited processors and computer-readable storage medium are generic computer elements (id., paras. [00383]-[00402], describing generic computers).
The recited machine learning models are generic, black-box models, as evidenced by para. [00372].
Therefore, nothing in claim 1 amounts to significantly more than the abstract idea itself.
Accordingly, Claim 1 is not patent eligible and rejected under 35 U.S.C. 101 as being directed to abstract ideas implemented on a generic computer in view of the Supreme Court Decision in Alice Corporation Pty. Ltd. v. CLS Bank International, et al. and 2019 PEG.
Dependent Claims
The following dependent claims merely further define the abstract idea and are, therefore, directed to an abstract idea for similar reasons:
The recitations of claims 2, 4, 7, and 12 further limit the abstract idea above (representing the data differently for calculations, defining the specific joint, and defining the specific algorithm) and thus merely further define the mental processes or mathematical concepts discussed above.
The following dependent claims merely further describe the extra-solution activities and therefore, do not amount to significantly more than the judicial exception or integrate the abstract idea into a practical application for similar reasons:
Claims 8-11 further define the sensors used for insignificant extra-solution activity (data collection).
Taken alone and in combination, the additional elements do not integrate the judicial exception into a practical application, at least because the abstract idea is not applied, relied on, or used in a meaningful way. They also do not add anything significantly more than the abstract idea. Their collective functions merely provide generic computer/electronic implementation and processing, with no additional elements beyond those of the abstract idea. Viewing the limitations as an ordered combination adds nothing that is not already present when viewing the elements individually. There is no indication that the combination of elements improves the functioning of a computer or an output device, or improves any other technology or technical field. Therefore, the claims are rejected as being directed to non-statutory subject matter.
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale or otherwise available to the public before the effective filing date of the claimed invention.
Claims 1 and 4 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Buckland et al. (US 20200205900) (“Buckland”).
Regarding claim 1, Buckland discloses A computer-implemented method for generating a patient movement classification model, wherein the computer-implemented method comprises, as implemented by a computing system comprising one or more computer processors (Abstract and entire document, see also at least [0002], [0017], “computing device 110” see also [0053]):
obtaining a plurality of records from across a patient population ([0025] – [0027], obtaining the patient population record data),
wherein a record of the plurality of records comprises kinematic data representing motion of an implant implanted in a patient of the patient population, and wherein the implant comprises a plurality of sensors configured to detect motion of the implant ([0025 – 0027] and FIG. 1 and associated paragraphs, see at least [0019], “In one embodiment, motion capture sensors 142 capture the 3D motion capture data 144 while the subject user 140 is performing a physical activity or a physical movement…..physical activity. Each physical activity may have its own unique set of associated body movements. Each physical movement can involve motion of a bone or joint of the subject user 140. Thus, the 3D motion capture data 144 can include continuous motion capture data representing dynamic motion of at least one of a bone or joint of the subject user 140 while they are performing the physical movement. The continuous nature can differentiate the 3D motion capture data 144 from a mere static image captured at a single point in time.” The data is captured);
for individual records of the plurality of records: identifying one or more elements represented by the kinematic data ([0025 – 0027] and FIG. 1 and associated paragraphs, see at least [0019], “In one embodiment, motion capture sensors 142 capture the 3D motion capture data 144 while the subject user 140 is performing a physical activity or a physical movement…..physical activity. Each physical activity may have its own unique set of associated body movements. Each physical movement can involve motion of a bone or joint of the subject user 140. Thus, the 3D motion capture data 144 can include continuous motion capture data representing dynamic motion of at least one of a bone or joint of the subject user 140 while they are performing the physical movement. The continuous nature can differentiate the 3D motion capture data 144 from a mere static image captured at a single point in time.” It is interpreted that an element of the data is anything at all representing the data);
determining one or more kinematic features based on the one or more elements ([0025 – 0027] and FIG. 1 and associated paragraphs, see at least [0019], “In one embodiment, motion capture sensors 142 capture the 3D motion capture data 144 while the subject user 140 is performing a physical activity or a physical movement…..physical activity. Each physical activity may have its own unique set of associated body movements. Each physical movement can involve motion of a bone or joint of the subject user 140. Thus, the 3D motion capture data 144 can include continuous motion capture data representing dynamic motion of at least one of a bone or joint of the subject user 140 while they are performing the physical movement. The continuous nature can differentiate the 3D motion capture data 144 from a mere static image captured at a single point in time.” It is interpreted that a movement feature based on any of the elements is determined, such as the orientation of the implant); and
labeling the one or more kinematic features with a movement type of a plurality of movement types to generate one or more labeled kinematic features, wherein each movement type of the plurality of movement types is associated with movement of a body part ([0025 – 0027] and FIG. 1 and associated paragraphs, see at least [0019], “In one embodiment, motion capture sensors 142 capture the 3D motion capture data 144 while the subject user 140 is performing a physical activity or a physical movement…..physical activity. Each physical activity may have its own unique set of associated body movements. Each physical movement can involve motion of a bone or joint of the subject user 140. Thus, the 3D motion capture data 144 can include continuous motion capture data representing dynamic motion of at least one of a bone or joint of the subject user 140 while they are performing the physical movement. The continuous nature can differentiate the 3D motion capture data 144 from a mere static image captured at a single point in time.” The models are trained based on known orientations and known movements and those movements are labeled, such as “swinging a golf club”); and
training a machine learning model using the labeled kinematic features to classify motion of a particular implant as a particular movement type ([0025 – 0027] and FIG. 1 and associated paragraphs, see at least [0019], “In one embodiment, motion capture sensors 142 capture the 3D motion capture data 144 while the subject user 140 is performing a physical activity or a physical movement…..physical activity. Each physical activity may have its own unique set of associated body movements. Each physical movement can involve motion of a bone or joint of the subject user 140. Thus, the 3D motion capture data 144 can include continuous motion capture data representing dynamic motion of at least one of a bone or joint of the subject user 140 while they are performing the physical movement. The continuous nature can differentiate the 3D motion capture data 144 from a mere static image captured at a single point in time.” The models are trained based on known orientations and known movements and those movements are labeled, such as “swinging a golf club” the model is trained such that new occurrences can be classified).
Regarding claim 4, Buckland discloses The computer-implemented method of claim 1, wherein the body part is associated with a body joint comprising one of a hip joint, knee joint, ankle joint, shoulder joint, elbow joint, and wrist joint ([0019], “Each physical movement can involve motion of a bone or joint of the subject user 140.” And [0013], [0014], [0020]).
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries set forth in Graham v. John Deere Co., 383 U.S. 1, 148 USPQ 459 (1966), that are applied for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 2 and 7-9 are rejected under 35 U.S.C. 103 as being unpatentable over Buckland in view of Amit et al. (US 11,006,860 B1) (“Amit”).
Regarding claim 2, Buckland discloses The computer-implemented method of claim 1. Buckland fails to disclose wherein identifying one or more elements represented by the kinematic data comprises: representing the kinematic data as a time-series waveform, and identifying a set of fiducial points in the time-series waveform, wherein the one or more elements correspond to the set of fiducial points.
However, in the same field of endeavor, Amit teaches wherein identifying one or more elements represented by the kinematic data comprises: representing the kinematic data as a time-series waveform, and identifying a set of fiducial points in the time-series waveform, wherein the one or more elements correspond to the set of fiducial points (See at least FIG. 2 and associated paragraphs, see at least Col. 15 lines 6-26, “The results shown in FIG. 7 demonstrate the occurrences of strides according to the values of two different gait kinematic parameters as axes, being foot on ground duration (vertical axis) and velocity (horizontal axis).”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the method as taught by Buckland to include wherein identifying one or more elements represented by the kinematic data comprises: representing the kinematic data as a time-series waveform, and identifying a set of fiducial points in the time-series waveform, wherein the one or more elements correspond to the set of fiducial points, as taught by Amit, to provide a visual representation of the kinematic data (Col. 15, lines 20-26).
Regarding claim 7, Buckland discloses The computer-implemented method of claim 1. Buckland fails to disclose further comprising: representing each kinematic data included in the plurality of records as one of a time-series waveform or a spectral distribution graph; and
applying a clustering algorithm to a plurality of time-series waveforms or spectral distribution graphs to automatically separate the plurality of time-series waveforms or spectral distribution graphs into a plurality of clusters;
wherein labeling the one or more kinematic features with a movement type is based on determining that the one or more kinematic features are associated with a particular cluster of the plurality of clusters.
However, in the same field of endeavor, Amit teaches further comprising: representing each kinematic data included in the plurality of records as one of a time-series waveform or a spectral distribution graph (See at least FIG. 2 and associated paragraphs, see at least Col. 15 lines 6-26, “The results shown in FIG. 7 demonstrate the occurrences of strides according to the values of two different gait kinematic parameters as axes, being foot on ground duration (vertical axis) and velocity (horizontal axis).”); and
applying a clustering algorithm to a plurality of time-series waveforms or spectral distribution graphs to automatically separate the plurality of time-series waveforms or spectral distribution graphs into a plurality of clusters (See at least FIG. 2 and associated paragraphs, see at least Col. 15 lines 6-26, “Each stride shape is associated with one stride type mentioned in the matrix of FIG. 6, for example a point associated with a walking stride is marked by a circle and a point associated with a running stride is marked by a cross. Multiple clusters are seen, including two significant clusters, cluster 704 indicating lower velocity and longer foot on ground times, which are typical of walking, and cluster 708 indicating higher velocity, for example 2-8 meter per second, and shorter foot on ground times, which are typical of running, wherein the stride types associated with these two clusters are consistent with the dominant types on the diagonal of the confusion matrix, being forward_walk and forward_run.”);
wherein labeling the one or more kinematic features with a movement type is based on determining that the one or more kinematic features are associated with a particular cluster of the plurality of clusters (See at least FIG. 2 and associated paragraphs, see at least Col. 15 lines 6-26, “Each stride shape is associated with one stride type mentioned in the matrix of FIG. 6, for example a point associated with a walking stride is marked by a circle and a point associated with a running stride is marked by a cross. Multiple clusters are seen, including two significant clusters, cluster 704 indicating lower velocity and longer foot on ground times, which are typical of walking, and cluster 708 indicating higher velocity, for example 2-8 meter per second, and shorter foot on ground times, which are typical of running, wherein the stride types associated with these two clusters are consistent with the dominant types on the diagonal of the confusion matrix, being forward_walk and forward_run.”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the method as taught by Buckland to include further comprising: representing each kinematic data included in the plurality of records as one of a time-series waveform or a spectral distribution graph; and applying a clustering algorithm to a plurality of time-series waveforms or spectral distribution graphs to automatically separate the plurality of time-series waveforms or spectral distribution graphs into a plurality of clusters; wherein labeling the one or more kinematic features with a movement type is based on determining that the one or more kinematic features are associated with a particular cluster of the plurality of clusters, as taught by Amit, to provide a visual representation of the kinematic data (Col. 15, lines 20-26).
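For context only, the clustering limitation of claim 7, separating gait records into groups by kinematic parameters in the manner Amit's FIG. 7 depicts (velocity versus foot-on-ground duration), can be sketched with a toy two-means clustering routine. All names and values are hypothetical and not drawn from Amit:

```python
# Toy 2-means clustering of stride records by (velocity, ground-time),
# loosely echoing the walking/running clusters in Amit FIG. 7.
# Illustrative only; all names and values are hypothetical.
import math

def kmeans2(points, iters=10):
    """Cluster 2-D points into two groups; return (centroids, labels)."""
    c = [points[0], points[-1]]  # simple deterministic initialization
    labels = [0] * len(points)
    for _ in range(iters):
        # assign each point to its nearest centroid
        labels = [0 if math.dist(p, c[0]) <= math.dist(p, c[1]) else 1
                  for p in points]
        # recompute each centroid as the mean of its members
        for k in (0, 1):
            members = [p for p, lab in zip(points, labels) if lab == k]
            if members:
                c[k] = (sum(p[0] for p in members) / len(members),
                        sum(p[1] for p in members) / len(members))
    return c, labels

# (velocity m/s, foot-on-ground duration s): slow/long vs fast/short
strides = [(1.1, 0.62), (1.3, 0.60), (1.2, 0.65),   # walking-like
           (4.5, 0.25), (5.0, 0.22), (4.8, 0.20)]   # running-like
centroids, labels = kmeans2(strides)
```

Under these assumed values the slow, long-contact strides and the fast, short-contact strides separate into two clusters, analogous to the walking and running clusters Amit describes.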
Regarding claim 8, Buckland discloses The computer-implemented method of claim 1. Buckland fails to disclose wherein a first sensor of the plurality of sensors comprises a gyroscope oriented relative to the body part and configured to provide, as kinematic data, a signal representing angular velocity about a first axis relative to the body part.
However, in the same field of endeavor, Amit teaches wherein a first sensor of the plurality of sensors comprises a gyroscope oriented relative to the body part and configured to provide, as kinematic data, a signal representing angular velocity about a first axis relative to the body part (FIG. 1C, gyroscope, accelerometers).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention, to modify the method as taught by Buckland to include wherein a first sensor of the plurality of sensors comprises a gyroscope oriented relative to the body part and configured to provide, as kinematic data, a signal representing angular velocity about a first axis relative to the body part as taught by Amit to show a visual representation (Col. 15 lines 20-26).
Regarding claim 9, Buckland discloses The computer-implemented method of claim 1. Buckland fails to disclose wherein a first sensor of the plurality of sensors comprises an accelerometer oriented relative to the body part and configured to provide, as kinematic data, a signal representing acceleration along a first axis relative to the body part.
However, in the same field of endeavor, Amit teaches wherein a first sensor of the plurality of sensors comprises an accelerometer oriented relative to the body part and configured to provide, as kinematic data, a signal representing acceleration along a first axis relative to the body part (FIG. 1C, gyroscope, accelerometers).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention, to modify the method as taught by Buckland to include wherein a first sensor of the plurality of sensors comprises an accelerometer oriented relative to the body part and configured to provide, as kinematic data, a signal representing acceleration along a first axis relative to the body part as taught by Amit to show a visual representation (Col. 15 lines 20-26).
Claims 10-11 are rejected under 35 U.S.C. 103 as being unpatentable over Buckland in view of Amit in further view of Zhang Y, Yan W, Yao Y, Ahmed JB, Tan Y, Gu D. Prediction of Freezing of Gait in Patients With Parkinson's Disease by Identifying Impaired Gait Patterns. IEEE Trans Neural Syst Rehabil Eng. 2020 Mar;28(3):591-600. doi: 10.1109/TNSRE.2020.2969649. Epub 2020 Jan 27. PMID: 31995497. (“Zhang”, submitted in the IDS filed July 25, 2023).
Regarding claim 10, Buckland as modified discloses The computer-implemented method of claim 8. Buckland as modified fails to disclose wherein the first axis is one axis of a three-dimensional implant coordinate system comprising a second axis and a third axis, and wherein obtaining the plurality of records comprises: obtaining from a second sensor of the plurality of sensors, as kinematic data, a signal representing one of: angular velocity about the second axis relative to the body part, or acceleration along the second axis relative to the body part; and obtaining from a third sensor of the plurality of sensors, as kinematic data, a signal representing one of: angular velocity about the third axis relative to the body part, or acceleration along the third axis relative to the body part.
However, in the same field of endeavor, Zhang teaches wherein the first axis is one axis of a three-dimensional implant coordinate system comprising a second axis and a third axis, and wherein obtaining the plurality of records comprises: obtaining from a second sensor of the plurality of sensors, as kinematic data, a signal representing one of: angular velocity about the second axis relative to the body part, or acceleration along the second axis relative to the body part; and obtaining from a third sensor of the plurality of sensors, as kinematic data, a signal representing one of: angular velocity about the third axis relative to the body part, or acceleration along the third axis relative to the body part (Page 592, last paragraph – page 593 discussing the sensors and coordinate system, kinematic data, angular velocity measurements).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention, to modify the method as taught by Buckland to include wherein the first axis is one axis of a three-dimensional implant coordinate system comprising a second axis and a third axis, and wherein obtaining the plurality of records comprises: obtaining from a second sensor of the plurality of sensors, as kinematic data, a signal representing one of: angular velocity about the second axis relative to the body part, or acceleration along the second axis relative to the body part; and obtaining from a third sensor of the plurality of sensors, as kinematic data, a signal representing one of: angular velocity about the third axis relative to the body part, or acceleration along the third axis relative to the body part as taught by Zhang to achieve higher accuracy (Abstract).
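For illustration only, and not as part of the claim mapping or the record, the three-axis record-obtaining limitation above could be sketched as follows; all names and sample values are hypothetical and are not drawn from Buckland, Amit, or Zhang:

```python
# Hypothetical sketch: one kinematic record pairs per-axis signals from
# three sensors of an implant coordinate system (first/second/third axis).
def make_record(gyro_xyz, accel_xyz):
    """Build one record from per-axis angular velocity (rad/s) and
    acceleration (m/s^2) samples; axis labels are illustrative."""
    axes = ("first", "second", "third")
    record = {}
    for axis, w, a in zip(axes, gyro_xyz, accel_xyz):
        # Each sensor may report angular velocity about its axis or
        # acceleration along it; both are retained here for clarity.
        record[axis] = {"angular_velocity": w, "acceleration": a}
    return record

record = make_record((0.10, -0.02, 0.31), (0.0, 9.81, 0.1))
```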
Regarding claim 11, Buckland as modified discloses the computer-implemented method of claim 10. Buckland as modified fails to disclose further comprising, prior to labeling the one or more kinematic features, combining two or more of the respective signals representing angular velocity or acceleration about the first axis, the second axis, and the third axis.
However, in the same field of endeavor, Zhang teaches prior to labeling the one or more kinematic features, combining two or more of the respective signals representing angular velocity or acceleration about the first axis, the second axis, and the third axis (Page 592, last paragraph – page 593 discussing the sensors and coordinate system, kinematic data, angular velocity measurements).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention, to modify the method as taught by Buckland to include prior to labeling the one or more kinematic features, combining two or more of the respective signals representing angular velocity or acceleration about the first axis, the second axis, and the third axis as taught by Zhang to achieve higher accuracy (Abstract).
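For illustration only, one common way to combine per-axis signals prior to feature labeling is a resultant magnitude across the three axes; this sketch is hypothetical and is not asserted to be Zhang's specific combination:

```python
import math

# Illustrative reduction: combine the three per-axis angular-velocity
# (or acceleration) signals into a single resultant magnitude.
def combine_magnitude(wx, wy, wz):
    """Euclidean magnitude of the per-axis signals about the first,
    second, and third axes."""
    return math.sqrt(wx * wx + wy * wy + wz * wz)

m = combine_magnitude(3.0, 4.0, 0.0)  # resultant of a 3-4-0 triple is 5.0
```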
Claim 12 is rejected under 35 U.S.C. 103 as being unpatentable over Buckland in view of Amit and Zhang, and further in view of Joglekar (US 2010/0135553 A1) ("Joglekar").
Regarding claim 12, Buckland as modified discloses the computer-implemented method of claim 10. Buckland as modified fails to disclose further comprising: calculating a transverse plane skew angle between corresponding transverse planes of the implant coordinate system and an anatomical coordinate system associated with the body part;
responsive to a transverse plane skew angle that is less than a threshold value, determining that the implant coordinate system is aligned with the anatomical coordinate system; and responsive to a transverse plane skew angle that is above the threshold value, determining that the implant coordinate system is not aligned with the anatomical coordinate system.
However, in the same field of endeavor, Joglekar teaches further comprising: calculating a transverse plane skew angle between corresponding transverse planes of the implant coordinate system and an anatomical coordinate system associated with the body part ([0249] discussing skew angle between coordinate systems);
responsive to a transverse plane skew angle that is less than a threshold value, determining that the implant coordinate system is aligned with the anatomical coordinate system; and responsive to a transverse plane skew angle that is above the threshold value, determining that the implant coordinate system is not aligned with the anatomical coordinate system ([0087]-[0089] discussing the threshold, coordinates, and images; both threshold conditions discussed).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention, to modify the method as taught by Buckland to include further comprising: calculating a transverse plane skew angle between corresponding transverse planes of the implant coordinate system and an anatomical coordinate system associated with the body part; responsive to a transverse plane skew angle that is less than a threshold value, determining that the implant coordinate system is aligned with the anatomical coordinate system; and responsive to a transverse plane skew angle that is above the threshold value, determining that the implant coordinate system is not aligned with the anatomical coordinate system as taught by Joglekar to verify configuration (abstract).
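For illustration only, the claimed alignment determination above could be sketched as the angle between the transverse-plane normals of the two coordinate systems compared against a threshold; the function names and the 5-degree threshold are hypothetical and are not drawn from Joglekar:

```python
import math

# Hypothetical sketch: the transverse plane skew angle is taken as the
# angle between the transverse-plane normal vectors of the implant and
# anatomical coordinate systems.
def skew_angle_deg(n_implant, n_anatomical):
    """Angle in degrees between two plane-normal vectors."""
    dot = sum(a * b for a, b in zip(n_implant, n_anatomical))
    norm = (math.sqrt(sum(a * a for a in n_implant))
            * math.sqrt(sum(b * b for b in n_anatomical)))
    # Clamp to [-1, 1] to guard against floating-point drift in acos.
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

def is_aligned(n_implant, n_anatomical, threshold_deg=5.0):
    """Aligned when the skew angle is below the threshold; not aligned
    when it is above (threshold value is illustrative)."""
    return skew_angle_deg(n_implant, n_anatomical) < threshold_deg
```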
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to JOSEPH A TOMBERS whose telephone number is (571) 272-6851. The examiner can normally be reached on M-Th 7:00-16:00, F 7:00-11:00 (Eastern).
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Robert Chen, can be reached at 571-272-3672. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see https://ppair-my.uspto.gov/pair/PrivatePair. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/JOSEPH A TOMBERS/Examiner, Art Unit 3791