DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Amendment
The amendments filed 20th February 2026 have been entered. Claims 1, 3, 4, 6, and 8-9 are pending. Claim 7 has been canceled. Applicant's amendments to the claims have failed to overcome the rejections under 35 U.S.C. § 112(a) and 35 U.S.C. § 112(b) that were previously applied in the Office action dated 20th August 2025.
Response to Arguments
In response to the arguments regarding the rejection under 35 U.S.C. § 101, Applicant argues that the claims set forth an improvement to the technology of identifying and calculating error data occurring in posture data from posture sensor(s) by quantifying the error data due to deformation or slack in the clothing having the posture sensor(s) attached thereon. Applicant further cites the specification to show where support for the improvement lies within the disclosure.
Examiner points out that the purported improvement is not reflected in the claims as amended. The claims currently recite abstract ideas implemented by generically recited additional elements that do not provide significantly more; they do not reflect the quantification of the error data due to deformation or slack in the clothing. Specifically, Claim 1 currently recites “wherein the error estimation section calculates the error data for each of the posture sensors indicating an error that is occurring in respective posture sensors”; while quantification of error data is recited, the purported improvement is absent. The cited sections of the specification provide some support for the purported improvement; however, limitations from the specification are not to be read into the claims, and the claims are instead given their broadest reasonable interpretation in light of the specification. See MPEP § 2111.01, subsection II.
Applicant further asserts that the improvement is embodied by the amendment reciting “an information generation section that evaluates the error data based on the posture data and the error data to generate additional information indicating reliability of the estimated posture”. Examiner respectfully disagrees, as the amendments to the claims do not reflect or even suggest the purported improvement of quantifying error data due to deformation or slack in the clothing; the broad limitations neither hint at nor mention what Applicant asserts to be the improvement.
It is important to note that according to MPEP 2106.05(a), the judicial exception alone cannot provide the improvement. The improvement can be provided by one or more additional elements. See the discussion of Diamond v. Diehr, 450 U.S. 175, 187 and 191-92, 209 USPQ 1, 10 (1981) in subsection II, below. In addition, the improvement can be provided by the additional element(s) in combination with the recited judicial exception. See MPEP § 2106.04(d) (discussing Finjan, Inc. v. Blue Coat Sys., Inc., 879 F.3d 1299, 1303-04, 125 USPQ2d 1282, 1285-87 (Fed. Cir. 2018)). Thus, it is important for examiners to analyze the claim as a whole when determining whether the claim provides an improvement to the functioning of computers or an improvement to other technology or technical field.
The evaluation of whether the claim as a whole integrates the recited judicial exception into a practical application of the exception or whether the claim is ‘directed to’ the judicial exception is performed by identifying additional elements recited in the claim beyond the judicial exception and evaluating those additional elements individually and in combination to determine whether the claim as a whole integrates the exception into a practical application. See MPEP 2106.04(d). The claim recites the additional elements of processor(s), and posture sensor(s). The claim recites that the processor(s) (a computer) execute the limitations using the posture sensor(s).
The limitations as underlined in the rejection below are mere data gathering, manipulation, and output recited at a high level of generality, and thus are insignificant extra-solution activity. See MPEP 2106.05(g) (“whether the limitation is significant”). In addition, all uses of the recited judicial exceptions require such data gathering and output, and, as such, these limitations do not impose any meaningful limits on the claim. These limitations amount to necessary data gathering and outputting. See MPEP 2106.05.
Further, the limitations are executed on one or more processor(s) (a computer) and utilize posture sensor(s). The processor(s) and posture sensor(s) are recited at a high level of generality. The processor(s) and posture sensor(s) are used to perform an abstract idea, such that it amounts to no more than mere instructions to apply the exception using a generic computer. See MPEP 2106.05(f). There is no indication that the claim as a whole includes an improvement to a computer or to a technological field, as a technical explanation of the asserted improvement presented in the specification is not reflected in the claims. See MPEP 2106.04(d)(1). According to the specification, existing systems for posture recognition have trouble accurately accounting for error while using sensors that are attached to loose fitting clothing. The disclosed system allegedly accounts for error in posture detection due to deformation or slack in the clothing having the posture sensor(s) attached thereon. The claimed invention fails to reflect this improvement in the technical field of posture detection. Thus, the claim as a whole does not integrate the judicial exception into a practical application such that the claim is not directed to the judicial exception. The additional elements, when considered in combination, do not integrate the abstract idea into a practical application because the claim does not improve the functioning of a computer or technical field.
Applicant's arguments are not persuasive, and the rejection under 35 U.S.C. § 101 is maintained, rewritten to account for the amendments to the claims.
In response to Applicant's arguments regarding the rejection of Claim 1 under 35 U.S.C. § 103, Applicant states that Harms does not disclose “an information generation section that evaluates the error data based on the posture data and the error data to generate additional information indicating reliability of the estimated posture,” as set forth in Claim 1. Applicant then asserts, in conclusory fashion, that Aoki does not cure the deficiencies of Harms.
Examiner respectfully disagrees and asserts that Aoki was relied upon to teach the error estimation section, which calculates the error data used in the amended limitation. Aoki further reads on the amended limitation of “an information generation section that evaluates the error data based on the posture data and the error data to generate additional information indicating reliability of the estimated posture”; see Para. [0103] and Fig. 14, where Aoki teaches evaluating the error data using posture data and measured errors to generate, via calculation, information indicating reliability of the estimated posture in the form of a correction amount, which is used to correct the estimation in the following cycle.
Applicant's arguments are not persuasive.
Specification
The lengthy specification has not been checked to the extent necessary to determine the presence of all possible minor errors. Applicant's cooperation is requested in correcting any errors of which Applicant may become aware in the specification.
Claim Rejections - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claims 1, 3-4, 6 & 8-9 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention.
Claims 1, 8 & 9 recite “evaluat[ing] the error data based on the posture data and the error data…”. It is unclear how, or to what degree, an evaluation of the error data can be based on the error data itself, rendering Claims 1, 8 & 9 indefinite. For examination purposes, Examiner interprets the indefinite limitation as reading “evaluat[ing] the error data using the posture data and the error data…”, as best understood in light of the disclosure.
Claims 3, 4 & 6 are rejected for their dependence on a rejected parent claim.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1, 3-4, 6 & 8-9 are rejected under 35 U.S.C. 101 because the claimed invention is directed to a judicial exception (i.e., a law of nature, a natural phenomenon, or an abstract idea) without significantly more.
Each of Claims 1, 3-4, 6 & 8-9 has been analyzed to determine whether it is directed to any judicial exceptions.
Step 2A, Prong 1
Each of Claims 1, 3-4, 6 & 8-9 recites at least one step or instruction for estimating posture, which is grouped as a mental process, a certain method of organizing human activity, or a mathematical concept under the 2019 PEG. Accordingly, each of Claims 1, 3-4, 6 & 8-9 recites an abstract idea.
Specifically, the claims recite:
Claim 1
A posture recognition system comprising: one or more processors configured to execute: a posture estimation section calculating, on a basis of the sensor data measured by a posture sensor attached to clothing, a feature quantity representing a feature of an estimated posture as posture data indicating an estimation of a posture of a user wearing the clothing (Judgement);
an error estimation section calculating error data which is an estimate of an error that is occurring in the posture data, on a basis of the sensor data and the posture data (Judgement), and
an information generation section that evaluates the error data based on the posture data and the error data to generate additional information indicating reliability of the estimated posture (Evaluation/Opinion)
wherein the sensor data is data indicating a position or motion of the user in accordance with a position of the clothing to which the posture sensor is attached
wherein the posture estimation section calculates, on a basis of sensor data measured by each of a plurality of posture sensors, which include the posture sensor, the posture data indicating the posture of the user (Judgement) for each of the posture sensors, and
wherein the error estimation section calculates the error data for each of the posture sensors indicating an error that is occurring in respective posture sensors (Judgement).
Claim 8
A posture recognition method by a posture recognition system, the posture recognition method comprising: calculating, on a basis of the sensor data measured by a posture sensor attached to clothing, a feature quantity representing a feature of an estimated posture as posture data (Judgement), indicating an estimation of a posture of a user wearing the clothing;
calculating error data which is an estimate of an error that is occurring in the posture data, on a basis of the sensor data and the posture data (Judgement);
evaluating the error data based on the posture data and the error data to generate additional information indicating reliability of the estimated posture (Evaluation/Opinion);
calculating, on a basis of sensor data measured by each of a plurality of posture sensors, which include the posture sensor, the posture data indicating the posture of the user (Judgement) for each of the posture sensors; and
calculating the error data for each of the posture sensors indicating an error that is occurring in respective posture sensors (Judgement)
wherein the sensor data is data indicating a position or motion of the user in accordance with a position of the clothing to which the posture sensor is attached.
Claim 9
A non-transitory computer-readable recording medium storing a program to be executed by a computer, the program causing a computer to execute steps comprising:
calculating, on a basis of the sensor data measured by a posture sensor attached to clothing, a feature quantity representing a feature of an estimated posture as posture data indicating an estimation indicating of a posture of a user wearing the clothing (Judgement);
calculating error data which is an estimate of an error that is occurring in the posture data, on a basis of the sensor data and the posture data (Judgement);
evaluating the error data based on the posture data and the error data to generate additional information indicating reliability of the estimated posture (Evaluating/Opinion);
calculating, on a basis of sensor data measured by each of a plurality of posture sensors, which include the posture sensor, the posture data indicating the posture of the user (Judgement) for each of the posture sensors; and
calculating the error data for each of the posture sensors indicating an error that is occurring in respective posture sensors (Judgement)
wherein the sensor data is data indicating a position or motion of the user in accordance with a position of the clothing to which the posture sensor is attached.
The underlined portions above are grouped as a mental process or a mathematical concept under the 2019 PEG. A person would be able to manually collect data and calculate the posture of a subject in their mind, which is grouped as a mental process under the 2019 PEG. The claimed invention, when given the broadest reasonable interpretation in light of the specification, encompasses a mathematical calculation and thus falls within the mathematical concepts grouping under the 2019 PEG.
Further, dependent Claims 3, 4, and 6 merely include limitations that either further define the abstract idea (and thus do not make the abstract idea any less abstract) or amount to no more than generally linking the use of the abstract idea to a particular technological environment or field of use, because they are merely incidental or token additions to the claims that do not alter or affect how the process steps are performed.
Accordingly, as indicated above, each of the above-identified claims recites an abstract idea.
Step 2A, Prong 2
The above-identified abstract idea in each of independent Claim 1 (and its respective dependent Claims 3, 4, and 6), Claim 8, and Claim 9 is not integrated into a practical application under the 2019 PEG because the additional elements (identified in bold above in independent Claims 1, 8, and 9), either alone or in combination, generally link the use of the above-identified abstract idea to a particular technological environment or field of use. More specifically, the additional elements of: posture recognition system, posture estimation section, error estimation section, information generation section, computer, non-transitory computer-readable recording medium, measuring with posture sensors, and the posture sensor(s) themselves are generically recited computer elements in independent Claim 1 (and its respective dependent claims), Claim 8, and Claim 9 which do not improve the functioning of a computer or any other technology or technical field. Nor do these above-identified additional elements serve to apply the above-identified abstract idea with, or by use of, a particular machine, effect a transformation, or apply or use the above-identified abstract idea in some other meaningful way beyond generally linking the use thereof to a particular technological environment, such that the claim as a whole is more than a drafting effort designed to monopolize the exception. Furthermore, the above-identified additional elements do not add a meaningful limitation to the abstract idea because they amount to simply implementing the abstract idea on a computer.
Further, the steps as set forth in dependent Claims 3, 4 & 6 are insignificant extra-solution activity and are regarded as pre-solution activity such as generating posture data, generating error data, arranging sensors on the same or different body parts, the posture data and error data being represented as chronological data, and generating additional information after analyzing error data (Identified as underlined text above in Step 2A Prong 1 for dependent Claims 3, 4, and 6). All uses of the recited abstract idea require such activity to be performed. For at least these reasons, the abstract idea identified above in independent Claim 1 (and its respective dependent claims), Claim 8 and Claim 9 are not integrated into a practical application under 2019 PEG.
Moreover, the above-identified abstract idea is not integrated into a practical application under the 2019 PEG because the claimed method and system merely implement the above-identified abstract idea (e.g., mental process) using rules (e.g., computer instructions) executed by a computer (e.g., processor(s), posture recognition system, posture estimation section, error estimation section, information generation section, a computer as claimed). In other words, these claims are merely directed to an abstract idea with additional generic computer elements which do not add a meaningful limitation to the abstract idea because they amount to simply implementing the abstract idea on a computer. Additionally, Applicant's specification does not include any discussion of how the claimed invention provides a technical improvement realized by these claims over the prior art or any explanation of a technical problem having an unconventional technical solution that is expressed in these claims. That is, like Affinity Labs of Tex. v. DirecTV, LLC, the specification fails to provide sufficient details regarding the manner in which the claimed invention accomplishes any technical improvement or solution. Thus, for these additional reasons, the abstract idea identified above in independent Claim 1 (and its respective dependent claims), Claim 8, and Claim 9 is not integrated into a practical application under the 2019 PEG.
Accordingly, independent Claim 1 (and its respective dependent claims), Claim 8, and Claim 9 are each directed to an abstract idea under 2019 PEG.
Step 2B
None of Claims 1, 3-4, 6 & 8-9 include additional elements that are sufficient to amount to significantly more than the abstract idea for at least the following reasons.
These claims require the additional elements of: posture recognition system, posture estimation section, error estimation section, information generation section, computer, non-transitory computer-readable recording medium, measuring using posture sensors, and the posture sensor(s) themselves, as recited in independent Claim 1 (and its dependent claims), Claim 8, and Claim 9.
The above-identified additional elements are generically claimed computer components which enable the above-identified abstract idea(s) to be conducted by performing the basic functions of automating mental tasks. The courts have recognized such computer functions as well-understood, routine, and conventional functions when claimed in a merely generic manner (e.g., at a high level of generality) or as insignificant extra-solution activity. See Versata Dev. Group, Inc. v. SAP Am., Inc., 793 F.3d 1306, 1334, 115 USPQ2d 1681, 1701 (Fed. Cir. 2015); and OIP Techs., Inc. v. Amazon.com, Inc., 788 F.3d 1359, 1363, 115 USPQ2d 1090, 1092-93 (Fed. Cir. 2015).
Per Applicant's specification, the posture recognition system is given no structure other than encompassing the sensor section 110 and the information processing apparatus 120 at Para. [0016]. Further, the information processing apparatus is disclosed as functioning as a posture estimation apparatus and “can include a common server” at Para. [0024], and “includes a communication section 121, a posture estimation section 122, an error estimation section 123, an information generation section 124, and a control section 125 as functional components” at Para. [0026]. Thus, the posture recognition system is interpreted as a common computer/server encompassing all of the functional elements of the error estimation section, information generation section, a computer, a non-transitory and tangible computer-readable recording medium, and the sensor section, which encompasses a posture sensor 112 and a communication section 113, where the posture sensor 112 is disclosed as “a motion measurement sensor for measuring…”, as shown at Para. [0018].
Accordingly, in light of Applicant's specification, the claimed terms posture recognition system, posture estimation section, error estimation section, information generation section, computer, non-transitory computer-readable recording medium, measuring using posture sensors, and the posture sensor(s) themselves are reasonably construed as a generic computing device. Like SAP Am., Inc. v. InvestPic, LLC (Fed. Cir. 2018), it is clear from the claims themselves and the specification that these limitations require no improved computer resources, just already available computers, with their already available basic functions, used as tools in executing the claimed process.
Furthermore, Applicant's specification does not describe any special programming or algorithms required for the posture recognition system. This lack of disclosure is acceptable under 35 U.S.C. § 112(a) since this hardware performs non-specialized functions known to those of ordinary skill in the computer arts. By omitting any specialized programming or algorithms, Applicant's specification essentially admits that this hardware is conventional and performs well-understood, routine, and conventional activities in the computer industry or arts. In other words, Applicant's specification demonstrates the well-understood, routine, conventional nature of the above-identified additional elements because it describes these additional elements in a manner that indicates that the additional elements are sufficiently well-known that the specification does not need to describe the particulars of such additional elements to satisfy 35 U.S.C. § 112(a) (see Berkheimer memo of April 19, 2018, (III)(A)(1) on page 3). Adding hardware that performs “well-understood, routine, conventional activit[ies] previously known to the industry” will not make claims patent-eligible (TLI Communications).
The recitation of the above-identified additional limitations in Claims 1, 3-4, 6 & 8-9 amounts to mere instructions to implement the abstract idea on a computer. Simply using a computer or other machinery in its ordinary capacity for economic or other tasks (e.g., to receive, store, or transmit data), or simply adding a general purpose computer or computer components after the fact to an abstract idea (e.g., a fundamental economic practice or mathematical equation), does not provide significantly more. See Affinity Labs v. DirecTV, 838 F.3d 1253, 1262, 120 USPQ2d 1201, 1207 (Fed. Cir. 2016) (cellular telephone); and TLI Communications LLC v. AV Automotive, LLC, 823 F.3d 607, 613, 118 USPQ2d 1744, 1748 (Fed. Cir. 2016) (computer server and telephone unit). Moreover, implementing an abstract idea on a generic computer does not add significantly more, similar to how the recitation of the computer in the claim in Alice amounted to mere instructions to apply the abstract idea of intermediated settlement on a generic computer.
A claim that purports to improve computer capabilities or to improve an existing technology may provide significantly more. McRO, Inc. v. Bandai Namco Games Am. Inc., 837 F.3d 1299, 1314-15, 120 USPQ2d 1091, 1101-02 (Fed. Cir. 2016); and Enfish, LLC v. Microsoft Corp., 822 F.3d 1327, 1335-36, 118 USPQ2d 1684, 1688-89 (Fed. Cir. 2016). However, a technical explanation as to how to implement the invention should be present in the specification for any assertion that the invention improves upon conventional functioning of a computer, or upon conventional technology or technological processes. That is, the disclosure must provide sufficient details such that one of ordinary skill in the art would recognize the claimed invention as providing an improvement. Here, Applicant's specification does not include any discussion of how the claimed invention provides a technical improvement realized by these claims over the prior art or any explanation of a technical problem having an unconventional technical solution that is expressed in these claims. Instead, as in Affinity Labs of Tex. v. DirecTV, LLC, 838 F.3d 1253, 1263-64, 120 USPQ2d 1201, 1207-08 (Fed. Cir. 2016), the specification fails to provide sufficient details regarding the manner in which the claimed invention accomplishes any technical improvement or solution.
For at least the above reasons, the system, method, and recording medium of Claims 1, 3-4, 6 & 8-9 are directed to applying an abstract idea as identified above on a general purpose computer without (i) improving the performance of the computer itself, or (ii) providing a technical solution to a problem in a technical field. None of Claims 1, 3-4, 6 & 8-9 provides meaningful limitations to transform the abstract idea into a patent eligible application of the abstract idea such that these claims amount to significantly more than the abstract idea itself.
Taking the additional elements individually and in combination, the additional elements do not provide significantly more. Specifically, when viewed individually, the above-identified additional elements in independent Claim 1 (and its dependent claims), Claim 8, and Claim 9 do not add significantly more because they are simply an attempt to limit the abstract idea to a particular technological environment. That is, neither the general computer elements nor any other additional element adds meaningful limitations to the abstract idea because these additional elements represent insignificant extra-solution activity. When viewed as a combination, these above-identified additional elements simply instruct the practitioner to implement the claimed functions with well-understood, routine, and conventional activity specified at a high level of generality in a particular technological environment. As such, there is no inventive concept sufficient to transform the claimed subject matter into a patent-eligible application. When viewed as a whole, the above-identified additional elements do not provide meaningful limitations to transform the abstract idea into a patent eligible application of the abstract idea such that the claims amount to significantly more than the abstract idea itself. Thus, Claims 1, 3-4, 6 & 8-9 merely apply an abstract idea to a computer and do not (i) improve the performance of the computer itself (as in Bascom and Enfish), or (ii) provide a technical solution to a problem in a technical field (as in DDR).
Therefore, none of Claims 1, 3-4, 6 & 8-9 amounts to significantly more than the abstract idea itself. Accordingly, Claims 1, 3-4, 6 & 8-9 are not patent eligible and are rejected under 35 U.S.C. 101.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claim(s) 1, 3-4, 6, and 8-9 are rejected under 35 U.S.C. 103 as being unpatentable over WO 2009112281 A1 to Harms et al. (hereinafter, Harms) in view of US 20200383609 A1 to Aoki.
Regarding Claim 1, Harms discloses a posture recognition system (Harms: Abstract) comprising:
one or more processors (Harms: Pg. 11, lines 17-19; Fig. 1a) configured to execute:
a posture estimation section calculating, on a basis of the sensor data measured by a posture sensor attached to clothing (Harms: Pg. 7, lines 20-24),
a feature quantity representing a feature of an estimated posture as posture data indicating an estimation of a posture of a user wearing the clothing (Harms: Pg. 7, lines 7-10); (Harms: Pg. 12, lines 8-10; Fig. 2b),
wherein the sensor data is data indicating a position or motion of the user in accordance with a position of the clothing to which the posture sensor is attached (Harms: Pg. 8, line 10-17),
wherein the posture estimation section calculates, on a basis of sensor data measured by each of a plurality of posture sensors, which include the posture sensor, the posture data indicating the posture of the user for each of the posture sensors (Harms: Pg. 9, lines 2-4; Note: Though Harms discloses the posture classification is a fused result, the posture estimation of the user is based on the analysis of posture data calculated from each posture sensor and thus reads on), and
While Harms does disclose measuring effective errors directly using posture sensors (Harms: Pg. 9, lines 10-12), Harms does not explicitly disclose calculating error data which is an estimate of an error that is occurring in the posture data, on a basis of the sensor data and the posture data; an information generation section that evaluates the error data based on the posture data and the error data to generate additional information indicating reliability of the estimated posture, and calculating the error data for each of the posture sensors indicating an error that is occurring in respective posture sensors.
However, Aoki teaches an error estimation section calculating error data which is an estimate of an error that is occurring in the posture data, on a basis of the sensor data and the posture data (Aoki: Para. [0035], Note: Paragraph discusses sensor orientation when worn and sensor segments and reference areas; Para. [0074], [0075] and [0083], Note: Paragraphs discuss error calculation that occurs in the posture data on a basis of the sensor data and the posture data),
an information generation section that evaluates the error data based on the posture data and the error data to generate additional information indicating reliability of the estimated posture (Aoki: Para. [0103]; See Fig. 14),
wherein the error estimation section calculates the error data for each of the posture sensors indicating an error that is occurring in respective posture sensors (Aoki: Para. [0035], Note: Paragraph discusses sensor orientation when worn and sensor segments and reference areas; Para. [0074], [0075] and [0083], Note: Paragraphs discuss error calculation that occurs in the posture data on a basis of the sensor data and the posture data; Para. [0046], ‘The whole body correction amount calculation part 144 calculates a correction amount of the angular velocity of all segments on the basis of the entire IMU acceleration output by the acceleration collecting part 124’, Note: This would require the error data of each of the posture sensors and thus reads on this limitation).
One of ordinary skill in the art at the time the invention was filed would have found it obvious to modify the error estimation section of Harms by substituting the calculation of error data for each of the posture sensors as taught by Aoki. Doing so would require only the routine skill of simple substitution of one known element for another to obtain predictable results (MPEP § 2143, subsection I.B.), in this case, to calculate errors in posture data measured from each sensor and apply corrective measures (Aoki: Para. [0046], [0075]).
Regarding Claim 3, Harms in view of Aoki discloses the invention as discussed in Claim 1. Harms further discloses wherein the plurality of posture sensors are attached to different portions of the clothing (Harms: Fig. 1b; Pg. 4, lines 13-14; Examiner’s Note: Terminals applied toward the cuff of each sleeve on the garment), thereby acquiring sensor data for a plurality of different body parts of the user (Harms: Pg. 4, lines 3-6).
Regarding Claim 4, Harms in view of Aoki discloses the invention as discussed in Claim 1. While Harms suggests that the plurality of sensors are capable of being attached to a same portion of the clothing to acquire sensor data for a same body part of the user (Harms: Pg. 7, lines 20-27), Harms does not explicitly state this limitation.
One of ordinary skill in the art at the time the invention was filed would have found it obvious to modify Harms’ sensing garment by attaching additional sensors to a same portion of the clothing to acquire sensor data for a same body part of the user for the purpose of error reduction and redundancy as taught by Harms, since it has been held that rearranging parts of an invention involves only routine skill in the art. See MPEP § 2144.04, subsection VI.C.
Regarding Claim 6, Harms in view of Aoki discloses the invention as discussed in Claim 1. Harms further discloses wherein the posture data and the error data are chronological data (Harms: Pg. 19, lines 9-13; Pg. 16, lines 21-22, ‘The unit is able to perform a sample-wise realtime classification with seven active acceleration Terminals…’; Examiner’s Note: chronological data is interpreted to be data occurring over time.).
Regarding Claim 8, Harms discloses a posture recognition method (Harms: Pg. 4, lines 3-6) by a posture recognition system (Abstract), the posture recognition method comprising:
calculating, on a basis of the sensor data measured by a posture sensor attached to clothing (Harms: Pg. 7, lines 20-24),
a feature quantity representing a feature of an estimated posture as posture data indicating an estimation of a posture of a user wearing the clothing (Harms: Pg. 7, lines 7-10; Pg. 12, lines 8-10; Fig. 2b);
calculating, on a basis of sensor data measured by each of a plurality of posture sensors, which include the posture sensor, the posture data indicating the posture of the user for each of the posture sensors (Harms: Pg. 9, lines 2-4; Note: Though Harms discloses the posture classification is a fused result, the posture estimation of the user is based on the analysis of posture data calculated from each posture sensor and thus reads on the claimed limitation), and
wherein the sensor data is data indicating a position or motion of the user in accordance with a position of the clothing to which the posture sensor is attached (Harms: Pg. 8, line 10-17).
While Harms does disclose measuring effective errors directly using posture sensors (Harms: Pg. 9, lines 10-12), Harms does not explicitly disclose calculating error data which is an estimate of an error that is occurring in the posture data, on a basis of the sensor data and the posture data; evaluating the error data based on the posture data and the error data to generate additional information indicating reliability of the estimated posture, and calculating the error data for each of the posture sensors indicating an error that is occurring in respective posture sensors.
However, Aoki teaches an error estimation section calculating error data which is an estimate of an error that is occurring in the posture data, on a basis of the sensor data and the posture data (Aoki: Para. [0035] Note: Paragraph discusses sensor orientation when worn and sensor segments and reference areas; Para. [0074], [0075] and [0083] Note: Paragraphs discuss error calculation that occurs in the posture data on a basis of the sensor data and the posture data),
evaluating the error data based on the posture data and the error data to generate additional information indicating reliability of the estimated posture (Aoki: Para. [0103]; See Fig. 14),
wherein the error estimation section calculates the error data for each of the posture sensors indicating an error that is occurring in respective posture sensors (Aoki: Para. [0035] Note: Paragraph discusses sensor orientation when worn and sensor segments and reference areas; Para. [0074], [0075] and [0083] Note: Paragraphs discuss error calculation that occurs in the posture data on a basis of the sensor data and the posture data; Para. [0046] ‘The whole body correction amount calculation part 144 calculates a correction amount of the angular velocity of all segments on the basis of the entire IMU acceleration output by the acceleration collecting part 124’ Note: This would require the error data of each of the posture sensors and thus reads on this limitation).
One of ordinary skill in the art at the time the invention was filed would have found it obvious to modify the error estimation section of Harms by substituting the calculation of error data for each of the posture sensors as taught by Aoki. Doing so would require only the routine skill of simple substitution of one known element for another to obtain predictable results (MPEP § 2143, subsection I.B.), in this case, to calculate errors in posture data measured from each sensor and apply corrective measures (Aoki: Para. [0046], [0075]).
Regarding Claim 9, Harms discloses a non-transitory computer readable recording medium storing a program to be executed by a computer, the program causing a computer to execute steps (Harms: Pg. 16, lines 25-26) comprising:
calculating, on a basis of the sensor data measured by a posture sensor attached to clothing (Harms: Pg. 7, lines 20-24),
a feature quantity representing a feature of an estimated posture as posture data indicating an estimation of a posture of a user wearing the clothing (Harms: Pg. 7, lines 7-10; Pg. 12, lines 8-10; Fig. 2b);
calculating, on a basis of sensor data measured by each of a plurality of posture sensors, which include the posture sensor, the posture data indicating the posture of the user for each of the posture sensors (Harms: Pg. 9, lines 2-4; Note: Though Harms discloses the posture classification is a fused result, the posture estimation of the user is based on the analysis of posture data calculated from each posture sensor and thus reads on the claimed limitation), and
wherein the sensor data is data indicating a position or motion of the user in accordance with a position of the clothing to which the posture sensor is attached (Harms: Pg. 8, line 10-17).
While Harms does disclose measuring effective errors directly using posture sensors (Harms: Pg. 9, lines 10-12), Harms does not explicitly disclose calculating error data which is an estimate of an error that is occurring in the posture data, on a basis of the sensor data and the posture data; evaluating the error data based on the posture data and the error data to generate additional information indicating reliability of the estimated posture, and calculating the error data for each of the posture sensors indicating an error that is occurring in respective posture sensors.
However, Aoki teaches an error estimation section calculating error data which is an estimate of an error that is occurring in the posture data, on a basis of the sensor data and the posture data (Aoki: Para. [0035] Note: Paragraph discusses sensor orientation when worn and sensor segments and reference areas; Para. [0074], [0075] and [0083] Note: Paragraphs discuss error calculation that occurs in the posture data on a basis of the sensor data and the posture data),
evaluating the error data based on the posture data and the error data to generate additional information indicating reliability of the estimated posture (Aoki: Para. [0103]; See Fig. 14),
wherein the error estimation section calculates the error data for each of the posture sensors indicating an error that is occurring in respective posture sensors (Aoki: Para. [0035] Note: Paragraph discusses sensor orientation when worn and sensor segments and reference areas; Para. [0074], [0075] and [0083] Note: Paragraphs discuss error calculation that occurs in the posture data on a basis of the sensor data and the posture data; Para. [0046] ‘The whole body correction amount calculation part 144 calculates a correction amount of the angular velocity of all segments on the basis of the entire IMU acceleration output by the acceleration collecting part 124’ Note: This would require the error data of each of the posture sensors and thus reads on this limitation).
One of ordinary skill in the art at the time the invention was filed would have found it obvious to modify the error estimation section of Harms by substituting the calculation of error data for each of the posture sensors as taught by Aoki. Doing so would require only the routine skill of simple substitution of one known element for another to obtain predictable results (MPEP § 2143, subsection I.B.), in this case, to calculate errors in posture data measured from each sensor and apply corrective measures (Aoki: Para. [0046], [0075]).
Conclusion
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to SHAWN CURTIS BROUGHTON whose telephone number is (571)272-2891. The examiner can normally be reached Monday - Friday, 8am-4pm EST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Alexander Valvis can be reached at 571-272-4233. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/SHAWN CURTIS BROUGHTON/Examiner, Art Unit 3791
/PATRICK FERNANDES/Primary Examiner, Art Unit 3791