DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Applicant’s Response
In Applicant’s response dated 10/15/2025, Applicant amended Claims 1 – 21 and argued against all objections and rejections previously set forth in the Office Action dated 02/12/2025.
In light of Applicant’s amendments and remarks, the previously set forth objections are withdrawn.
Status of the Claims
Claim 16 is rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph; Claims 1 – 21 are rejected under 35 U.S.C. 101; and Claims 1 – 21 are rejected under 35 U.S.C. 102(a)(1)/102(a)(2).
Examiner Note
The Examiner cites particular columns, line numbers and/or paragraph numbers in the references as applied to the claims below for the convenience of the Applicant(s). Although the specified citations are representative of the teachings in the art and are applied to the specific limitations within the individual claim, other passages and figures may apply as well. It is respectfully requested that, in preparing responses, the Applicant fully consider the references in their entirety as potentially teaching all or part of the claimed invention, as well as the context of the passage as taught by the prior art or disclosed by the Examiner.
Examiner Note Regarding Claim Interpretation
An intended use or purpose usually will not limit the scope of the claim because such statements usually do no more than define a context in which the invention operates. The recitation of the intended use of the claimed invention does not serve to differentiate the claim from the prior art (See MPEP 2103(I)(C)).
Claim 21 recites intended use, because Claim 21 recites “displaying, by the information processing apparatus according to claim 16, the image for an operator to acquire learning data for creating a learned model that predicts a failure of the machine apparatus” (emphasis added).
However, indicating that the displayed image is for an operator to acquire learning data for creating a learned model that predicts a failure of the machine apparatus is merely the intended use of displaying the image; the claim does not clearly claim how the operator creates the model that predicts a failure of the machine apparatus based on the learning data.
Information Disclosure Statement
The information disclosure statement (IDS) submitted on 06/13/2025 has been entered and considered by the examiner.
Claim Objections
Claim 7 is objected to because of the following informalities:
Claim 7 recites the term "and/or," which is selective language. The examiner suggests using either the "and" term or the "or" term; otherwise, the claim should be worded more clearly to claim both terms.
For the purpose of this examination, the examiner is selecting the "or" term from this selective language.
Appropriate correction is required.
Claim Interpretation
The following is a quotation of 35 U.S.C. 112(f):
(f) Element in Claim for a Combination. – An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.
The following is a quotation of pre-AIA 35 U.S.C. 112, sixth paragraph:
An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.
The claims in this application are given their broadest reasonable interpretation using the plain meaning of the claim language in light of the specification as it would be understood by one of ordinary skill in the art. The broadest reasonable interpretation of a claim element (also commonly referred to as a claim limitation) is limited by the description in the specification when 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is invoked.
As explained in MPEP § 2181, subsection I, claim limitations that meet the following three-prong test will be interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph:
(A) the claim limitation uses the term “means” or “step” or a term used as a substitute for “means” that is a generic placeholder (also called a nonce term or a non-structural term having no specific structural meaning) for performing the claimed function;
(B) the term “means” or “step” or the generic placeholder is modified by functional language, typically, but not always linked by the transition word “for” (e.g., “means for”) or another linking word or phrase, such as “configured to” or “so that”; and
(C) the term “means” or “step” or the generic placeholder is not modified by sufficient structure, material, or acts for performing the claimed function.
Use of the word “means” (or “step”) in a claim with functional language creates a rebuttable presumption that the claim limitation is to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites sufficient structure, material, or acts to entirely perform the recited function.
Absence of the word “means” (or “step”) in a claim creates a rebuttable presumption that the claim limitation is not to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is not interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites function without reciting sufficient structure, material or acts to entirely perform the recited function.
Claim limitations in this application that use the word “means” (or “step”) are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action. Conversely, claim limitations in this application that do not use the word “means” (or “step”) are not being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action.
This application includes one or more claim limitations that do not use the word “means,” but are nonetheless being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, because the claim limitation(s) uses a generic placeholder that is coupled with functional language without reciting sufficient structure to perform the recited function and the generic placeholder is not preceded by a structural modifier. Such claim limitation(s) is/are: “acquire portion is configured to acquire…”, “extract portion is configured to extract…” and “display portion is configured to display…” in claim 16.
Because this/these claim limitation(s) is/are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, it/they is/are being interpreted to cover the corresponding structure described in the specification as performing the claimed function, and equivalents thereof.
If applicant does not intend to have this/these limitation(s) interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, applicant may: (1) amend the claim limitation(s) to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph (e.g., by reciting sufficient structure to perform the claimed function); or (2) present a sufficient showing that the claim limitation(s) recite(s) sufficient structure to perform the claimed function so as to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph.
Claim Rejections - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claim 16 is rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention.
Claim limitation “a processing portion including a acquire portion, an extract portion and a display portion, wherein the acquire portion is configured to acquire..., wherein the extract portion is configured to extract..., and wherein the display portion is configured to display..." in claim 16 invokes 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. However, the written description fails to disclose the corresponding structure, material, or acts for performing the entire claimed function and to clearly link the structure, material, or acts to the function. The disclosure is devoid of any structure that performs the function in the claim. Therefore, the claim is indefinite and is rejected under 35 U.S.C. 112(b) or pre-AIA 35 U.S.C. 112, second paragraph.
Applicant may:
(a) Amend the claim so that the claim limitation will no longer be interpreted as a limitation under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph;
(b) Amend the written description of the specification such that it expressly recites what structure, material, or acts perform the entire claimed function, without introducing any new matter (35 U.S.C. 132(a)); or
(c) Amend the written description of the specification such that it clearly links the structure, material, or acts disclosed therein to the function recited in the claim, without introducing any new matter (35 U.S.C. 132(a)).
If applicant is of the opinion that the written description of the specification already implicitly or inherently discloses the corresponding structure, material, or acts and clearly links them to the function so that one of ordinary skill in the art would recognize what structure, material, or acts perform the claimed function, applicant should clarify the record by either:
(a) Amending the written description of the specification such that it expressly recites the corresponding structure, material, or acts for performing the claimed function and clearly links or associates the structure, material, or acts to the claimed function, without introducing any new matter (35 U.S.C. 132(a)); or
(b) Stating on the record what the corresponding structure, material, or acts, which are implicitly or inherently set forth in the written description of the specification, perform the claimed function. For more information, see 37 CFR 1.75(d) and MPEP §§ 608.01(o) and 2181.
If applicant does not wish to have the claim limitation treated under 35 U.S.C. 112, sixth paragraph, applicant may amend the claim so that it will clearly not invoke 35 U.S.C. 112, sixth paragraph, or present a sufficient showing that the claim recites sufficient structure, material, or acts for performing the claimed function to preclude application of 35 U.S.C. 112, sixth paragraph.
For more information, see Supplementary Examination Guidelines for Determining Compliance with 35 U.S.C. § 112 and for Treatment of Related Issues in Patent Applications, 76 FR 7162, 7167 (Feb. 9, 2011).
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1 – 21 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.
Claim 1 recites an information processing method, thus a process, one of the four statutory categories of patentable subject matter.
However, Claim 1 further recites:
extracting, by the information processing apparatus, a plurality of pieces of first partial time-series data corresponding to an event occurring in the machine apparatus during the repetitive operation from the plurality of time-series data (this limitation recites an abstract idea, as the step of extracting data associated with the event occurring in a machine is interpreted as observing, evaluating and judging steps, which are mental processes performed in the human mind).
Claim 1 thus recites an abstract idea.
The claim does not include additional elements that integrate the abstract idea into a practical application, since the additional elements consist of:
acquiring, by an information processing apparatus, a plurality of time-series data of a plurality of types of physical quantity related to a state of a machine apparatus performing a repetitive operation (this limitation merely obtains generic data associated with a state of a machine apparatus performing a repetitive operation; thus, the limitation is mere data gathering. This additional element is interpreted as an insignificant extra-solution activity of data transfer (See MPEP 2106.05(g)). Furthermore, the “information processing apparatus” is recited at a high level of generality, and it is important to note that a general purpose computer that applies a judicial exception, such as an abstract idea, by use of conventional computer functions does not qualify as a particular machine (see MPEP 2106.05(b))); and
displaying, by the information processing apparatus, an image in which the plurality of pieces of first partial time-series data are combined (this additional element is interpreted as insignificant extra-solution activity, in particular post-solution activity of necessary data outputting (See MPEP 2106.05(g)). Furthermore, the “information processing apparatus” is recited at a high level of generality, and it is important to note that a general purpose computer that applies a judicial exception, such as an abstract idea, by use of conventional computer functions does not qualify as a particular machine (see MPEP 2106.05(b))).
Accordingly, these additional elements do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea.
The claims do not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above in Step 2A prong 2, with respect to integration of the abstract idea into a practical application, the additional elements amount to no more than mere instructions to apply the exception using a generic computer component (see MPEP 2106.05(b) and MPEP 2106.05(f)), in addition to insignificant extra solution activity (see MPEP 2106.05(g)).
Thus, Claim 1 is ineligible.
Regarding Claims 2 – 6, these claims recite further embellishments of the abstract idea discussed with respect to Claim 1, with the addition of performing additional manipulation of the data to transmit the result. This amounts to further evaluating, judging, and opinion, which are abstract ideas.
Regarding Claims 7 – 9, these claims recite additional characteristics in the analysis of data. This amounts to further evaluating and judging, which are abstract ideas.
Regarding Claims 10 – 15, these claims recite the output of the proposed item (the result of evaluating, judging, and opinion, which are mental processes), the output being on a display portion, which may be an interface, a generic computer component, and the displaying of the results of the analysis (See MPEP 2106.05(h)).
Claim 16 recites an information processing apparatus, thus a machine, one of the four statutory categories of patentable subject matter. However, Claim 16 further recites that this information processing apparatus performs precisely the method of Claim 1. As performance on a computer cannot integrate an abstract idea into a practical application nor provide significantly more than the abstract idea itself (See MPEP 2106.05(f)), Claim 16 is rejected as subject-matter ineligible for the reasons set forth in the above rejection of Claim 1.
Claim 17 recites “a method of displaying a plurality of types of physical quantities related to a state of a machine apparatus performing a repetitive operation,” thus a process, one of the four statutory categories of patentable subject matter.
However, Claim 17 further recites:
“displaying an image obtained by combining information related to a plurality of partial time-series data extracted from a plurality of time series data of the physical quantities corresponding to events occurring in the machine apparatus during the repetitive operation,” which is an evaluation or judgment of data that can be performed in the human mind or with the use of a physical aid (e.g., pen and paper); for example, a person can review data and create a graph representing the data by using pen and paper. The limitation thus falls within the mental process grouping of abstract ideas.
Claim 17 thus recites an abstract idea.
The claim does not include any additional element which integrates the abstract idea into a practical application.
Thus, the claim is directed to the abstract idea.
Claim 18 recites “a display method of displaying a plurality of types of physical quantities related to a state of a machine apparatus performing repetitive operation continuously,” thus a process, one of the four statutory categories of patentable subject matter.
However, Claim 18 further recites:
“displaying an image obtained by combining information related to a plurality of partial time-series data extracted from a plurality of time series data of the physical quantities corresponding to events occurring in the machine apparatus during the repetitive operation,” which is an evaluation or judgment of data that can be performed in the human mind or with the use of a physical aid (e.g., pen and paper); for example, a person can review data and create a graph representing the data by using pen and paper. The limitation thus falls within the mental process grouping of abstract ideas.
Claim 18 thus recites an abstract idea.
The claim does not include any additional element which integrates the abstract idea into a practical application.
Thus, the claim is directed to the abstract idea.
Claim 19 recites a computer-readable non-transitory recording medium storing a program that causes a computer to execute the information processing method according to claim 1. As performance on a computer cannot integrate an abstract idea into a practical application nor provide significantly more than the abstract idea itself (See MPEP 2106.05(f)), Claim 19 is rejected as subject-matter ineligible for reasons set forth in the rejection of Claim 1.
Claim 20 recites “a method of manufacturing products” thus a process, one of the four statutory categories of patentable subject matter.
However, Claim 20 further recites:
“acquiring, by the information processing apparatus according to claim 16 (See the above rejection of Claim 16), the time-series data when the machine apparatus performs operations for manufacturing products; and displaying, by the information processing apparatus according to claim 16 (see the above rejection of Claim 16), the image,” which is an evaluation or judgment of data that can be performed in the human mind or with the use of a physical aid (e.g., pen and paper); for example, a person can review data and create a graph representing the data by using pen and paper. The limitation thus falls within the mental process grouping of abstract ideas.
Claim 20 thus recites an abstract idea.
The claim does not include any additional element which integrates the abstract idea into a practical application.
Thus, the claim is directed to the abstract idea.
Claim 21 recites “a method of acquiring learning data,” thus a process, one of the four statutory categories of patentable subject matter.
However, Claim 21 further recites:
“creating, by the information processing apparatus according to claim 16 (See the above rejection of Claim 16), the image; and displaying, by the information processing apparatus according to claim 16, the image for an operator to acquire learning data for creating a learned model that predicts a failure of the machine apparatus,” which is an evaluation or judgment of data that can be performed in the human mind or with the use of a physical aid (e.g., pen and paper); for example, a person can review data and create a graph representing the data by using pen and paper. The limitation thus falls within the mental process grouping of abstract ideas.
Claim 21 thus recites an abstract idea.
The claim does not include any additional element which integrates the abstract idea into a practical application.
Thus, the claim is directed to the abstract idea.
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.
Claims 1 – 21 are rejected under 35 U.S.C. 102(a)(1)/102(a)(2) as being anticipated by Nakamura (hereinafter, Nakamura).
Regarding Claim 1, Nakamura teaches an information processing method (See Nakamura’s Abstract) comprising:
acquiring, by an information processing apparatus, a plurality of time-series data of a plurality of types of physical quantity related to a state of a machine apparatus performing a repetitive operation (Nakamura in par 0033 – 0037 and Fig. 1, teaches that a time series data processing device 1 includes a first time-series data acquiring unit 2 and a second time-series data acquiring unit 8. Time-series data are strings of sensor data (physical quantities) indicating states of subject equipment that are sequentially observed over time by sensors. The first time-series data acquiring unit 2 acquires a plurality of time-series data accumulated in the control system. Nakamura in par 0039, further teaches that the first time-series data acquiring unit 2 acquires parameters to be used for generation of event information from the control system. Nakamura in par 0082 – 0083, further teaches that event waveforms may occur in accordance with a rule. Examples of the rule include a rule in units of a day such as “for ten minutes from midnight every day”, a rule of time and day on the calendar such as “8 PM on the second Friday of every month” (a rule that varies between four and five weeks), and a rule of the operation status such as “each time 1,000 products are manufactured”. In addition, the timing at which an event waveform occurs may change depending on the operating day of the subject equipment, the schedules of workers or working machinery, the suddenness of an event, and the like);
extracting, by the information processing apparatus, a plurality of pieces of first partial time-series data corresponding to an event occurring in the machine apparatus during the repetitive operation from the plurality of time-series data (Nakamura in par 0040 – 0041 and Fig. 1, further teaches that an event waveform extracting unit 3 extracts an event waveform from each of the time-series data acquired by the first time-series data acquiring unit 2. An event waveform is waveform data (partial string data) in time-series data, which are expected to be changed by an event occurred in the subject equipment); and
displaying, by the information processing apparatus, an image in which the plurality of pieces of first partial time-series data are combined (Nakamura in par 0084 and Fig. 4, further teaches that the time-series data illustrated in Fig. 4 repeat a gradual increase and a gradual decrease while vibrating throughout the year, and have event waveforms of upper and lower peaks with an amplitude of 5 or larger that occurred in the first half of each month).
Regarding Claim 2, Nakamura teaches the limitations contained in parent Claim 1. Nakamura further teaches:
wherein in the extracting, a plurality of pieces of second partial time-series data corresponding to each of the plurality of pieces of first partial time-series data is extracted (Nakamura in par 0048, further teaches that the second time-series data acquiring unit 8 acquires a plurality of time-series data from the control system of the subject equipment. The subject equipment is the same as that from which the first time-series data acquiring unit 2 acquired time-series data, but the time-series data acquired by the second time-series data acquiring unit 8 are those observed by the sensor provided in the subject equipment when determining abnormality of the subject equipment), and
wherein, in the displaying, each of the plurality of pieces of first partial time-series data and each of the plurality of pieces of second partial time-series data is displayed in a corresponding manner in the image (Nakamura in par 0173 – 0177 and Fig. 12, teaches that figure 12 is a diagram illustrating an example of an information editing screen presented by the presentation unit 12. The screen includes graphs of the time-series data. The presentation unit 12 may highlight the event waveforms extracted by the event waveform extracting unit 3 and the event waveforms occurring at the same time among the time-series data included in the group in the graphs of the time-series data included in the group).
Regarding Claim 3, Nakamura teaches the limitations contained in parent Claim 1. Nakamura further teaches:
wherein in the extracting, the plurality of pieces of first partial time-series data is extracted depending on event data related to the event (Nakamura in par 0040 – 0041 and Fig. 1, further teaches that an event waveform extracting unit 3 extracts an event waveform from each of the time-series data acquired by the first time-series data acquiring unit 2. An event waveform is waveform data in time-series data, which are expected to be changed by an event occurred in the subject equipment. Nakamura in par 0119 – 0120 and Fig. 3, further teaches that the event waveform list information includes the starting times and the ending times of event waveforms generated by the event waveform extracting unit 3. Subsequently, the event information generating unit 6 determines the time at which event waveforms occur at the same time among the time-series data included in each group, and generates event information identifying an event related to the event waveforms on the basis of the determined time (step ST6). The process in step ST6 is performed for each group by the event information generating unit 6).
Regarding Claim 4, Nakamura teaches the limitations contained in parent Claim 3. Nakamura further teaches:
further comprising obtaining the event data relating to a plurality of types of events occurring in the machine apparatus (Nakamura in par 0033 – 0037 and Fig. 1, teaches that a time series data processing device 1 includes a first time-series data acquiring unit 2 and a second time-series data acquiring unit 8. Time-series data are strings of sensor data (physical quantities) indicating states of subject equipment that are sequentially observed over time by sensors. Examples of the subject equipment include equipment in a plant such as a power plant, a chemical plant, or a water and sewerage plant, air conditioning equipment, electrical equipment, lighting equipment, and plumbing equipment in a building or a factory. Furthermore, the subject equipment may be equipment in a production line of a factory, or equipment of an automobile or a railroad vehicle, or may be equipment of an information system related to economics or management. The first time-series data acquiring unit 2 acquires a plurality of time-series data accumulated in the control system. Nakamura in par 0039, further teaches that the first time-series data acquiring unit 2 acquires parameters to be used for generation of event information from the control system),
wherein, in the extracting, the plurality of pieces of first partial time-series data is extracted relating to a plurality of types of predetermined events selected from the plurality of types of events (Nakamura in par 0201 – 0204, teaches that the event information storing unit 7 may store event information for each of time-series data. In addition, event conditions that are common within a group, such as the time of occurrence of an event, may be determined together for each group and stored for each time-series data in the event information storing unit 7. A plurality of events may relate to one time-series data. Event conditions related to a first one of the events are estimated by the event information generating unit 6, and the estimated event conditions are determined by the editing unit 14. Event conditions related to second and subsequent events may be determined through similar procedures), and
wherein, in the displaying, an information related to the plurality of pieces of first partial time-series data on the plurality of types of predetermined events is arranged in the image (Nakamura in par 0173 – 0177 and Fig. 12, teaches that figure 12 is a diagram illustrating an example of an information editing screen presented by the presentation unit 12. The screen includes graphs of the time-series data. The presentation unit 12 may highlight the event waveforms extracted by the event waveform extracting unit 3 and the event waveforms occurring at the same time among the time-series data included in the group in the graphs of the time-series data included in the group. In the example of FIG. 12, waveforms (peaks) indicated by arrows in the graph of the time-series data A do not occur in the time-series data B and C, and are thus event waveforms that do not occur at the same time among the time-series data. In contrast, the peaks enclosed by broken lines are event waveforms occurring at the same time in the time-series data A to C. Such highlighting as enclosing with broken lines facilitates visual recognition of the event waveforms occurring at the same time in the time-series data A to C).
Regarding Claim 5, Nakamura teaches the limitations contained in parent Claim 3. Nakamura further teaches:
wherein the image contains information on the event (Nakamura in par 0189 – 0192, teaches that FIG. 13 is a diagram illustrating an example of the detailed group information screen. As illustrated in FIG. 13, the detailed group information screen displays an event waveform list A3 and further displays a button d for determining an event condition. The event waveform list A3 is a list of event waveforms included in the time-series data A. For example, in the event waveform list A3, the starting times and the ending times of the event waveforms, the descriptive statistics of the event waveforms such as the durations, the maximum amplitudes, and the average values and the standard deviations of upward and downward variations, the frequencies, and the types of the event waveforms are displayed).
Regarding Claim 6, Nakamura teaches the limitations contained in parent Claim 3. Nakamura further teaches:
wherein the event is set depending on a peak of the physical quantity (Nakamura in par 0077 – 0079, further teaches that examples of the pattern of change of partial string data constituting an event waveform include a change pattern in which the data value suddenly increases or decreases or a change pattern in which a sudden increase and a sudden decrease of the data value are repeated with peaks).
Regarding Claim 7, Nakamura teaches the limitations contained in parent Claim 6. Nakamura further teaches:
wherein the event is set when a value of the peaks becomes equal to or larger than a predetermined threshold and/or when a number of the peaks becomes equal to or larger than a predetermined number (Nakamura in par 0084 and Fig. 4, teaches that FIG. 4 is a graph illustrating an example of time-series data, in which time-series data that are observed from subject equipment throughout a year are illustrated. The time-series data illustrated in FIG. 4 repeat a gradual increase and a gradual decrease while vibrating throughout the year, and have event waveforms of upper and lower peaks with an amplitude of 5 or larger that occurred in the first half of each month). Accordingly, as shown in Fig. 4, the events are set when a peak has an amplitude of 5 or larger.
Regarding Claim 8, Nakamura teaches the limitations contained in parent Claim 3. Nakamura further teaches:
wherein the event data is data on a date and time at which the event occurred (Nakamura in par 0091 and Fig. 6, further teaches that FIG. 6 is a table illustrating an example of information output by the event waveform extracting unit 3, in which specific contents of event waveform list information are illustrated. The list information illustrated in FIG. 6 includes, in addition to the starting times and the ending times of event waveforms, the durations of the event waveforms, the maximum amplitudes, averages, and standard deviations, which are descriptive statistics, and the frequencies thereof).
Regarding Claim 9, Nakamura teaches the limitations contained in parent Claim 1. Nakamura further teaches:
wherein the time-series data from which the pieces of partial time-series data have still not been extracted are located on a linear scale that represents time as an index (Nakamura in par 0084 and Fig. 4, teaches that FIG. 4 is a graph illustrating an example of time-series data, in which time-series data that are observed from subject equipment throughout a year are illustrated. The time-series data illustrated in FIG. 4 repeat a gradual increase and a gradual decrease while vibrating throughout the year, and have event waveforms of upper and lower peaks with an amplitude of 5 or larger that occurred in the first half of each month). As shown in figure 4, the graph represents time as an index.
Regarding Claim 10, Nakamura teaches the limitations contained in parent Claim 1. Nakamura further teaches:
wherein the image is displayed on a display portion (Nakamura in par 0055 and Fig. 11, teaches that the output unit 11 outputs the result of abnormality determination performed by the determining unit 10. For example, the output unit 11 may provide a visual output on a display connected with the time-series data processing device 1 or may provide an auditory output from a loudspeaker connected with the time-series data processing device 1).
Regarding Claim 11, Nakamura teaches the limitations contained in parent Claim 1. Nakamura further teaches:
wherein, in the displaying, the plurality of pieces of first partial time-series data are arranged at a distance from each other in the image (Nakamura in par 0173 – 0177 and Fig. 12, teaches that figure 12 is a diagram illustrating an example of an information editing screen presented by the presentation unit 12. The screen includes graphs of the time-series data. The presentation unit 12 may highlight the event waveforms extracted by the event waveform extracting unit 3 and the event waveforms occurring at the same time among the time-series data included in the group in the graphs of the time-series data included in the group. In the example of FIG. 12, waveforms (peaks) indicated by arrows in the graph of the time-series data A do not occur in the time-series data B and C, and are thus event waveforms that do not occur at the same time among the time-series data. In contrast, the peaks enclosed by broken lines are event waveforms occurring at the same time in the time-series data A to C. Such highlighting as enclosing with broken lines facilitates visual recognition of the event waveforms occurring at the same time in the time-series data A to C).
Regarding Claim 12, Nakamura teaches the limitations contained in parent Claim 1. Nakamura further teaches:
wherein the image contains an input area in which an operator puts information (Nakamura in par 0197 and Fig. 12, teaches that the user can select an event waveform related to an actual event from the event waveform list A3 by using the input device 206 and use the selected event waveform for generation of event information. The histograms A4, the band model A5, and the statistic ranges A6, associated with the event waveform selected from the event waveform list A3 are presented by the presentation unit 12).
Regarding Claim 13, Nakamura teaches the limitations contained in parent Claim 12. Nakamura further teaches:
wherein the input area allows a label to be set, and the label indicates a category of the plurality of pieces of first partial time-series data (Nakamura in par 0063, Fig. 1 and Fig. 2A, teaches that the display 208 receives input of information via the display IF 204, and displays the input information. The output unit 11 illustrated in FIG. 1 has a function of displaying, on the display 208, a result of abnormality determination performed by the determining unit 10. Nakamura in par 0197 and Fig. 12, further teaches that when the user inputs set values for the histograms A4, the band model A5, and the statistic ranges A6 by using the input device 206, the values are received by the operation inputting unit 13 and output to the presentation unit 12 and the editing unit 14).
Regarding Claim 14, Nakamura teaches the limitations contained in parent Claim 1. Nakamura further teaches:
wherein, in the displaying, the plurality of pieces of first partial time-series data is displayed on the image regardless of a time axis in the time series data before the plurality of pieces of first partial time-series data is extracted (Nakamura in par 0176 – 0177 and Fig. 12, teaches that the presentation unit 12 may highlight the event waveforms extracted by the event waveform extracting unit 3 and the event waveforms occurring at the same time among the time-series data included in the group in the graphs of the time-series data included in the group. In the example of FIG. 12, waveforms (peaks) indicated by arrows in the graph of the time-series data A do not occur in the time-series data B and C, and are thus event waveforms that do not occur at the same time among the time-series data. In contrast, the peaks enclosed by broken lines are event waveforms occurring at the same time in the time-series data A to C. Such highlighting as enclosing with broken lines facilitates visual recognition of the event waveforms occurring at the same time in the time-series data A to C. Nakamura in par 0190 and Fig. 13, further teaches that the time-series data list A1 is a list of time-series data included in a group selected from the group list illustrated in FIG. 12. The graph A2 of the time-series data A is a graph of the time-series data A selected from the time-series data list A1, and in FIG. 13, event waveforms (peaks) occurring at the same time among the time-series data A and the time-series data B and C are highlighted by being enclosed by broken lines).
Regarding Claim 15, Nakamura teaches the limitations contained in parent Claim 1. Nakamura further teaches:
wherein, in the displaying, the plurality of pieces of first partial time-series data is displayed on the image according to a sampling number or a number of operation cycles of the repetitive operation (Nakamura in par 0173 – 0177 and Fig. 12, teaches that figure 12 is a diagram illustrating an example of an information editing screen presented by the presentation unit 12. The screen includes graphs of the time-series data. The presentation unit 12 may highlight the event waveforms extracted by the event waveform extracting unit 3 and the event waveforms occurring at the same time among the time-series data included in the group in the graphs of the time-series data included in the group).
Regarding Claim 16, this claim merely recites an information processing apparatus comprising a processing portion including an acquire portion, an extract portion, and a display portion, to perform the method as recited in Claim 1. Accordingly, Nakamura discloses/teaches every limitation of Claim 16, as indicated in the above rejection of Claim 1.
Regarding Claim 17, Nakamura teaches a method of displaying a plurality of types of physical quantities related to a state of a machine apparatus performing a repetitive operation (See Nakamura par 0082 – 0083 and 0173 – 0177 and Fig. 12), the method comprising:
displaying an image obtained by combining information related to a plurality of partial time-series data extracted from a plurality of time series data of the physical quantities corresponding to events occurring in the machine apparatus during the repetitive operation (Nakamura in par 0173 – 0177 and Fig. 12, teaches that figure 12 is a diagram illustrating an example of an information editing screen presented by the presentation unit 12. The screen includes graphs of the time-series data. The presentation unit 12 may highlight the event waveforms extracted by the event waveform extracting unit 3 and the event waveforms occurring at the same time among the time-series data included in the group in the graphs of the time-series data included in the group. In the example of FIG. 12, waveforms (peaks) indicated by arrows in the graph of the time-series data A do not occur in the time-series data B and C, and are thus event waveforms that do not occur at the same time among the time-series data. In contrast, the peaks enclosed by broken lines are event waveforms occurring at the same time in the time-series data A to C. Such highlighting as enclosing with broken lines facilitates visual recognition of the event waveforms occurring at the same time in the time-series data A to C).
Regarding Claim 18, Nakamura teaches a display method of displaying a plurality of types of physical quantities related to a state of a machine apparatus performing repetitive operation continuously (See Nakamura par 0082 – 0083 and 0173 – 0177 and Fig. 12), the method comprising:
displaying an image obtained by combining information related to a plurality of partial time-series data extracted from a plurality of time series data of the physical quantities corresponding to events occurring in the machine apparatus during the repetitive operation (Nakamura in par 0173 – 0177 and Fig. 12, teaches that figure 12 is a diagram illustrating an example of an information editing screen presented by the presentation unit 12. The screen includes graphs of the time-series data. The presentation unit 12 may highlight the event waveforms extracted by the event waveform extracting unit 3 and the event waveforms occurring at the same time among the time-series data included in the group in the graphs of the time-series data included in the group. In the example of FIG. 12, waveforms (peaks) indicated by arrows in the graph of the time-series data A do not occur in the time-series data B and C, and are thus event waveforms that do not occur at the same time among the time-series data. In contrast, the peaks enclosed by broken lines are event waveforms occurring at the same time in the time-series data A to C. Such highlighting as enclosing with broken lines facilitates visual recognition of the event waveforms occurring at the same time in the time-series data A to C).
Regarding Claim 19, this claim merely recites a computer-readable non-transitory recording medium storing a program that causes a computer to execute the information processing method according to claim 1. Accordingly, Nakamura discloses/teaches Claim 19, as indicated in the above rejection of Claim 1.
Regarding Claim 20, Nakamura teaches a method of manufacturing products (See Nakamura’s par 0082 – 0083), comprising:
acquiring, by the information processing apparatus according to claim 16, the time-series data when the machine apparatus performs operations for manufacturing products (See the above rejections of Claim 1 and Claim 16, furthermore Nakamura in par 0033 – 0037 and Fig. 1, teaches that a time series data processing device 1 includes a first time-series data acquiring unit 2 and a second time-series data acquiring unit 8. Time-series data are strings of sensor data (physical quantities) indicating states of subject equipment that are sequentially observed over time by sensors. The first time-series data acquiring unit 2 acquires a plurality of time-series data accumulated in the control system. Nakamura in par 0039, further teaches that the first time-series data acquiring unit 2 acquires parameters to be used for generation of event information from the control system. Nakamura in par 0082 – 0083, further teaches that event waveforms may occur in accordance with a rule. Examples of the rule include a rule in units of a day such as “for ten minutes from midnight every day”, a rule of time and day on the calendar such as “8 PM on the second Friday of every month” (a rule that varies between four and five weeks), and a rule of the operation status such as “each time 1,000 products are manufactured”. In addition, the timing at which an event waveform occurs may change depending on the operating day of the subject equipment, the schedules of workers or working machinery, the suddenness of an event, and the like); and
displaying, by the information processing apparatus according to claim 16, the image (See the above rejections of Claim 1 and Claim 16, furthermore Nakamura in par 0173 – 0177 and Fig. 12, teaches that figure 12 is a diagram illustrating an example of an information editing screen presented by the presentation unit 12. The screen includes graphs of the time-series data. The presentation unit 12 may highlight the event waveforms extracted by the event waveform extracting unit 3 and the event waveforms occurring at the same time among the time-series data included in the group in the graphs of the time-series data included in the group).
Regarding Claim 21, Nakamura teaches a method of acquiring learning data (See Nakamura’s par 0003, 0013 and 0035) comprising:
creating, by the information processing apparatus according to claim 16, the image (See the above rejection for Claims 1 and 16, furthermore Nakamura in par 0176 – 0177 and Fig. 12, teaches that the presentation unit 12 may highlight the event waveforms extracted by the event waveform extracting unit 3 and the event waveforms occurring at the same time among the time-series data included in the group in the graphs of the time-series data included in the group. In the example of FIG. 12, waveforms (peaks) indicated by arrows in the graph of the time-series data A do not occur in the time-series data B and C, and are thus event waveforms that do not occur at the same time among the time-series data. In contrast, the peaks enclosed by broken lines are event waveforms occurring at the same time in the time-series data A to C. Such highlighting as enclosing with broken lines facilitates visual recognition of the event waveforms occurring at the same time in the time-series data A to C. Nakamura in par 0190 and Fig. 13, further teaches that the time-series data list A1 is a list of time-series data included in a group selected from the group list illustrated in FIG. 12. The graph A2 of the time-series data A is a graph of the time-series data A selected from the time-series data list A1, and in FIG. 13, event waveforms (peaks) occurring at the same time among the time-series data A and the time-series data B and C are highlighted by being enclosed by broken lines); and
displaying, by the information processing apparatus according to claim 16, the image for an operator to acquire learning data for creating a learned model that predicts a failure of the machine apparatus (See the above rejection for Claims 1 and 16, furthermore Nakamura in par 0132, teaches that the event information generating unit 6 estimates an event condition that an event related to event waveforms occurs at “1 AM on the second Tuesday of each month” on the basis of analysis results of the histograms. In this manner, the time at which an event occurs is statistically estimated. Nakamura in par 0190 and Fig. 13, further teaches that the time-series data list A1 is a list of time-series data included in a group selected from the group list illustrated in FIG. 12. The graph A2 of the time-series data A is a graph of the time-series data A selected from the time-series data list A1, and in FIG. 13, event waveforms (peaks) occurring at the same time among the time-series data A and the time-series data B and C are highlighted by being enclosed by broken lines). Nakamura in par 0195 – 0197 and Fig. 12, further teaches that the band model A5 is the band model of the event waveforms. In addition, the ranges of various statistic amounts displayed in the event waveform list A3 are set in the statistic ranges A6. The band model A5 and the statistic ranges A6 are information generated by the event information generating unit 6 by using the group list information. The button d for determining an event condition is a button for determining an event condition resulting from editing. When the user inputs set values for the histograms A4, the band model A5, and the statistic ranges A6 by using the input device 206, the values are received by the operation inputting unit 13 and output to the presentation unit 12 and the editing unit 14).
Response to Arguments
Applicant's arguments filed 10/15/2025 have been fully considered but they are not persuasive.
Regarding Claim interpretation under 35 U.S.C. 112(f):
(1) Applicant submits that the claim language does not invoke section 112(f).
The examiner respectfully disagrees.
This application includes one or more claim limitations that do not use the word “means,” but are nonetheless being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, because the claim limitation(s) uses a generic placeholder that is coupled with functional language without reciting sufficient structure to perform the recited function and the generic placeholder is not preceded by a structural modifier. Such claim limitation(s) is/are: “acquire portion is configured to acquire…”, “extract portion is configured to extract…” and “display portion is configured to display…” in claim 16.
Furthermore, the disclosure is devoid of any structure that performs the function in the claim. Additionally, the specification fails to describe any structure associated with the processing portion.
Claim Rejections Under 35 U.S.C. 101
(2) Applicant submits that Claims 20 and 21 incorporate the apparatus recited in Claim 16, which is subject matter eligible. The section 101 rejections should, therefore, be withdrawn.
The Examiner respectfully disagrees.
Performance on a computer cannot integrate an abstract idea into a practical application nor provide significantly more than the abstract idea itself (See MPEP 2106.05(f)).
Claim Rejections Under 35 U.S.C. 102
(3) Applicant argues that Nakamura does not teach or suggest the combination of features of amended Claim 1, including “extracting, by the information processing apparatus, a plurality of pieces of first partial time-series data corresponding to an event occurring in the machine apparatus during the repetitive operation from the plurality of time-series data” and “displaying, by the information processing apparatus, an image in which the plurality of pieces of first time-series data are combined”.
The examiner respectfully disagrees.
Nakamura in par 0033 – 0037 and Fig. 1, teaches that a time series data processing device 1 includes a first time-series data acquiring unit 2 and a second time-series data acquiring unit 8. Nakamura in par 0040 – 0041 and Fig. 1, further teaches that an event waveform extracting unit 3 extracts an event waveform from each of the time-series data acquired by the first time-series data acquiring unit 2. An event waveform is waveform data (partial string data) in time-series data, which are expected to be changed by an event occurred in the subject equipment. Nakamura in par 0082 – 0083, further teaches that event waveforms may occur in accordance with a rule. Examples of the rule include a rule in units of a day such as “for ten minutes from midnight every day”, a rule of time and day on the calendar such as “8 PM on the second Friday of every month” (a rule that varies between four and five weeks), and a rule of the operation status such as “each time 1,000 products are manufactured”.
Accordingly, as correctly indicated by Applicant, Nakamura discloses that it is possible to identify event information that identifies the waveforms related to an event. Therefore, Nakamura teaches extracting a plurality of pieces of partial time-series data that correspond to an event occurring in a machine performing repetitive operations. Thus, Nakamura teaches or suggests “extracting, by the information processing apparatus, a plurality of pieces of first partial time-series data corresponding to an event occurring in the machine apparatus during the repetitive operation from the plurality of time-series data” as claimed.
Furthermore, the claim recites “displaying, by the information processing apparatus, an image in which the plurality of pieces of first time-series data are combined”. The claim as recited does not exclude the presentation of other data in addition to the first time-series data.
Nakamura in par 0084 and Fig. 4, further teaches that the time-series data illustrated in Fig. 4 repeat a gradual increase and a gradual decrease while vibrating throughout the year, and have event waveforms of upper and lower peaks with an amplitude of 5 or larger that occurred in the first half of each month.
Accordingly, as shown in Fig. 4, Nakamura displays the first time-series data associated with the first half of each month throughout the year. Nakamura therefore teaches or suggests “displaying, by the information processing apparatus, an image in which the plurality of pieces of first time-series data are combined” as claimed.
Applicant's remaining arguments with respect to the claims are substantially encompassed in the arguments addressed above; therefore, the Examiner responds with the same rationale.
For at least the foregoing reasons, the Examiner maintains the prior art rejections.
Conclusion
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to ARIEL MERCADO VARGAS whose telephone number is (571)270-1701. The examiner can normally be reached M-F 8:00am - 4:00pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Scott Baderman can be reached at 571-272-3644. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/ARIEL MERCADO-VARGAS/Primary Examiner, Art Unit 2118