Detailed Action
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Status of the Application and Claims
This action is in reply to the application filed on 9/13/2024.
This communication is the first action on the merits.
The information disclosure statements (IDS) filed on 12/12/2024 and 12/15/2025 have been acknowledged and considered by the Examiner.
Claims 1-7 are currently pending and have been examined.
Claim Objections
Claim 4 is objected to because of the following informality.
Claim 4 recites “…using a work took…”. This appears to be a typographical error, since claim 3 recites “work tool”. Appropriate correction is required.
Claim Rejections – 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1-7 are rejected under 35 U.S.C. 101 because the claimed invention is directed to non-statutory subject matter.
Claim 1 (and similarly claim 7) recites, “A… to perform operations to output analysis result information on a work performance status of a worker based on video recordings of actions of the worker,
wherein the … is configured to:
based on the video recordings, recognize one or more target objects to be handled by the worker and determine types of the target objects, while recognizing one or more actions of the worker, the actions forming a work, and determining types of the actions; and
output analysis result display information which visualizes the analysis result information, and is a time series representation of a status of the actions associated with each of the types of the target objects.”
Analyzing under Step 2A, Prong 1:
The limitations of “…based on the video recordings, recognize one or more target objects to be handled by the worker and determine types of the target objects, while recognizing one or more actions of the worker, the actions forming a work, and determining types of the actions; and output analysis result display information which visualizes the analysis result information, and is a time series representation of a status of the actions associated with each of the types of the target objects…”, under the broadest reasonable interpretation, cover a human using their mind, or using pen and paper, to perform the above-identified steps; therefore, the claims are directed to a mental process.
Further, the limitations of “…based on the video recordings, recognize one or more target objects to be handled by the worker and determine types of the target objects, while recognizing one or more actions of the worker, the actions forming a work, and determining types of the actions; and output analysis result display information which visualizes the analysis result information, and is a time series representation of a status of the actions associated with each of the types of the target objects…” describe managing a human worker performing work, which is managing interactions between people; therefore, the claims are also directed to certain methods of organizing human activity.
Accordingly, the claims are directed to a mental process and to certain methods of organizing human activity, and thus are directed to an abstract idea under the first prong of Step 2A.
Analyzing under Step 2A, Prong 2:
This judicial exception is not integrated into a practical application under the second prong of Step 2A.
In particular, the claims recite the additional elements beyond the recited abstract idea identified under Step 2A, Prong 1, such as:
Claim 1, 7: work analyzing device in which a processor is caused, processor
Claim 2: outputs the video recordings as a video image that allows for playback
Pursuant to the broadest reasonable interpretation, and considered as an ordered combination, each of these additional elements is a computing element recited at a high level of generality implementing the abstract idea, and thus amounts to no more than applying the abstract idea with generic computer components.
Further, these additional elements generally link the abstract idea to a technical environment, namely the environment of a computer.
Additionally, with respect to “…based on the video recordings, recognize…”, “…identifies…”, “…detecting…”, and “…output analysis result display information…”, these elements do not add meaningful limitations that integrate the abstract idea into a practical application because they constitute insignificant extra-solution activity, i.e., pre- and post-solution activity: data gathering (“…based on the video recordings, recognize…”, “…identifies…”, “…detecting…”) and data output (“…output analysis result display information…”).
Analyzing under Step 2B:
The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception under Step 2B.
As noted above, the aforementioned additional elements beyond the recited abstract idea are not sufficient to amount to significantly more than the recited abstract idea because, as an ordered combination, the additional elements are no more than mere instructions to implement the idea using generic computer components (i.e., “apply it”).
Additionally, as an ordered combination, the additional elements append the recited abstract idea to well-understood, routine, and conventional activities in the field, as individually evinced by the applicant’s own disclosure, as required by the Berkheimer Memo, in at least:
[0003] Known such technologies for analyzing a work performance status of a worker include a method which involves capturing video recordings of actions of a worker with a camera; analyzing, from the video recordings, a work performance status of a worker; and presenting to a user a Gantt chart that visualizes a status of the worker's actions in each process step and/or a video image of the working worker during work (Patent Document 1).
[0026] Figure 1 is a diagram showing an overall configuration of a work analyzing system according to an embodiment of the present disclosure.
[0027] The work analyzing system is configured to analyze a work performance status of a worker in a workplace such as a factory to thereby present a result of analysis to a user. The system includes a camera 1, a recorder 2, a server 3 (work analyzing device), and an administrator terminal 4.
[0028] The camera 1 images a worker working at the worker's place.
[0029] The recorder 2 records video data provided from the camera 1.
[0030] The server 3 acquires video recordings from the camera 1 and/or the recorder 2, analyzes a work performance status of a worker based on video recordings, and outputs a result of the analysis. The server 3 is installed in a facility (e.g., a factory) and performs work analysis for the facility. The server 3 may also be configured as a cloud computer and perform work analysis for a plurality of facilities.
[0031] The administrator terminal 4 is used by a system administrator or a work administrator (user), and is implemented by a PC, a tablet terminal, or any other suitable device. The system administrator uses the administrator terminal 4 to perform settings for various operations performed by the server 3. In addition, the administrator terminal 4 displays results of analysis provided from the server 3 so that the work administrator can view the results.
[0032] Next, a schematic configuration of the server 3 will be described. Figure 2 is a block
diagram showing a schematic configuration of the server 3.
[0033] The server 3 includes a video input device 11, a screen output device 12, a storage 13, and a processor 14.
[0034] The video input device 11 receives video recordings recorded in the recorder 2.
[0035] The screen output device 12 outputs data of an analysis result display screen generated by the processor 14, and the analysis result display screen (Figure 6) is displayed on the administrator terminal 4.
[0036] The storage 13 stores programs which are executable by a processor 14 and other data. The storage 13 stores video recordings for analysis acquired from the recorder 2. The storage 13 stores an object action recognition model (object action estimation model) to be used by the processor 14, as well as an analysis result video and analysis result information generated by the processor 14.
[0037] The processor 14 performs various operations by executing programs stored in the storage 13. In the present embodiment, the processor 14 performs an analysis operation P1, an analysis result visualization operation P2, and other operations.
[0038] In the analysis operation P1, the processor 14 makes analysis of a work performance status of a worker based on video recordings of actions of the worker to generate analysis result information.
[0039] In the analysis result visualization operation P2, the processor 14 visualizes the analysis result information generated in the analysis operation P1 and presents the visualized information to a user. In the present embodiment, the processor 14 generates an analysis result display screen (Figure 6).
[0040] Next, the analysis operation P1 performed by the server 3 will be described. Figure 3 is a block diagram showing an outline of the analysis operation P1 performed by the server 3.
[0041] In the server 3, the processor 14 performs an analysis operation P1. The analysis operation P1 includes a video acquisition operation P11, an object action recognition operation P12, an analysis result video generation operation P13, a work cycle detection operation P14, an abnormality detection operation P15, and an output operation P16.
[0042] In the video acquisition operation P11, the processor 14 acquires video recordings to be analyzed from the storage 13.
[0089] While specific embodiments of the present disclosure are described herein for illustrative purposes, the present disclosure is not limited to those specific embodiments. Various changes, substitutions, additions, and omissions may be made to elements of the embodiments without departing from the scope of the present disclosure. Moreover, elements and features of the different embodiments may be combined with each other to yield another embodiment of the present disclosure.
Furthermore, as an ordered combination, these elements amount to generic computer components receiving or transmitting data over a network, performing repetitive calculations, electronic record keeping, and storing and retrieving information in memory, which, as held by the courts, are well-understood, routine, and conventional. See MPEP 2106.05(d).
Moreover, the remaining elements of dependent claims do not transform the recited abstract idea into a patent eligible invention because these remaining elements merely recite further abstract limitations that provide nothing more than simply a narrowing of the abstract idea recited in the independent claims.
Looking at these limitations as an ordered combination adds nothing additional that is sufficient to amount to significantly more than the recited abstract idea because they simply provide instructions to use a generic arrangement of generic computer components to “apply” the recited abstract idea, perform insignificant extra-solution activity, and generally link the abstract idea to a technical environment. Thus, the elements of the claims, considered both individually and as an ordered combination, are not sufficient to ensure that the claim as a whole amounts to significantly more than the abstract idea itself. Since there are no limitations in these claims that transform the exception into a patent eligible application such that these claims amount to significantly more than the exception itself, claims 1-7 are rejected under 35 U.S.C. 101 as being directed to non-statutory subject matter.
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale or otherwise available to the public before the effective filing date of the claimed invention.
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.
Claims 1-7 are rejected under 35 U.S.C. 102 as being anticipated by JP Patent Publication JP2019016226A to Kudo et al. (hereinafter referred to as “Kudo”).
As per Claim 1, Kudo teaches: A work analyzing device in which a processor is caused to perform operations to output analysis result information on a work performance status of a worker based on video recordings of actions of the worker,
wherein the processor is configured to: ([0011]-[0019])
based on the video recordings, recognize one or more target objects to be handled by the worker and determine types of the target objects, while recognizing one or more actions of the worker, the actions forming a work, and determining types of the actions; and (in at least [0011] The work data management system W shown in FIG. 1 is a system for managing data (referred to as work data) relating to work at a work site. In the present embodiment, in order to manufacture a prescribed product, a state in which an operator repeatedly performs a predetermined operation (processing of a member, etc.) is photographed and analysis and display of work are performed based on the imaging result There. The above-mentioned member may be substantially the same as the finished product, or it may be a component of a finished product. [0032] For example, in the detailed process A1 shown in FIG. 4, the process determining unit 16c (see FIG. 2) performs the following process. That is, when the large step γ is "processing" and the event E 2 occurs within 10 seconds after the occurrence of the event E 1, and the state satisfying both of these continues for more than 1 second, the process determining means 16 c , It is determined that "the worker is setting a member in the processing machine". As described above, the detailed process classifying means 16 (see FIG. 2) classifies the work performed by the worker at the work site into a plurality of time-series detailed processes based on the work data including the image data of the work site. Further, the detailed process classifying means 16 specifies the detailed process performed by the operator based on the change in the position of the predetermined part (head and hand) on the body of the worker. [0070] the large process performed by the worker is subdivided into a plurality of time-series detailed processes on the basis of the video data of the work site and the like as the Gantt chart P1 to the result display means 26 (See FIG. 
6A). As a result, the user (administrator) can easily grasp how long it took in each detailed process. In addition, it is unnecessary for an operator to attach a position sensor (not shown) or an acceleration sensor (not shown), thereby saving labor on the worker side. [0093] For example, the information processing apparatus 100Ba analyzes the work of processing one product repeatedly by a worker, and the information processing apparatus 100Bb analyzes the work of another worker repeating the assembly of the product after the processing. That is, when a plurality of types of work are sequentially performed, when a predetermined product is manufactured, each of the plurality of types of work is a large process including a plurality of time-series detailed processes, and different workers are in charge. In the present embodiment, the information processing apparatuses 100 Ba and 100 Bb individually analyze the operations of a plurality of workers performing such flow work (processing and assembly), and the analysis results are integrated by the data integration apparatus 200. [0094] The work site as the measurement target (imaging target) of the first measurement device G 1 and the work site as the measurement target (imaging target) of the second measurement device G 2 shall not be far apart.)
output analysis result display information which visualizes the analysis result information, and is a time series representation of a status of the actions associated with each of the types of the target objects. (in at least [0032] For example, in the detailed process A1 shown in FIG. 4, the process determining unit 16c (see FIG. 2) performs the following process. That is, when the large step γ is "processing" and the event E 2 occurs within 10 seconds after the occurrence of the event E 1, and the state satisfying both of these continues for more than 1 second, the process determining means 16 c , It is determined that "the worker is setting a member in the processing machine". As described above, the detailed process classifying means 16 (see FIG. 2) classifies the work performed by the worker at the work site into a plurality of time-series detailed processes based on the work data including the image data of the work site. Further, the detailed process classifying means 16 specifies the detailed process performed by the operator based on the change in the position of the predetermined part (head and hand) on the body of the worker. [0071] Further, based on the statistical information of the work repeated by the operator in the past, in addition to the box beard chart P2 (see FIG. 6A) and the existence probability distributions P 9 and P 11, the flow line histories P 12 and P 13 (see FIGS. 6B and 6 C) Are appropriately displayed. Thus, the user can easily grasp the existence probability distribution P 9, the positional relationship between the head of the operator and the positional relationship between the existence probability distribution P 11 and the operator's hand. In this manner, by grasping the work time and the tendency of the worker's action in each detailed process, it is possible to improve product quality and productivity. [0106] FIG. 16 is a screen image generated by the data visualizing means 24B. 
It should be noted that illustration of the work deviation marker P 6 (see FIG. 6A) and the flow line variation abnormality marker P 7 (see FIG. 6A) shown in the first embodiment is omitted. When the production number selection button P 31 is selected by the user's operation via the second input means 25 (see FIG. 13), a list of the products to be worked in the past (that is, a list of production numbers) is displayed . In the example shown in FIG. 16, a Gantt chart P1 or the like concerning the processing / assembly of the product with the production number 0001 is displayed. [0107] Then, a Gantt chart P1 or the like integrating a plurality of kinds of work (processing / assembly) for the product selected by the production number selection button P31 is displayed. In this way, the data visualizing means 24 B displays a plurality of time-series detailed processes constituting each work (machining / assembly) as the Gantt chart P 1 on the result display means 26 (see FIG. 13).)
As per Claim 2, Kudo teaches: The work analyzing device as claimed in claim 1,
wherein the processor outputs the video recordings as a video image that allows for playback, in association with the analysis result display information. (in at least [0053] At the playback time shown in FIG. 6A, since the work is normally performed as described above, the head position marker P 8 exists in the existence probability distribution P 9 and the hand position marker P 10 exists in the existence probability distribution P 11 It is present in. In this way, while comparing the Gantt chart P1 and the box beard figure P2 with the existence probability distributions P9 and P11, the user (administrator) can visually grasp what kind of action the worker performed it can. [0120] W, WA, WB work data management system 100, 100 A, 100 Ba, 100 Bb information processing apparatus 11 measurement interface 12 measurement data storage means 13 feature quantity extraction means 14 first input means 15 large process setting means 16 detailed process classification means 16 a event judgment Means 16 b Event determination list 16 c Process determination unit 16 d Process determination list 17 Detailed process storage unit 18 Work time totalization unit 19 Work time storage unit 20, 20 A Work time statistical analysis unit 21 Work operation summary unit 22 Work operation storage unit 23 Work operation statistics Analysis means 24, 24 A data visualization means 25 second input means 26 result display means (display means) 27 production efficiency analysis means 27 a step completion prediction means 27 b large step production plan 27 c step progress analysis means 27 d policy list (storage means) 27 e policy Selection means 3 1, 32, 33, 34 External interface 200 Data integration device 201 External interface 202 Data management means 203 Integrated data storage means 204 Identical product judgment list G 1 First measurement device G 2 Second measurement device N Network P 1 Gantt chart P 2 box beard drawing P 5 Playback time bar (line showing time at playback) P6 Working 
deviation marker P7 Flow line variation abnormal marker P9, P11 Probability probability distribution P12, P13 Flow line history P21 line (line indicating planning completion time of large process) P22 line Line indicating predicted completion time of large process) Pa 1 element (element of Gantt chart))
As per Claim 3, Kudo teaches: The work analyzing device as claimed in claim 1,
wherein the processor identifies at least one of a component or a work tool as the type of the target objects. (in at least [0011] The work data management system W shown in FIG. 1 is a system for managing data (referred to as work data) relating to work at a work site. In the present embodiment, in order to manufacture a prescribed product, a state in which an operator repeatedly performs a predetermined operation (processing of a member, etc.) is photographed and analysis and display of work are performed based on the imaging result There. The above-mentioned member may be substantially the same as the finished product, or it may be a component of a finished product. [0032] For example, in the detailed process A1 shown in FIG. 4, the process determining unit 16c (see FIG. 2) performs the following process. That is, when the large step γ is "processing" and the event E 2 occurs within 10 seconds after the occurrence of the event E 1, and the state satisfying both of these continues for more than 1 second, the process determining means 16 c , It is determined that "the worker is setting a member in the processing machine". As described above, the detailed process classifying means 16 (see FIG. 2) classifies the work performed by the worker at the work site into a plurality of time-series detailed processes based on the work data including the image data of the work site. Further, the detailed process classifying means 16 specifies the detailed process performed by the operator based on the change in the position of the predetermined part (head and hand) on the body of the worker. [0110] In each of the embodiments, the example in which the work data management system W extracts the position of the head and hand of the worker as the feature amount has been described, but the present invention is not limited to this. For example, the position of a device or a tool (not shown) may be added. 
This makes it possible to further improve the determination accuracy of the detailed process. [0120] W, WA, WB work data management system 100, 100 A, 100 Ba, 100 Bb information processing apparatus 11 measurement interface 12 measurement data storage means 13 feature quantity extraction means 14 first input means 15 large process setting means 16 detailed process classification means 16 a event judgment Means 16 b Event determination list 16 c Process determination unit 16 d Process determination list 17 Detailed process storage unit 18 Work time totalization unit 19 Work time storage unit 20, 20 A Work time statistical analysis unit 21 Work operation summary unit 22 Work operation storage unit 23 Work operation statistics Analysis means 24, 24 A data visualization means 25 second input means 26 result display means (display means) 27 production efficiency analysis means 27 a step completion prediction means 27 b large step production plan 27 c step progress analysis means 27 d policy list (storage means) 27 e policy Selection means 3 1, 32, 33, 34 External interface 200 Data integration device 201 External interface 202 Data management means 203 Integrated data storage means 204 Identical product judgment list G 1 First measurement device G 2 Second measurement device N Network P 1 Gantt chart P 2 box beard drawing P 5 Playback time bar (line showing time at playback) P6 Working deviation marker P7 Flow line variation abnormal marker P9, P11 Probability probability distribution P12, P13 Flow line history P21 line (line indicating planning completion time of large process) P22 line Line indicating predicted completion time of large process) Pa 1 element (element of Gantt chart))
As per Claim 4, Kudo teaches: The work analyzing device as claimed in claim 3,
wherein the processor identifies at least one of an action of holding a component and an action of using a work took as the type of the actions of the worker. (in at [0032] For example, in the detailed process A1 shown in FIG. 4, the process determining unit 16c (see FIG. 2) performs the following process. That is, when the large step γ is "processing" and the event E 2 occurs within 10 seconds after the occurrence of the event E 1, and the state satisfying both of these continues for more than 1 second, the process determining means 16 c , It is determined that "the worker is setting a member in the processing machine". As described above, the detailed process classifying means 16 (see FIG. 2) classifies the work performed by the worker at the work site into a plurality of time-series detailed processes based on the work data including the image data of the work site. Further, the detailed process classifying means 16 specifies the detailed process performed by the operator based on the change in the position of the predetermined part (head and hand) on the body of the worker. [0110] In each of the embodiments, the example in which the work data management system W extracts the position of the head and hand of the worker as the feature amount has been described, but the present invention is not limited to this. For example, the position of a device or a tool (not shown) may be added. This makes it possible to further improve the determination accuracy of the detailed process. 
[0120] W, WA, WB work data management system 100, 100 A, 100 Ba, 100 Bb information processing apparatus 11 measurement interface 12 measurement data storage means 13 feature quantity extraction means 14 first input means 15 large process setting means 16 detailed process classification means 16 a event judgment Means 16 b Event determination list 16 c Process determination unit 16 d Process determination list 17 Detailed process storage unit 18 Work time totalization unit 19 Work time storage unit 20, 20 A Work time statistical analysis unit 21 Work operation summary unit 22 Work operation storage unit 23 Work operation statistics Analysis means 24, 24 A data visualization means 25 second input means 26 result display means (display means) 27 production efficiency analysis means 27 a step completion prediction means 27 b large step production plan 27 c step progress analysis means 27 d policy list (storage means) 27 e policy Selection means 3 1, 32, 33, 34 External interface 200 Data integration device 201 External interface 202 Data management means 203 Integrated data storage means 204 Identical product judgment list G 1 First measurement device G 2 Second measurement device N Network P 1 Gantt chart P 2 box beard drawing P 5 Playback time bar (line showing time at playback) P6 Working deviation marker P7 Flow line variation abnormal marker P9, P11 Probability probability distribution P12, P13 Flow line history P21 line (line indicating planning completion time of large process) P22 line Line indicating predicted completion time of large process) Pa 1 element (element of Gantt chart))
As per Claim 5, Kudo teaches: The work analyzing device as claimed in claim 1,
wherein, when detecting an abnormality in the work performance status of the worker, the processor outputs the analysis result display information that includes information on the detected abnormality. (in at least [0049] The reproduction time bar P 5 (described as "Now" on the upper side) shown in FIG. 6A is a line indicating the reproduction time of the video displayed in the measurement data display areas P 3 and P 4. When images are being reproduced in the measurement data display areas P 3 and P 4, the reproduction time bar P 5 moves to the right side of the screen. In this manner, when the data visualizing means 24 displays the video data on the result display means 26 as a moving picture, it causes the Gantt chart P 1 to superimpose and display a line (reproduction time bar P 5) indicating the time at the time of reproduction of the video data. The work deviation marker P 6 and the flow line variation abnormality marker P 7 displayed on the Gantt chart P 1 will be described later. [0050] The measurement data display areas P 3 and P 4 are areas for displaying images of a work site at a predetermined reproduction time, a presence probability distribution of head / hand positions, and the like. In the example shown in FIG. 6A, in the Gantt chart P 1, the reproduction time bar P 5 exists in the detailed step A 1. In the row of the chart of the detailed step A1, the work deviation marker P6 and the flow line variation abnormality marker P7 are not displayed, and the width of the box beard figure P2 is narrow. In this way, when the detailed step A1 is normally repeated, the data visualizing means 24 displays the measurement result (photographed result) of the first measuring device G1 in the measurement data display area P3 and the data visualizing means 24 of the second measuring device G2 And displays the measurement result (imaging result) in the measurement data display area P 4.)
As per Claim 6, Kudo teaches: The work analyzing device as claimed in claim 5, wherein the analysis result display information includes, as the information on the detected abnormality, at least one of
a highlighted display of fields for the recognized target objects, a display of a mark indicating a corresponding one of the actions associated with the abnormality, a display of a text indicating the detected abnormality, and a display of statistical information including a set of statistics collected over a prescribed period of time included in a period of time designated for the analysis result information. (in at least [0032] For example, in the detailed process A1 shown in FIG. 4, the process determining unit 16c (see FIG. 2) performs the following process. That is, when the large step γ is "processing" and the event E2 occurs within 10 seconds after the occurrence of the event E1, and the state satisfying both of these continues for more than 1 second, the process determining means 16c determines that "the worker is setting a member in the processing machine". As described above, the detailed process classifying means 16 (see FIG. 2) classifies the work performed by the worker at the work site into a plurality of time-series detailed processes based on the work data including the image data of the work site. Further, the detailed process classifying means 16 specifies the detailed process performed by the operator based on the change in the position of the predetermined part (head and hand) on the body of the worker. [0056] FIG. 6C shows a screen image when the reproduction time bar P5 is positioned in the detailed step A2. In the example shown in FIG. 6C, a flow line variation abnormality marker P7 is displayed in the row of the chart of the detailed process A2. The flow line variation abnormality marker P7 is a marker displayed in association with the element Pa1 of the Gantt chart P1 when the operation (i.e., flow line history) of the work repeated in the past fluctuates largely. Whether or not to display the flow line variation abnormality marker P7 is determined based on the variance value of the flow line history P12.
[0057] As described above, when the variance value of a plurality of flow line histories is equal to or larger than the predetermined threshold value in the predetermined detailed process, the data visualizing means 24 displays a flow line variation abnormality marker P7 on the element Pa1 of the Gantt chart P1 corresponding to this detailed process. [0058] In addition, when there is an abnormality in the flow line variation, the data visualizing means 24 displays, in the measurement data display areas P3 and P4, the image of the side of the worker's head/hand where the variance value of the flow line history exceeds the predetermined threshold. In the example shown in FIG. 6C, since the variance value of the flow line history of the hand exceeds the predetermined threshold value, the position marker P10 of the hand and the existence probability distribution P11 of the hand are superimposed and displayed together with the image of the foreground in the measurement data display area P3. In addition, in the measurement data display area P4, the position marker P10 of the hand and the flow line history P13 of the hand are superimposed and displayed together with the image of the foreground. By displaying the flow line history P13 in addition to the Gantt chart P1 or the like, the user can efficiently confirm the information on the flow line variation abnormality.)
As per Claim 7, directed to a method (see at least Kudo [0019]), it substantially recites the subject matter of Claim 1 and is rejected based on the same reasoning and rationale.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to PO HAN (Max) LEE whose telephone number is (571) 272-3821. The examiner can normally be reached on Monday - Thursday, 9 AM-6:30 PM.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Rutao Wu can be reached on (571) 272-6045. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/PO HAN LEE/Primary Examiner, Art Unit 3623