Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
DETAILED ACTION
The following NON-FINAL Office Action is in response to Application No. 18/271,163, filed 07/06/2023.
Priority
Receipt is acknowledged of papers submitted under 35 U.S.C. 119(a)-(d), which papers have been placed of record in the file.
The Examiner notes Applicant's claim to foreign priority based on an application filed 08/19/2021.
Status of Claims
Claims 1-16 are currently pending of which:
Claims 1-16 are currently under examination and have been rejected as follows.
IDS
The information disclosure statement filed on 07/06/2023 complies with the provisions of 37 CFR 1.97, 1.98 and MPEP § 609 and is considered by the Examiner.
-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1-15 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.
Claims 1-5 are directed to a method or process which is a statutory category.
Claims 6-10 are directed to an apparatus or machine which is a statutory category.
Claims 11-15 are directed to a non-transitory computer readable medium or article of manufacture which is a statutory category.
Step 2A Prong One: The claims recite, describe, or set forth a judicial exception of an abstract idea (see MPEP 2106.04(a)). Specifically, the claims recite, describe or set forth managing personal behavior, including: “identifying a… movement… which define a cycle of movements”, “determining a period of time between the identified first movement and the identified second movement to measure productivity”, “comparing the detected number of hands with an expected number of hands”, “performing an imputation of a movement of the missing hand”, and “wherein the imputation is performed by using an average historical position of a hand”, “averaging a plurality of sequences of positions of the hands… to generate the… sequence”. Managing personal behavior by monitoring productivity falls within the larger abstract grouping of Certain Methods of Organizing Human Activity (MPEP 2106.04(a)(2) II).
Additionally, the claims recite, describe, or set forth concepts performed in the human mind, including observation, evaluation, or judgment, such as: “identifying a… movement”, “detecting the number of hands”, “comparing the detected number of hands with an expected number of hands”, “performing an imputation of a movement of the missing hand”, “generating a… sequence of positions of hands”, and “identifying the first movement that matches the first sequence”. Observing body part movements, counting and identifying missing body parts, mapping their positions in sequences, and matching movements to sequences fall within observation, evaluation, and judgment as concepts performed in the human mind under the abstract grouping of Mental Processes1 (MPEP 2106.04(a)(2) III). Accordingly, the claims recite an abstract idea.
Step 2A Prong Two: Independent claims 1, 6, and 11 recite the following additional elements: “computer”, “image frame”, “apparatus”, “processor”, “memory”, “computer program code”, “non-transitory computer readable medium”, and “program”. The additional elements are recited at a high level of generality (i.e., as a generic computer performing functions of identifying movement based on a change in position of an object in an image frame and calculating cycle times as differences between starting and ending movements) such that they amount to no more than mere instructions to apply the exception using generic computer components. Therefore, these functions can be viewed as not meaningfully different from a mathematical algorithm being applied on a general-purpose computer, as tested per MPEP 2106.05(f)(2)(i). The claims are directed to an abstract idea, and the judicial exception does not integrate the abstract idea into a practical application.
Step 2B: Under MPEP 2106.05(f)(1), the examiner considers whether the claim recites only the idea of a solution or outcome; that is, the claims fail to recite the technological details of how the claimed solution to the technological problem is accomplished. Claim limitations that attempt to cover an entrepreneurial, and thus abstract, solution to an entrepreneurial problem, with no technological details on how the technological result is accomplished and no description of the mechanism for accomplishing that result, do not provide significantly more than the judicial exception.
The dependent claims do not recite any further additional elements. Dependent claims 2-5, 7-10, and 12-15 merely incorporate the additional elements recited in claims 1, 6, and 11 while further narrowing the abstract idea of those claims. Specifically, the dependent claims narrow the computer, image frame, apparatus, processor, memory, computer program code, non-transitory computer readable medium, and program to capabilities such as detecting, comparing, imputing, generating, and identifying various forms of data (number of hands, expected number of hands, movement, position, sequence, actions, cycles, etc.), which, when evaluated per MPEP 2106.05(f)(2), represent mere invocation of computers to perform existing processes. Therefore, the additional elements recited in the claimed invention, individually and in combination, fail to integrate the judicial exception into a practical application (Step 2A Prong Two) and, for the same reasons, fail to provide significantly more (Step 2B). Thus, claims 1-15 are patent ineligible.
-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claims 1, 4, 6, 9, 11, and 14 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Radwin, US 9,566,004 B1, hereinafter Radwin1. As per,
Claims 1, 6, 11: Radwin1 teaches:
“A method for measuring productivity executed by a computer, comprising: (claim 1)
“An apparatus for measuring productivity, the apparatus comprising: at least one processor; and at least one memory including computer program code; wherein the at least one memory and the computer program code are configured to, with at least one processor, cause the apparatus to: (claim 6)
“A non-transitory computer readable medium storing a program for measuring productivity, wherein the program causes a computer at least to: (claim 11)
“identifying (claim 1) / identify (claims 6, 11) a first movement based on at least one image frame, wherein the first movement matches a start action which define a cycle of movements; identifying (claim 1) / identify (claims 6, 11) a second movement based on at least one image frame, wherein the second movement matches an end action which define the cycle; and determining (claim 1) / determine (claims 6, 11) a period of time between the identified first movement and the identified second movement to measure productivity” (See Radwin1 Fig. 7 showing repetitive motion analysis of hands with video camera images and related text. Radwin1 col. 4 lines 53-57: the processor component is further configured such that it further comprises an algorithm that operates by identifying and tracing a pixel pattern that best resembles the selected ROI as it changes location in successive video frames. Col. 7 lines 19-25: an algorithm that recognizes and identifies the pattern of repetitive motion of a part of a body, through a process known as cyclic motion analysis, by tracking the motion of a selected region of interest on an image of that body part relative to a selected stationary region from the background of the image; and calculates the activity level of that body part. Col. 7 lines 33-39: …the processor is further configured to compute a motion history curve with positive and negative values that are obtained from motion tracking, wherein the frame numbers that contain the boundary-crossing are determined, and wherein a cycle is defined as the period of time between two positive-to-negative [first/second movement and start/end action], or two negative-to-positive, boundary crossing points. Col. 8 lines 34-38: the frequency of the body parts motion f is determined by the time the body part is moving, as if the task is done at 100% duty cycle. In some embodiments, the frequency f [productivity] is defined as inverse of average active cycle time. [Also see Figs. 
1-3 and related text, and col. 5 lines 13-17]).
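For illustration only (this sketch is not part of the cited disclosure or the claims, and the function names, the sign convention, and the fps parameter are the editor's assumptions), the cyclic-motion analysis Radwin1 describes at col. 7-8 can be approximated as detecting positive-to-negative boundary crossings of a motion history curve, taking a cycle as the time between consecutive such crossings, and taking frequency as the inverse of the average cycle time:

```python
# Illustrative sketch of cycle detection from a motion history curve,
# per the mechanism Radwin1 describes (col. 7 lines 33-39, col. 8 lines 34-38).

def cycle_times(motion_history, fps):
    """Cycle durations (seconds) between positive-to-negative crossings."""
    crossings = [
        i for i in range(1, len(motion_history))
        if motion_history[i - 1] > 0 and motion_history[i] <= 0
    ]
    return [(b - a) / fps for a, b in zip(crossings, crossings[1:])]

def frequency(motion_history, fps):
    """Frequency f as the inverse of the average cycle time."""
    times = cycle_times(motion_history, fps)
    return 1.0 / (sum(times) / len(times)) if times else 0.0
```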
Claims 4, 9, 14: Radwin1 teaches:
“The method according to claim 1, further comprising: (claim 4)
“The apparatus according to claim 6, wherein the at least one memory and the computer program code configured to, with at least one processor, cause the apparatus to: (claim 9)
“The non-transitory computer readable medium according to claim 11, wherein the program causes the computer to: (claim 14)
“generating (claim 4) / generate (claims 9, 14) a first sequence of positions of hands corresponding to the start action; and generating (claim 4) / generate (claims 9, 14) a second sequence of positions of hands corresponding to the end action (claim 4)” (Radwin1 col. 17 lines 58-66: the method used to identify cycles is by detection of crossings of a gating patch. For example, a current block location may be centered at (x, y), and the gating patch location centered at (x, y). The two coordinates [positions] are compared, and the relative horizontal positions of the current block and the reference gating block are determined, e.g., whether the current block is on the right of the gating block (xc > xg), or the current block is on the left of the gating block (xc ≤ xg));
“wherein the identifying of the first movement includes (claims 4, 14) identifying (claims 4, 14) / identify (claim 9) the first movement that matches the first sequence; and the identifying of the second movement includes (claims 4, 14) identifying (claims 4, 14) / identify (claim 9) the second movement that matches the second sequence” (Radwin1 col. 17 line 66 – col. 18 line 8: A cycle is defined as the period of time between two right-to-left (or left-to-right) gating patch crossings [movements]. The crossing points, or frame numbers are recorded as list [sequence] fr = {fro, fr1, fr2,…}. In some embodiments, the processor is further configured to run a cross-correlation based template-matching algorithm to track a motion trajectory of a selected region of interest (ROI) over successive video frames for a single camera to measure repetition frequency, duty cycle and activity level).
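For illustration only (the function names and inputs are the editor's assumptions, not the reference's implementation), the gating-patch crossing test Radwin1 describes at col. 17-18 can be sketched as comparing the tracked block's x-coordinate against a fixed gating x-coordinate, recording crossing frame numbers as a list fr, and defining a cycle as the span between two same-direction crossings (i.e., every other entry in fr):

```python
# Illustrative sketch of gating-patch crossing detection (Radwin1 col. 17-18).

def gating_crossings(xs, x_gate):
    """Record frame numbers where the tracked block changes sides of the gate."""
    fr = []
    for i in range(1, len(xs)):
        was_right = xs[i - 1] > x_gate
        is_right = xs[i] > x_gate
        if was_right != is_right:  # side changed, so a crossing occurred
            fr.append(i)
    return fr

def cycle_frames(fr):
    """A cycle spans two same-direction crossings, i.e., every other entry."""
    return [b - a for a, b in zip(fr, fr[2:])]
```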
-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102 of this title, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 2, 3, 7, 8, 12, 13 are rejected under 35 U.S.C. 103 as being unpatentable over:
Radwin1 in view of
Takata, US 6,256,400 B1, hereinafter Takata, and further in view of
Radwin, US 2020/0279102 A1, hereinafter Radwin2. As per,
Claims 2, 7, 12: Radwin1 teaches all of the limitations of claims 1, 6, 11 above.
Although Radwin1 teaches detecting and tracking body parts and hands in particular, Radwin1 falls short of teaching the identification of the number of hands, whether one is missing in a particular frame, and how to substitute frame data for missing hands.
However, Takata in analogous art of biometric time motion analysis teaches or suggests “The method according to claim 1, further comprising:
“detecting the number of hands in the image frame” (Takata col. 26 lines 11-20: In FIG. 17, the hand region information includes the number of hands 1701, barycentric coordinates of the first hand 1702, an area of the first hand 1703, barycentric coordinates of the second hand 1704, and an area of the second hand 1705. The body feature extraction part 302 first sets the number of the extracted hands in the number of hands 1701, and then sets the barycentric coordinates of hand(s) and the area of hand(s) according to the number of the extracted hands in the following manner.);
“comparing the detected number of hands with an expected number of hands in the frame to detect at least one missing hand in the frame” (Takata col. 26 lines 21-32: When the number of extracted hands 1701 is 0, the barycentric coordinates of the first hand 1702 and the barycentric coordinates of the second hand 1704 are both set to (0, 0), and the area of the first hand 1703 and the area of the second hand 1705 are both set to 0. When the number of extracted hands 1701 is "1", the barycentric coordinates and the area of the hand region are calculated so as to set the calculations respectively in the barycentric coordinates of the first hand 1702 and the area of the first hand 1703. Thereafter, the barycentric coordinates of the second hand 1704 is set to (0, 0), and the area of the second hand 1705 is set to 0);
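For illustration only (the record structure, the EXPECTED_HANDS constant, and the function names are the editor's assumptions), the hand-region record Takata describes at col. 26, a hand count with per-hand barycenter and area fields zeroed for absent hands, can be sketched as:

```python
# Illustrative sketch of hand counting and missing-hand detection in the
# spirit of Takata's hand region information (col. 26 lines 11-32).

EXPECTED_HANDS = 2  # assumed expectation for a two-handed task

def hand_record(detected):
    """detected: list of (barycenter, area) tuples for extracted hands."""
    record = {"count": len(detected),
              "hands": [((0, 0), 0)] * EXPECTED_HANDS}  # zeroed defaults
    for i, hand in enumerate(detected[:EXPECTED_HANDS]):
        record["hands"][i] = hand
    return record

def missing_hands(record):
    """Compare the detected count against the expected count."""
    return EXPECTED_HANDS - record["count"]
```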
Takata and Radwin1 are analogous art in the field of biometric time-motion analysis. It would have been obvious to one skilled in the art, before the effective filing date of the invention, to have modified Radwin1's repetitive motion measuring system and method to include Takata's teachings on detecting the number of hands in the video image frames. The benefit of these additional features would have been more streamlined, efficient, and accurate detection of the motion of specific body parts (Takata col. 1 lines 28-41). The predictability of such modifications and/or variations would have been corroborated by the broad level of skill of one of ordinary skill in the art, as articulated by Radwin1 in view of Takata (see MPEP 2143 G).
Further, the claimed invention could also have been viewed as a mere combination of old elements in a similar field of endeavor, biometric time-motion analysis. In such a combination, each element would merely have performed the same organizational and managerial function as it did separately. Thus, one of ordinary skill in the art would have recognized that, given the existing technical ability to combine the elements, as evidenced by Radwin1 in view of Takata above, the to-be-combined elements would have fit together like pieces of a puzzle in a logical, complementary, technologically feasible, and/or economically desirable manner. It would therefore have been reasoned that the results of the combination were predictable (see MPEP 2143 A).
Furthermore, Radwin2 in analogous art of biometric time motion analysis teaches or suggests “performing an imputation of a movement of the missing hand in the frame in response of detecting the missing hand in the frame” (Radwin2 mid-¶ [0136]: …In the case depicted in FIG. 16, the confidence level in the cross-correlation may be determined by comparing the result of the cross-correlation to an expected cross-correlation indicating the feet of the subject 2 as represented by the silhouette 40 have disappeared from the foreground in the current frame, but other arrangements for determining a confidence level are contemplated. When the determined confidence level has gone beyond (e.g., is greater than) the confidence level threshold CrH, the region of interest 58 and/or a portion of the silhouette 40 within the region of interest 58 of the segmented previous frame may be substituted 326 [imputed] into the segmented current frame and the final segmented frame may be obtained 328. When the determined confidence level has not gone beyond (e.g., is equal to or less than) the confidence level threshold CrH, the region of interest 58 in the segmented current frame may be the obtained 328 final segmented frame. [Also see Fig. 16 and related text]).
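For illustration only (the scalar confidence input, the threshold parameter, and the function name are the editor's assumptions), the threshold-gated substitution Radwin2 describes in ¶ [0136], imputing the previous frame's region of interest when the confidence that the body part has left the frame exceeds a threshold, reduces to a simple branch:

```python
# Illustrative sketch of confidence-thresholded substitution per Radwin2 ¶ [0136].

def segment_with_substitution(current_roi, previous_roi, confidence, threshold):
    """Return the region of interest to use for the final segmented frame."""
    if confidence > threshold:  # part judged missing: impute from prior frame
        return previous_roi
    return current_roi
```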
Radwin2, Takata, and Radwin1 are analogous art in the field of biometric time-motion analysis. It would have been obvious to one skilled in the art, before the effective filing date of the invention, to have modified the Radwin1/Takata motion measuring system and method to include Radwin2's teachings on substituting data for a missing body part in a frame. The benefit of these additional features would have been reduced algorithmic complexity and computing power necessary for accurate motion tracking (Radwin2 ¶ [0060]). The predictability of such modifications and/or variations would have been corroborated by the broad level of skill of one of ordinary skill in the art, as articulated by Radwin1 in view of Takata and Radwin2 (see MPEP 2143 G).
Further, the claimed invention could also have been viewed as a mere combination of old elements in a similar field of endeavor, biometric time-motion analysis. In such a combination, each element would merely have performed the same organizational and managerial function as it did separately. Thus, one of ordinary skill in the art would have recognized that, given the existing technical ability to combine the elements, as evidenced by Radwin1 in view of Takata and Radwin2 above, the to-be-combined elements would have fit together like pieces of a puzzle in a logical, complementary, technologically feasible, and/or economically desirable manner. It would therefore have been reasoned that the results of the combination were predictable (see MPEP 2143 A).
Claims 3, 8, 13: Radwin1 / Takata / Radwin2 teaches all of the limitations of claims 2, 7, 12 above.
Radwin1 / Takata in combination does not specifically teach using an average historical position of a body part to substitute for a missing body part.
However, Radwin2 in analogous art of biometric time motion analysis teaches or suggests “The method according to claim 2, wherein the imputation is performed by using an average historical position of a hand corresponding to the missing hand in a missing period of the missing hand to fill an absent position of the missing hand” (Radwin2 mid-¶ [0134]: To compare a current frame appearance in the region of interest 58 with a previous frame appearance within the region of interest 58 (e.g., where the previous frame appearance within the region of interest 58 may be from a frame immediate before the current frame, may be from a frame at X-number of frames before the current frame, may be an average of previous frame appearances [historical positions] within the region of interest 58 over X-number of frames, a rolling average of previous frame appearances within the region of interest 58 over X-number of frames, or other suitable previous frame appearance within the region of interest 58), a pixel-by-pixel intensity cross-correlation of the region of interest 58 of the current frame and of the region of interest 58 of the previous frame may be utilized… If the confidence value of cross-correlation has not reached (e.g., is lower than, as depicted in FIG. 15, or is greater than) the pre-set confidence threshold, then the feet portion of the silhouette 40 in the region of interest 58 for the current frame may be utilized [filled for absent position] as the feet of the silhouette 40 in the current frame. This happens when the feet of the subject 2 are moving and motion information is considered).
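For illustration only (the window parameter, the None convention for absent positions, and the function name are the editor's assumptions), filling an absent position of a missing hand with the average of its historical positions over a recent window, analogous to the averaged previous-frame appearance Radwin2 describes in ¶ [0134], can be sketched as:

```python
# Illustrative sketch: impute a missing hand's position from the average of
# its recent known positions, in the spirit of Radwin2 ¶ [0134].

def impute_position(history, window=5):
    """Average the last `window` known (x, y) positions; None if no history."""
    recent = [p for p in history[-window:] if p is not None]
    if not recent:
        return None
    n = len(recent)
    return (sum(x for x, _ in recent) / n, sum(y for _, y in recent) / n)
```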
Rationales to have modified / combined Radwin1 / Takata / Radwin2 are above and reincorporated.
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Claims 5, 10, 15 are rejected under 35 U.S.C. 103 as being unpatentable over:
Radwin1 in view of
Radwin, US 2020/0279102 A1, hereinafter Radwin2. As per,
Claims 5, 10, 15: Radwin1 teaches all of the limitations of claims 4, 9, 14 above.
Although Radwin1 teaches generating a first and second sequence of positions of hands, Radwin1 does not explicitly teach using an average of a plurality of positions to generate the sequences.
However, Radwin2 in analogous art of biometric time motion analysis teaches or suggests “The method according to claim 4, wherein:
“the generating of the first sequence includes (claims 5, 15) averaging (claims 5, 15) / average (claim 10) a plurality of sequences of positions of the hands corresponding to start actions of the cycle of the movements to generate the first sequence; and the generating of the second sequence includes (claims 5, 15) averaging (claims 5, 15) / average (claim 10) a plurality of sequences of positions of the hands corresponding to end actions of the cycle of the movements to generate the second sequence” (Radwin2 mid-¶ [0156]: When video captures a task being repeated by the subject, an average of the determined asymmetry angle [position] of the subject at the beginning of the task and/or an average of the determined asymmetry angle [position] of the subject at the end of the task may be used in the NIOSH lifting equation to determine the RWL or LI for the task).
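For illustration only (the equal-length assumption and the function name are the editor's, not the claims' or the references'), generating a template sequence by averaging a plurality of observed sequences of hand positions, element-wise across sequences, can be sketched as:

```python
# Illustrative sketch: element-wise average of several equal-length
# sequences of (x, y) hand positions to form a single template sequence.

def average_sequence(sequences):
    """Average corresponding positions across equal-length sequences."""
    n = len(sequences)
    return [
        (sum(p[0] for p in frame) / n, sum(p[1] for p in frame) / n)
        for frame in zip(*sequences)  # frame: same time step in each sequence
    ]
```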
Rationales to have modified / combined Radwin1 / Radwin2 are above and reincorporated.
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Conclusion
The following art is made of record and considered pertinent to Applicant’s disclosure:
Antonucci US 20060204045 A1, System and method for motion performance improvement.
Bhattacharya et al. A method for real-time generation of augmented reality work instructions via expert movements. Proc. SPIE 9392, The Engineering Reality of Virtual Reality 2015, 93920G (17 March 2015); DOI: 10.1117/12.2081214
Büttner et al. Exploring Design Opportunities for Intelligent Worker Assistance: A New Approach Using Projection-Based AR and a Novel Hand-Tracking Algorithm. AmI 2017, LNCS 10217, pp. 33-45, 2017. Springer International Publishing AG 2017. DOI: 10.1007/978-3-319-56997-0_3
Brikhofer et al. US 20240381881 A1, System and method for smart manufacturing.
Cutler US 20040228503 A1, Video-based gait recognition.
Mayer et al. Using In-Situ Projection to Support Cognitively Impaired Workers at the Workplace. ASSETS '15, October 26-28, 2015, Lisbon, Portugal. DOI: 10.1145/2700648.2809853
Müller et al. Motion tracking applied in assembly for worker training in different locations. Procedia CIRP 48 (2016) 460-465. DOI: 10.1016/j.procir.2016.04.117
Wagner et al. US 20160262685 A1, Motion analysis system and methods of use thereof.
Wang et al. CN 113592898 A, Method for reconstructing missing mark in motion capture.
Wang US 20190325207 A1, Method for human motion analysis, apparatus for human motion analysis, device and storage medium.
Winold et al. US 20220035443 A1, Systems and methods for generating complementary data for visual display.
Xue et al. "Multimodal Human Hand Motion Sensing and Analysis—A Review," in IEEE Transactions on Cognitive and Developmental Systems, vol. 11, no. 2, pp. 162-175, June 2019. DOI: 10.1109/TCDS.2018.2800167
Any inquiry concerning this communication or earlier communications from the examiner should be directed to REED M. BOND whose telephone number is (571) 270-0585. The examiner can normally be reached Monday - Friday 8:00 am - 5:00 pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Patricia Munson can be reached at (571) 270-5396. The fax phone number for the organization where this application or proceeding is assigned is (571) 273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/REED M. BOND/Examiner, Art Unit 3624 April 6, 2025
/PATRICIA H MUNSON/Supervisory Patent Examiner, Art Unit 3624
1 MPEP 2106.04(a): “examiners should identify at least one abstract idea grouping, but preferably identify all groupings to the extent possible”.