Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Amendment
Applicant’s response to the last Office action, filed January 19, 2026, has been entered and made of record. Claim 1 has been amended; claims 7-12 have been newly added. By this amendment, claims 1-12 are now pending in this application.
Response to Arguments
Applicant’s arguments with respect to claims 1-12 have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-3 and 7-9 are rejected under 35 U.S.C. 103 as being unpatentable over Arslan et al., (“Semantic trajectory insights for worker safety in dynamic environments”, Automation in Construction 106 (2019) 102854, ScienceDirect, pp. 1-20), in view of Nakao et al., (US-PGPUB 2023/0041612); and further in view of Deng et al., (US-PGPUB 2021/0312321).
In regards to claim 1, Arslan discloses a management system, (see at least:
Page 7, Fig. 4), configured to identify a management item to be managed in a space through which a plurality of workers move, (see at least: Page 15, section 5, identifying the exact locations where the unsafe movements are occurring in real-time during construction and facility management processes), and to report the management item to a manager that manages a situation in the space, (see at least: Abstract, and Page 15, section 5, visualizing the worker movements in conjunction with the locations of the risky places), the management system comprising:
a tracking unit configured to detect a position of workers to be used to identify a behavior of each of the plurality of workers, and to track a change in the position, (see at least: Page 18, right-hand-column, one of the methods for tracking the activities of the workers working in the restricted zone is to study their movements from their spatio-temporal trajectories. The developed system has the ability to hold the changes in the purpose of the building or site locations as well as the positions of the workers over time for enabling the process of behavior extraction which can minimize safety risks, [i.e., detecting a position of workers, “implicitly tracking or detecting positions of the workers over time”, to identify a behavior of each of the plurality of workers, “enabling the process of behavior extraction”]). Further, Page 14, right-hand-column, for further tracking the unsafe movements, the corresponding speed and turning angle values associated with such movements can be traced from the time series plots, [i.e., implicitly tracking a change in the position based on tracing speed and turning angle values associated with such movements from the time series plots]);
a storage unit configured to store behavior information, (see at least: using the Viterbi algorithm; and storing trajectory data for future use, [i.e., storing the change in position of the worker as behavior information for learning data to be used by machine learning, “implicit by storing the trajectory data, for future use [as training data], which the trajectory data implicitly encompass the movement behavior of the workers”]);
an identifying unit configured to judge whether or not the change in position of the feature point tracked by the tracking unit is due to irregular behavior of a worker, based on the behavior information stored by the storage unit, and to identify the management item, (see at least: Page 10, right-hand-column, last paragraph, using this method, four hidden states are defined using different values of ‘step length’ and ‘turning angle’ are shown in Table 4. Further, Page 14, left-hand-column, last paragraph through right-hand-column, 1st paragraph, to show a working of developed system … the fourth state in Fig. 19. Further, Page 14, right-hand-column, last paragraph, our BIM-based movement visualizations can help the H&S managers in identifying the workers who are not complying the safety regulations, [i.e., judge whether or not the change in position of the feature point tracked by the tracking unit is due to irregular behavior of a worker, “an entire set of trajectory points belonging to an ROI is further analyzed using the HMMs for identifying unsafe user movements”, based on the behavior information stored by the storage unit, and to identify the management item, “BIM-based movement can help identifying the workers who are not complying the safety regulations”]); and
a reporting unit configured to display the management item identified by the identifying unit on a map representing the space, so as to report the management item to the manager, (see at least: Page 15, section 5, Visualizing the output of the HMMs using the BIM software after performing computations on the semantic trajectories will enable the system users (H&S managers and building supervisors) in identifying the exact locations where the unsafe movements are occurring in real-time during construction and facility management processes. Further, Page 14, last paragraph, our BIM-based movement visualizations can help the H&S managers in identifying the workers who are not complying the safety regulations, [i.e., displaying the management item identified by the identifying unit on a map representing the space, “Visualizing the output of the HMMs using the BIM software”, so as to report the management item to the manager, “our BIM-based movement visualizations can help the H&S managers in identifying the workers who are not complying the safety regulations”]).
Arslan does not expressly disclose analyzing captured image information including each of the plurality of workers, thereby detecting a position of a feature point to be used to identify a behavior of each of the plurality of workers; a storage unit configured to convert the change in position of the feature point tracked by the tracking unit into behavior of each of the plurality of workers, and to store it as behavior information for learning data to be used by machine learning; or an identifying unit configured to identify the irregular behavior as the management item.
However, Nakao discloses analyzing captured image information including each of the plurality of workers, thereby detecting a position of each of the plurality of workers to be used to identify a behavior of each of the plurality of workers, (see at least: Par. 0045-0047, detecting unit 12 analyzes the image data and detects an unsafe condition of the worker including an unsafe behavior of the worker, … and the detecting unit 12 calculates the position information about the worker, where the position information about the worker or the like who is the detection target of the unsafe condition may be set using the name of the work area or the process name, [i.e., analyzing captured image information including each of the plurality of workers, “detecting unit 12 analyzes the image data”, thereby detecting a position of each of the plurality of workers, “implicit by calculating the position information about the worker”, to be used to identify a behavior of each of the plurality of workers, “the position information about the worker is the detection target of the unsafe condition including an unsafe behavior of the worker”]); and
an identifying unit configured to identify the irregular behavior as the management item, (see at least: Par. 0045, detecting unit 12 detects an unsafe condition of the worker including an unsafe behavior of the worker, [i.e., identifying the irregular behavior, “the unsafe behavior of the worker”, as the management item, “the detected unsafe condition”]).
Arslan and Nakao are combinable because they are both concerned with object tracking. Therefore, it would have been obvious to a person of ordinary skill in the art to modify Arslan to use the detecting unit 12, as taught by Nakao, in order to detect an unsafe behavior of the worker, (Par. 0045), based on comparing the feature of the detected motion with the reference data set for each unsafe condition, (Nakao, Par. 0047).
The combined teaching of Arslan and Nakao as a whole does not expressly disclose detecting and using a position of a feature point to identify a behavior of each of the plurality of workers; and a storage unit configured to convert the change in position of the feature point tracked by the tracking unit into behavior of each of the plurality of workers, and to store it as behavior information for learning data to be used by machine learning.
Deng discloses detecting a position of a feature point to be used to identify a behavior of each of the plurality of workers, (see at least: Par. 0051-0053, each frame is processed at block 54 using a two-dimensional (2D) convolutional neural network (CNN) that has been trained to identify key points in frames, where each key point corresponds to a position of a pixel in a respective frame, [i.e., detecting a position of a feature point, “implicit by identifying key points in frames”]). Further, Par. 0056-0057, encoding the position of each key point into an encoded representation 58, and processing the plurality of encoded representations 58 at block 60 to identify human behaviors of a human detected in the sequence of frames 101 based on the encoded representation, using a 3D CNN, which 3D CNN has been previously trained with a large amount of training data, [i.e., identify a behavior of each of the plurality of workers, “implicit by identifying human behavior of a human”, using the position of the identified key points, “based on encoding the position of each key point into an encoded representation 58”]); and
a storage unit configured to convert the change in position of the feature point tracked by the tracking unit into behavior of each of the plurality of workers, and to store it as behavior information for learning data to be used by machine learning, (see at least: Par. 0048, an encoding of human body position and movement relies on key points of the human body (hereinafter referred to as key points); which the one of more encodings of human body position and movement may be provided to a 3D CNN; and from Par. 0071, a cache or queue is used to store a predetermined number of encoded representations for use as inputs to the human behavior classifier 211, [i.e., a storage unit, “cache or queue”, is configured to convert the change in position of the feature point tracked by the tracking unit into behavior of each of the plurality of workers, “implicit by the predetermined number of encoded representations, which includes the keypoints based behavior movement information, as it is uses as input to 3D CNN”, and to store it as behavior information for learning data to be used by machine learning, “storing the predetermined number of encoded representations for use as inputs to the human behavior classifier 211 or 3D CNN”]).
Arslan, Nakao, and Deng are combinable because they are all concerned with object tracking. Therefore, it would have been obvious to a person of ordinary skill in the art to modify the combined teaching of Arslan and Nakao to identify the joints of a human body as key points, as taught by Deng, in order to efficiently and accurately identify the behavior of a human body by tracking the location and movement of the joints, (Deng, Par. 0020).
In regards to claim 2, the combined teaching of Arslan, Nakao, and Deng as a whole discloses the limitations of claim 1.
Deng further discloses wherein personal information of each of the workers is associated with the feature points, (Deng, see at least: Par. 0064-0065, assigning a unique identifier (ID) to the bounding box 204, and outputting the generated bounding box together with the assigned unique ID. The unique ID is an identifier that uniquely identifies the human body detected in the respective frame, and the key points identifier 206 assigns a unique identifier to each identified key point, [i.e., associating personal information of each of the workers with the feature points, “assigning a unique identifier to each identified key point”]).
In regards to claim 3, the combined teaching of Arslan, Nakao, and Deng as a whole discloses the limitations of claim 1.
Arslan further discloses wherein the manager includes both a site manager who is in the space where the plurality of workers are, and a remote manager who is away from the space, (Arslan, see at least: Fig. 4, where the building supervisor corresponds to the site manager; and health and safety manager corresponds to remote manager who is away from the space), and wherein the reporting unit shows the management item to both the site manager and the remote manager so as to report to them, (see at least: Fig. 4, “Messages will be sent to building supervisors in case of abnormal behavior observed in trajectory data”, and “visualizing movement behaviors to the health and safety manager”).
In regards to claim 7, the combined teaching of Arslan, Nakao, and Deng as a whole discloses the limitations of claim 1.
Furthermore, Deng discloses wherein the feature point corresponds to a part of a body of each of the plurality of workers, (see at least: Par. 0020, each key point position corresponds to a joint of the human body).
In regards to claim 8, the combined teaching of Arslan, Nakao, and Deng as a whole discloses the limitations of claim 1.
Furthermore, Deng discloses wherein the change in position of the feature point is a relative positional change of the feature point, (see at least: Par. 0020, implicit by tracking the location and movement of the joints based on identification of joints of a human body as key points in a frame).
In regards to claim 9, the combined teaching of Arslan, Nakao, and Deng as a whole discloses the limitations of claim 8.
Furthermore, Deng discloses wherein the change in position of the feature point is a relative positional change of the feature point, (see at least: Par. 0020, implicit by tracking the location and movement of the joints).
Allowable Subject Matter
Claims 4-6 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.
With respect to claim 4, the prior art of record, alone or in reasonable combination, does not teach or suggest the following underlined limitation(s), (in consideration of the claim as a whole):
“wherein the management item to be managed is tagged under agreement by the site manager and the remote manager, and the tag thus applied can be deleted under agreement by the site manager and the remote manager …”
The non-patent literature to Arslan et al., (“Semantic trajectory insights for worker safety in dynamic environments”, Automation in Construction 106 (2019) 102854, ScienceDirect, pp. 1-20), discloses a management system, (see at least: Page 7, Fig. 4), configured to identify a management item to be managed in a space through which a plurality of workers move, (see at least: Page 15, section 5, identifying the exact locations where the unsafe movements are occurring in real-time during construction and facility management processes), and to report the management item to a manager that manages a situation in the space, (see at least: Abstract, and Page 15, section 5, visualizing the worker movements in conjunction with the locations of the risky places), the management system comprising:
a tracking unit configured to detect a position of workers to identify a behavior of each of the plurality of workers, and to track a change in the position, (see at least: Page 18, right-hand-column, one of the methods for tracking the activities of the workers working in the restricted zone is to study their movements from their spatio-temporal trajectories. The developed system has the ability to hold the changes in the purpose of the building or site locations as well as the positions of the workers over time for enabling the process of behavior extraction which can minimize safety risks, [i.e., detecting a position of workers, “implicitly tracking or detecting positions of the workers over time”, to identify a behavior of each of the plurality of workers, “enabling the process of behavior extraction”]). Further, Page 14, right-hand-column, for further tracking the unsafe movements, the corresponding speed and turning angle values associated with such movements can be traced from the time series plots, [i.e., implicitly tracking a change in the position based on tracing speed and turning angle values associated with such movements from the time series plots]);
a storage unit configured to store behavior information, (see at least: using the Viterbi algorithm; and storing trajectory data for future use, [i.e., storing the change in position of the worker as behavior information for learning data to be used by machine learning, “implicit by storing the trajectory data, for future use [as training data], which the trajectory data implicitly encompass the movement behavior of the workers”]);
an identifying unit configured to judge whether or not the change in position of the feature point tracked by the tracking unit is due to irregular behavior of a worker, based on the behavior information stored by the storage unit, and to identify the management item, (see at least: Page 10, right-hand-column, last paragraph, using this method, four hidden states are defined using different values of ‘step length’ and ‘turning angle’ are shown in Table 4. Further, Page 14, left-hand-column, last paragraph through right-hand-column, 1st paragraph, to show a working of developed system … the fourth state in Fig. 19. Further, Page 14, right-hand-column, last paragraph, our BIM-based movement visualizations can help the H&S managers in identifying the workers who are not complying the safety regulations, [i.e., judge whether or not the change in position of the feature point tracked by the tracking unit is due to irregular behavior of a worker, “an entire set of trajectory points belonging to an ROI is further analyzed using the HMMs for identifying unsafe user movements”, based on the behavior information stored by the storage unit, and to identify the management item, “BIM-based movement can help identifying the workers who are not complying the safety regulations”]); and
a reporting unit configured to display the management item identified by the identifying unit on a map representing the space, so as to report the management item to the manager, (see at least: Page 15, section 5, Visualizing the output of the HMMs using the BIM software after performing computations on the semantic trajectories will enable the system users (H&S managers and building supervisors) in identifying the exact locations where the unsafe movements are occurring in real-time during construction and facility management processes. Further, Page 14, last paragraph, our BIM-based movement visualizations can help the H&S managers in identifying the workers who are not complying the safety regulations, [i.e., displaying the management item identified by the identifying unit on a map representing the space, “Visualizing the output of the HMMs using the BIM software”, so as to report the management item to the manager, “our BIM-based movement visualizations can help the H&S managers in identifying the workers who are not complying the safety regulations”]).
Arslan further discloses a site manager in the space where the plurality of workers are, and a remote manager away from the space, (Arslan, see at least: Fig. 4, where the building supervisor corresponds to the site manager; and health and safety manager corresponds to remote manager who is away from the space), and wherein the reporting unit shows the management item to both the site manager and the remote manager so as to report to them, (see at least: Fig. 4, “Messages will be sent to building supervisors in case of abnormal behavior observed in trajectory data”, and “visualizing movement behaviors to the health and safety manager”).
However, while disclosing the building supervisor (site manager) and the health and safety manager (remote manager), Arslan fails to teach or suggest, either alone or in combination with the other cited references, wherein the management item to be managed is tagged under agreement by the site manager and the remote manager, and the tag thus applied can be deleted under agreement by the site manager and the remote manager.
A further prior art of record, Ammari et al., (“Remote interactive collaboration in facilities management using BIM-based mixed reality”, Automation in Construction 107 (2019) 102940, ScienceDirect, pp. 2-19), discloses wherein the management system further comprises a sharing unit that allows the site manager and the remote manager to share management index information indicating a management index expressed by a number of tags thus applied, a number of tags thus deleted, or an increase/decrease in the tags, (see at least: Page 2, left-hand-column, capture and record location-based building element data (e.g., defects) and visually share them in real time with the remote office; and Page 5, section 3.2.2 through Page 6, left-hand-column, the field worker reaches the assigned place and finds the task tag in the AR view, he/she marks the defect on the element through the AR interface, and streams the image of the defect element and the location of the defect mark to the manager via VE, [i.e., allowing the site manager and the remote manager, “both manager and office”, to share management index information indicating a management index expressed by a number of tags, “the image of the defect element and the location of the defect mark expressed by the task tag”]).
However, Ammari et al. fails to teach or suggest, either alone or in combination with the other cited references, wherein the management item to be managed is tagged under agreement by the site manager and the remote manager, and the tag thus applied can be deleted under agreement by the site manager and the remote manager.
Claims 5 and 6 are in condition for allowance in view of their dependency from claim 4.
The following is a statement of reasons for the indication of allowable subject matter:
-- Claim 10 is allowable over the prior art of record.
-- Claims 11-12 are allowable in view of their dependency from claim 10.
With respect to claim 10, the prior art of record, alone or in reasonable combination, does not teach or suggest the following underlined limitation(s), (in consideration of the claim as a whole):
“wherein the management item to be managed is tagged under agreement by the site manager and the remote manager, and the tag thus applied can be deleted under agreement by the site manager and the remote manager …”
Claim 10 recites substantially similar limitations as set forth in claim 4. As such, claim 10 is in condition for allowance for at least similar reasons, as stated above, and the prior art cited above with respect to claims 4-6 applies also to claims 10-12.
Conclusion
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Contact Information
Any inquiry concerning this communication or earlier communications from the examiner should be directed to AMARA ABDI whose telephone number is (571)272-0273. The examiner can normally be reached 9:00am-5:30pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Vu Le can be reached at (571) 272-7332. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/AMARA ABDI/Primary Examiner, Art Unit 2668 02/19/2026