DETAIL OFFICE ACTIONS
The United States Patent & Trademark Office acknowledges the response to the current application filed on 10/23/2025. The Office has reviewed the submitted documents and provides the following comments below.
Amendment
Applicant submitted amendments on 10/23/2025. The Examiner acknowledges the amendment and has reviewed the claims accordingly.
Applicant Arguments:
Applicant states that the claims were amended to overcome the rejection under 35 U.S.C. 101; therefore, the rejection should be withdrawn.
Applicant states that the cited prior art does not teach the amended claims, specifically the limitation “transmitting, via a wireless communication interface, the tool-condition information to a networked maintenance server configured to initiate a maintenance or replacement action for the tool; and providing, based on confirmation data received from the maintenance server, a visual or haptic notification indicating that the maintenance or replacement action has been initiated”; therefore, the rejection under 35 U.S.C. 103 should be withdrawn.
Examiner’s Responses:
Applicant’s arguments and amendments, see Remarks, filed 10/23/2025, with respect to the rejection(s) of claim(s) 1-20 under 35 U.S.C. 101 have been fully considered and are persuasive. Therefore, the rejection has been withdrawn.
Applicant’s arguments and amendments, see Remarks, filed 10/23/2025, with respect to the rejection(s) of claim(s) 1-20 under 35 U.S.C. 103 have been fully considered and are persuasive. Therefore, the rejection has been withdrawn. However, upon further consideration of the amendments, a new ground(s) of rejection is made in view of Apo in view of Sato in view of Hass in view of Jadhav.
Claim Rejections - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claims 11-18 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention. The Examiner strongly suggests that appropriate corrections be made to clarify the claim scope.
With respect to Claim 11, the claim recites the following, each of which renders the claim indefinite:
“the location” on line 13 (unclear antecedent basis).
Claims 12-18 are rejected for the same reason due to their dependence on rejected Claim 11.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claim(s) 1, 5-6, 8-10 and 19-20 is/are rejected under 35 U.S.C. 103 as being unpatentable over
Apostolopoulos et al. (Apostolopoulos, Ioannis D. et al. "Industrial object, machine part and defect recognition towards fully automated industrial monitoring employing deep learning. The case of multilevel VGG19." arXiv (2020), hereinafter Apo) in view of
Sato et al. (US-20200065585-A1, hereinafter Sato) in view of
Hasselbusch et al. (US-20170091924-A1, hereinafter Hass) in view of
Jadhav et al. (US-20230090297-A1, filed 2021, hereinafter Jadhav)
CLAIM 1
In regards to Claim 1, Apo teaches a processor executing one or more instructions stored on memory (Apo, page 6, first paragraph: “An Intel Core i5-9400F CPU at 2.90GHz computer equipped with 6Gb RAM and a GeForce RTX 2060 Super”) to perform:
receiving an image (Apo, page 4-5, section 2.1: “extracting new features from the input data distributions (i.e., images)”; see input image in FIG. 1);
detecting a tool within the image (Apo, page 3, second paragraph: “… object recognition is a computer vision technique used to recognize and find objects within an image or video. Specifically, object detection draws bounding boxes around these detected objects to identify where the objects are in (or how they pass through) a particular scene. Object recognition consists of recognizing, identifying, and locating objects within an image with a given amount of confidence…”; page 8, Object Recognition in Industry dataset (Tech dataset). Apo teaches a machine learning model that can recognize/detect industrial equipment);
identifying whether the tool within the image is damaged (Apo, page 3-4, fourth paragraph of page 3: “automated systems can be built by object recognition to recognize defective parts or tools for immediate replacement and also to detect individual parts that require repair and replacement during the manufacturing process”; section 2.2 Image datasets for industrial object recognition and defect detection. Apo teaches a machine learning model that can detect defective equipment (with anomalies, holes, cracks, …));
Apo does not explicitly disclose a wearable device comprising: at least one camera;
Sato is in the same field of art of vision-based industrial inspection system. Further, Sato teaches a wearable device (Sato, ¶ [0036]: “the inspection assistance device is equipped on a glasses type wearable terminal”) comprising: at least one camera (Sato, ¶ [0036-0038]: “The image data acquisition unit 11 acquires image data in which an inspection target 3 is captured”).
Therefore, it would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to modify the invention of Apo by incorporating the wearable device taught by Sato, to enable hands-free, real-time tool inspection at the point of use, addressing the well-known problem of requiring workers to transport tools to fixed inspection stations for defect detection. Both references are in the same field of vision-based industrial inspection using compatible technologies, and one of ordinary skill in the art would have had a reasonable expectation of success combining them using well-established integration techniques to achieve Sato’s goal of “enhancing efficiency of inspection work” (Sato, ¶ [0014]: “an object of the present disclosure is to provide a technology of enhancing efficiency of inspection work.”).
The combination of Apo and Sato then teaches a wearable device (Sato, ¶ [0036]: “the inspection assistance device is equipped on a glasses type wearable terminal”) for managing tools in a manufacturing environment. (Apo, page 15, section 5 Conclusion: “a novel modification proposal for the successful network architecture called VGG is proposed and evaluated for defect object and industrial object recognition tasks”; page 3, fourth paragraph: “… automated systems can be built by object recognition to recognize defective parts or tools for immediate replacement and also to detect individual parts that require repair and replacement during the manufacturing process”) (The Examiner notes when combined, Apo in view of Sato teaches a wearable device that can recognize industrial parts and monitor part health by detecting damaged or worn parts)
The combination of Apo and Sato does not explicitly disclose when the tool has been identified as damaged, automatically generating tool-condition information including tool-identifying data; transmitting, via a wireless communication interface, the tool-condition information to a networked maintenance server configured to initiate a maintenance or replacement action for the tool;
Hass is in the same field of art of vision-based part wear monitoring system. Further, Hass teaches when the tool has been identified as damaged, automatically generating tool-condition information including tool-identifying data (Hass, ¶ [0059-0060]: “processor may determine the degree of wear of the wear part from the digital image … GUI display may further include a wear indicator interface element that indicates the degree of wear of the wear part, as determined from the digital image. In one embodiment, wear indicator interface element may be a meter, gauge, graph, or other graphic that processor animates to convey to the user the degree of wear of the wear part. For example, wear indicator interface element may indicate the degree of wear as a percentage (0-100%), a color code (e.g., green-red), or a scale (e.g., 1-10)”); transmitting, via a wireless communication interface (Hass, ¶ [0033]: “a wireless network communication interface”), the tool-condition information to a networked maintenance server (Hass, ¶ [0022]: “environment may additionally include a parts image processing system, an application store, and/or a dealer system. Elements of environment may be connected to an electronic communication network over which they may communicate with one another”, see FIG. 1. The part wear determining system is connected with the dealer system over an electronic network) configured to initiate a maintenance or (***The Examiner notes since a listing with “or” is disjunctive, any one of the elements found in the prior art is sufficient to reject the claim. While citations have been provided for completeness and rapid prosecution, only one element is required.) replacement action for the tool (Hass, ¶ [0060]: “In one embodiment, if processor determines that the degree of wear is above a threshold (e.g., 70%), processor may send a notification (e.g., a text message or e-mail) to dealer system. The notification may contain, for example, the identity of user, machine, the wear part, the degree of wear, and/or other information that apprises the dealer of the situation so that the dealer can take further action, if warranted”. Hass discloses sending part wear information to a dealer system so the part can be serviced/replaced);
Therefore, it would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to modify the invention of Apo and Sato by incorporating the networked system to monitor and service industrial equipment that is taught by Hass, to make a system that can monitor and automatically request maintenance/repair for damaged equipment; thus, one of ordinary skill in the art would be motivated to combine the references since, among its several aspects, the present invention recognizes there is a need to automatically notify maintenance personnel of upcoming maintenance/service of industrial equipment to ensure uninterrupted operation (Hass, ¶ [0030]: “the dealer may desire to know when a wear part of machine has become sufficiently worn so that it can inspect or service machine and potentially sell replacement parts or services to user”).
The combination of Apo, Sato and Hass does not explicitly disclose providing, based on confirmation data received from the maintenance server, a visual or haptic notification indicating that the maintenance or replacement action has been initiated.
Jadhav is in the same field of art of system to manage maintenance service for industrial equipment. Further, Jadhav teaches providing, based on confirmation data received from the maintenance server, a visual or (***The Examiner notes since a listing with “or” is disjunctive, any one of the elements found in the prior art is sufficient to reject the claim. While citations have been provided for completeness and rapid prosecution, only one element is required.) haptic notification (Jadhav, ¶ [0043]: “User interface component can be configured to receive user input and to render output to the user in any suitable format (e.g., visual, audio, tactile, etc.)”) indicating that the maintenance or replacement action has been initiated. (Jadhav, ¶ [0050]: “Work order interface component can be configured to communicatively interface with a work order management system and to send an instruction to open a new work order in response to a determination by the monitoring component that a subset of the monitored data satisfies a condition”, ¶ [0055]: “Output data that can be rendered by the user interface component can include work order information for open and closed work orders, notification data informing users that a work order for a newly discovered maintenance concern has been opened”. Jadhav teaches monitoring an industrial machine, generating a maintenance/repair service for the machine, and sending a notification to the user regarding initiation of the maintenance service)
Therefore, it would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to modify the invention of Apo, Sato and Hass by incorporating the method of sending a confirmation notice of initiated service that is taught by Jadhav, to make a system that can monitor industrial tools, request maintenance service when a tool is damaged, and send confirmation to the user; thus, one of ordinary skill in the art would be motivated to combine the references since, among its several aspects, the present invention recognizes there is a need to improve operational efficiency of a tool monitoring system (Jadhav, ¶ [0070]: “owners of industrial devices will see improved operational efficiency and asset performance by leveraging data and insights, as well as higher returns on equipment investments and faster analytics rollouts through standardized deployments”).
Thus, the claimed subject matter would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention.
CLAIM 5
In regards to Claim 5, the combination of Apo, Sato, Hass and Jadhav teaches the device of Claim 1. In addition, the combination of Apo, Sato, Hass and Jadhav teaches retrieving data for a set of tools. (Sato, ¶ [0055-0056]: “The inspection target information acquisition unit 150 acquires information about an inspection target 3. For example, information about an inspection target 3 includes an identifier for identifying the inspection target 3, an inspection target name representing the inspection target 3, and an inspection item for the inspection target 3”; ¶ [0089]: “even when inspection targets having the same shape exist at a plurality of positions, information about an inspection target included in the image data can be identified”. Sato teaches that information for multiple pieces of equipment is recorded and can be acquired later)
CLAIM 6
In regards to Claim 6, the combination of Apo, Sato, Hass and Jadhav teaches the device of Claim 5. In addition, the combination of Apo, Sato, Hass and Jadhav teaches determining a location of the wearable device (Sato, ¶ [0087]: “the wearable glasses 4 may receive positional information by use of the Global Positioning System (GPS) and attach the positional information to the image data”) and associating the set of tools with the location. (Sato, ¶ [0087]: “information about an inspection target 3 associated with an inspection target position 152 within a predetermined range from a position indicated by the positional information attached to the image data.”) (Sato discloses that the wearable device attaches its location to captured images, and associates the inspection target with a location from the image data)
CLAIM 8
In regards to Claim 8, the combination of Apo, Sato, Hass and Jadhav teaches the device of Claim 1. In addition, the combination of Apo, Sato, Hass and Jadhav teaches analyzing a condition of the tool as being useable or not useable. (Hass, ¶ [0004]: “It is desirable to know the degree of wear of a part, for example, so that its remaining useful life can be determined or estimated”, ¶ [0058-0060]: “processor 200 may determine the degree of wear of the wear part from the digital image”, see FIG. 6. Hass teaches a method of capturing an image of equipment and determining a degree of wear, or remaining useful life, of that equipment. The Examiner notes that equipment with remaining useful life is usable.)
CLAIM 9
In regards to Claim 9, the combination of Apo, Sato, Hass and Jadhav teaches the device of Claim 1. In addition, the combination of Apo, Sato, Hass and Jadhav teaches analyzing a remaining life of the tool. (Hass, ¶ [0004]: “It is desirable to know the degree of wear of a part, for example, so that its remaining useful life can be determined or estimated”, ¶ [0046]: “In this particular example, … the track link is 30% worn, meaning that it has 70% of its useful life remaining before it should be replaced. Over time, … the track link has no remaining useful life and should be replaced immediately”, ¶ [0058-0060]: “processor 200 may determine the degree of wear of the wear part from the digital image”, see FIG. 6. Hass teaches a method of capturing an image of equipment and determining a degree of wear, or remaining useful life, of that equipment.)
CLAIM 10
In regards to Claim 10, the combination of Apo, Sato, Hass and Jadhav teaches the device of Claim 1. In addition, the combination of Apo, Sato, Hass and Jadhav teaches the transmitted tool-condition information (Hass, ¶ [0060]: “In one embodiment, if processor determines that the degree of wear is above a threshold (e.g., 70%), processor may send a notification (e.g., a text message or e-mail) to dealer system. The notification may contain, for example, the identity of user, machine, the wear part, the degree of wear, and/or other information that apprises the dealer of the situation so that the dealer can take further action, if warranted”) further includes a severity classification determined by the processor. (Hass, ¶ [0059-0060]: “processor may determine the degree of wear of the wear part from the digital image … wear indicator interface element may indicate the degree of wear as a percentage (0-100%), a color code (e.g., green-red), or a scale (e.g., 1-10)”)
The combination of Apo, Sato, Hass and Jadhav does not explicitly disclose the transmitted tool-condition information further includes an image of the damaged tool.
However, Hass discloses an image of the damaged tool. (Hass, ¶ [0028]: “parts image processing system (i.e., the server or “cloud”), for example, may receive digital images of wear parts from mobile device over network. Parts image processing system may then process the images to determine the degree of wear of the parts”)
It would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to modify the invention of Apo, Sato, Hass and Jadhav by sending images of the damaged equipment to the dealer, so the dealer can have a better assessment of the equipment. (Hass, ¶ [0060]: “The notification may contain, for example, the identity of user, machine, the wear part, the degree of wear, and/or other information that apprises the dealer of the situation so that the dealer can take further action, if warranted”)
CLAIM 19
In regards to Claim 19, Apo teaches receiving an image (Apo, page 4-5, section 2.1: “extracting new features from the input data distributions (i.e., images)”; see input image in FIG. 1);
detecting that a tool has been damaged based on the image (Apo, page 3, second paragraph: “… object recognition is a computer vision technique used to recognize and find objects within an image or video. Specifically, object detection draws bounding boxes around these detected objects to identify where the objects are in (or how they pass through) a particular scene. Object recognition consists of recognizing, identifying, and locating objects within an image with a given amount of confidence…”; page 8, Object Recognition in Industry dataset (Tech dataset). Apo teaches a machine learning model that can recognize/detect industrial equipment); (Apo, page 3-4, fourth paragraph of page 3: “automated systems can be built by object recognition to recognize defective parts or tools for immediate replacement and also to detect individual parts that require repair and replacement during the manufacturing process”; section 2.2 Image datasets for industrial object recognition and defect detection. Apo teaches a machine learning model that can detect defective equipment (with anomalies, holes, cracks,…));
Apo does not explicitly disclose A tool management system comprising: at least one wearable device; a memory storing one or more instructions; and a processor executing one or more of the instructions stored on the memory to perform: receiving an image along with a location from the at least one wearable device; tagging the location of the tool that has been damaged.
Sato is in the same field of art of vision-based industrial inspection system. Further, Sato teaches A tool management system (Sato, abstract: “an inspection assistance device”) comprising: at least one wearable device (Sato, ¶ [0036]: “the inspection assistance device 10 is equipped on a glasses type wearable terminal”); a memory storing one or more instructions (Sato, ¶ [0097]: “A read only memory (ROM) 902. A random access memory (RAM) 903”); and a processor (Sato, ¶ [0097]: “A central processing unit (CPU)”) executing one or more of the instructions stored on the memory to perform: receiving an image along with a location from the at least one wearable device (Sato, ¶ [0086-0087]: “the wearable glasses 4 may receive positional information by use of the Global Positioning System (GPS) and attach the positional information to the image data”. Sato teaches a wearable device that can capture an image with attached position information); tagging the location of the tool that has been damaged. (Sato, ¶ [0087]: “information about an inspection target 3 associated with an inspection target position 152 within a predetermined range from a position indicated by the positional information attached to the image data.”) (Sato discloses that the wearable device attaches its location to captured images, and associates the inspection target with a location from the image data)
The combination of Apo and Sato does not explicitly disclose transmitting tool-condition data associated with the damaged tool to a networked maintenance server to automatically initiate a maintenance or replacement action;
Hass is in the same field of art of vision-based part wear monitoring system. Further, Hass teaches transmitting tool-condition data associated with the damaged tool (Hass, ¶ [0059-0060]: “processor may determine the degree of wear of the wear part from the digital image … GUI display may further include a wear indicator interface element that indicates the degree of wear of the wear part, as determined from the digital image. In one embodiment, wear indicator interface element may be a meter, gauge, graph, or other graphic that processor animates to convey to the user the degree of wear of the wear part. For example, wear indicator interface element may indicate the degree of wear as a percentage (0-100%), a color code (e.g., green-red), or a scale (e.g., 1-10)”) to a networked maintenance server (Hass, ¶ [0022]: “environment may additionally include a parts image processing system, an application store, and/or a dealer system. Elements of environment may be connected to an electronic communication network over which they may communicate with one another”, see FIG. 1. The part wear determining system is connected with the dealer system over an electronic network) to automatically initiate a maintenance or (***The Examiner notes since a listing with “or” is disjunctive, any one of the elements found in the prior art is sufficient to reject the claim. While citations have been provided for completeness and rapid prosecution, only one element is required.) replacement action (Hass, ¶ [0060]: “In one embodiment, if processor determines that the degree of wear is above a threshold (e.g., 70%), processor may send a notification (e.g., a text message or e-mail) to dealer system. The notification may contain, for example, the identity of user, machine, the wear part, the degree of wear, and/or other information that apprises the dealer of the situation so that the dealer can take further action, if warranted”. 
Hass discloses sending part wear information to a dealer system so the part can be serviced/replaced);
Therefore, it would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to modify the invention of Apo and Sato by incorporating the network system between the part monitoring unit and the maintenance/service unit that is taught by Hass, to make a system that can automatically request maintenance/repair for a damaged part; thus, one of ordinary skill in the art would be motivated to combine the references since, among its several aspects, the present invention recognizes there is a need to automatically notify maintenance personnel of upcoming maintenance/service of industrial equipment to ensure uninterrupted operation (Hass, ¶ [0030]: “the dealer may desire to know when a wear part of machine has become sufficiently worn so that it can inspect or service machine and potentially sell replacement parts or services to user”).
The combination of Apo, Sato and Hass does not explicitly disclose providing, to the wearable device, confirmation data or instructions associated with the initiated maintenance or replacement action.
Jadhav is in the same field of art of system to manage maintenance service for industrial equipment. Further, Jadhav teaches providing, to the wearable device, confirmation data or (***The Examiner notes since a listing with “or” is disjunctive, any one of the elements found in the prior art is sufficient to reject the claim. While citations have been provided for completeness and rapid prosecution, only one element is required.) instructions associated with the initiated maintenance or replacement action. (Jadhav, ¶ [0050]: “Work order interface component can be configured to communicatively interface with a work order management system and to send an instruction to open a new work order in response to a determination by the monitoring component that a subset of the monitored data satisfies a condition”, ¶ [0055]: “Output data that can be rendered by the user interface component can include work order information for open and closed work orders, notification data informing users that a work order for a newly discovered maintenance concern has been opened”. Jadhav teaches monitoring an industrial machine, generating a maintenance/repair service for the machine, and sending a notification to the user regarding initiation of the maintenance service)
Therefore, it would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to modify the invention of Apo, Sato and Hass by incorporating the method of sending a confirmation notice of initiated service that is taught by Jadhav, to make a system that can monitor industrial tools, request maintenance service when a tool is damaged, and send confirmation to the user; thus, one of ordinary skill in the art would be motivated to combine the references since, among its several aspects, the present invention recognizes there is a need to improve operational efficiency of a tool monitoring system (Jadhav, ¶ [0070]: “owners of industrial devices will see improved operational efficiency and asset performance by leveraging data and insights, as well as higher returns on equipment investments and faster analytics rollouts through standardized deployments”).
Thus, the claimed subject matter would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention.
CLAIM 20
In regards to Claim 20, the combination of Apo, Sato, Hass and Jadhav teaches the system of claim 19. In addition, the combination of Apo, Sato, Hass and Jadhav teaches gathering additional information about the tool. (Sato, ¶ [0087]: “information about an inspection target associated with an inspection target position within a predetermined range from a position indicated by the positional information attached to the image data… the inspection target information acquisition unit acquires … an inspection target name and an inspection item”, ¶ [0091-0092]: “Time information at the time of photographing image data may be attached to the image data.”) (Sato discloses gathering additional information about the inspection target, such as name, inspection item, position, or time)
Claim(s) 2-3 is/are rejected under 35 U.S.C. 103 as being unpatentable over Apo in view of Sato in view of Hass in view of Jadhav, and further in view of Thiel et al. (US-20210289174-A1, hereinafter Thiel).
CLAIM 2
In regards to Claim 2, the combination of Apo, Sato, Hass and Jadhav teaches the device of Claim 1.
The combination of Apo, Sato, Hass and Jadhav does not explicitly disclose at least one camera is embedded within the wearable device.
Thiel is in the same field of art of wearable imaging sensor. Further, Thiel teaches at least one camera is embedded within the wearable device. (Thiel, ¶ [0045]: “at least one camera is incorporated in and/or removably attachable to the load-bearing garment”)
Therefore, it would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to modify the invention of Apo, Sato, Hass and Jadhav by incorporating the vest with embedded cameras and batteries that is taught by Thiel, to create an inspection system worn as a body vest with embedded cameras, because a vest-mounted camera system provides a stable imaging platform secured to the torso that is not subject to the head movement and nodding motions of glasses-mounted cameras like Sato’s, thereby reducing motion blur during tool inspections, while also enabling extended battery life (Thiel, ¶ [0025]: “to be carried out for three to seven days through a twin self-power supply system of a portable battery and a belt-type power supply unit even without need for charging power,”) through belt-mounted high-capacity batteries not feasible with weight-limited head-mounted systems. One of ordinary skill would have had a reasonable expectation of success adapting Thiel’s vest-mounted cameras to perform Apo’s defect detection, as all references are in the same field of wearable vision-based inspection.
Thus, the claimed subject matter would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention.
CLAIM 3
In regards to Claim 3, the combination of Apo, Sato, Hass and Jadhav teaches the device of Claim 1.
The combination of Apo, Sato, Hass and Jadhav does not explicitly disclose at least one camera is positioned on a front strap of the wearable device.
Thiel is in the same field of art of wearable imaging sensor. Further, Thiel teaches at least one camera is positioned on a front strap of the wearable device. (Thiel, ¶ [0045]: “at least one camera is incorporated in and/or removably attachable to the load-bearing garment”; see annotated FIG. 1 below)
[Annotated FIG. 1 of Thiel (media_image1.png, greyscale): camera positioned on the front strap of the load-bearing garment]
Therefore, it would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to modify the invention of Apo, Sato, Hass and Jadhav by incorporating the vest with embedded cameras and batteries that is taught by Thiel, to create an inspection system worn as a body vest with embedded cameras, because a vest-mounted camera system provides a stable imaging platform secured to the torso that is not subject to the head movement and nodding motions of glasses-mounted cameras like Sato’s, thereby reducing motion blur during tool inspections, while also enabling extended battery life (Thiel, ¶ [0025]: “to be carried out for three to seven days through a twin self-power supply system of a portable battery and a belt-type power supply unit even without need for charging power,”) through belt-mounted high-capacity batteries not feasible with weight-limited head-mounted systems. One of ordinary skill would have had a reasonable expectation of success adapting Thiel’s vest-mounted cameras to perform Apo’s defect detection, as all references are in the same field of wearable vision-based inspection.
Thus, the claimed subject matter would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention.
CLAIM 4
Claim(s) 4 is/are rejected under 35 U.S.C. 103 as being unpatentable over Apo in view of Sato in view of Hass in view of Jadhav, and further in view of Umasudhan (Umasudhan, Mrinall. "Image Depth Estimation Using Stereo Vision." Authorea Preprints (2022), hereinafter Uma).
In regards to Claim 4, the combination of Apo, Sato, Hass and Jadhav teaches the device of claim 1.
The combination of Apo, Sato, Hass and Jadhav does not explicitly disclose two cameras for detecting a depth of the image.
Uma is in the same field of art of computer vision. Further, Uma teaches two cameras (Uma, Pages 4-5, Section 2.2: Uma teaches a two-camera setup; see FIG. 2 below) for detecting a depth of the image. (Uma, Section 2, Triangulation: “stereo vision algorithm is to find the depth of a pixel using multiple two-dimensional views of the scene, more formally this process is known as the backward projection of a camera from image coordinates into three-dimensional world coordinates”)
[FIG. 2 of Uma (media_image2.png, greyscale)]
Therefore, it would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to modify the invention of Apo, Sato, Hass and Jadhav by incorporating the stereo camera system taught by Uma, to make a wearable system that can estimate the depth of an image using two cameras; thus, one of ordinary skill in the art would have been motivated to combine the references since, among its several aspects, the present invention recognizes there is a need to improve the performance of the object detection task (Uma, Section 1.1, first paragraph: “stereo vision algorithms are able to easily work in conjunction with other computer vision techniques such as machine learning-based object detection models, when compared to the previous depth estimation approaches as it operates under the same image plane that a object detection model may be used on, eliminating the need for sensor data conversion. These factors allow for a more streamlined analysis of the various shapes and angles in an image leading to its usage in various fields.”).
Thus, the claimed subject matter would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention.
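For context, the triangulation relationship Uma describes (recovering depth from the disparity between two rectified camera views) can be sketched as follows. This is a minimal illustration only; the focal length, baseline, and disparity values are illustrative assumptions, not figures taken from the cited reference.

```python
def depth_from_disparity(focal_length_px, baseline_m, disparity_px):
    """Triangulate depth from a rectified stereo pair: Z = f * B / d.

    focal_length_px: shared focal length of the two cameras, in pixels
    baseline_m: distance between the two camera centers, in meters
    disparity_px: horizontal pixel offset of the same point between views
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a visible point")
    return focal_length_px * baseline_m / disparity_px

# Illustrative values: 700 px focal length, 10 cm baseline, 35 px disparity
print(depth_from_disparity(700, 0.10, 35))  # 2.0 (meters)
```

Larger disparities correspond to nearer points, which is why two horizontally separated cameras suffice to recover per-pixel depth.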
CLAIM 7
Claim(s) 7 is/are rejected under 35 U.S.C. 103 as being unpatentable over Apo in view of Sato in view of Hass in view of Jadhav, and further in view of Tamersoy (Birgi Tamersoy “Background Subtraction” The University of Texas at Austin, published 2009, hereinafter Tamersoy).
In regards to Claim 7, the combination of Apo, Sato, Hass and Jadhav teaches the device of claim 1.
The combination of Apo, Sato, Hass and Jadhav does not explicitly disclose detecting the tool within the image comprises removing background environmental objects from the image.
Tamersoy is in the same field of art of foreground object detection. Further, Tamersoy teaches that detecting the tool within the image comprises removing background environmental objects from the image. (Tamersoy, page 5, Simple Approach; see reconstructed text below. Tamersoy teaches detecting foreground objects by subtracting the estimated background from the image frame)
[Reconstructed text of Tamersoy, page 5, Simple Approach (media_image3.png, greyscale)]
Therefore, it would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to modify the invention of Apo, Sato, Hass and Jadhav by incorporating the background subtraction method taught by Tamersoy, to make an inspection system that detects objects by background subtraction; thus, one of ordinary skill in the art would have been motivated to combine the references since, among its several aspects, the present invention recognizes there is a need for a simple and fast method for object detection (Tamersoy, Page 12, Advantages vs. Shortcomings: “Extremely easy to implement and use…All pretty fast…”).
Thus, the claimed subject matter would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention.
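The frame-differencing approach cited from Tamersoy can be illustrated with a minimal sketch. The images below are plain nested lists of greyscale pixel intensities, and the threshold value is an illustrative assumption rather than a parameter from the cited reference.

```python
def foreground_mask(frame, background, threshold):
    """Simple background subtraction: a pixel is foreground when its
    absolute difference from the background estimate exceeds the threshold."""
    return [
        [abs(f - b) > threshold for f, b in zip(frame_row, bg_row)]
        for frame_row, bg_row in zip(frame, background)
    ]

# Illustrative 2x2 frames: one pixel changed by a foreground object
background = [[10, 10], [10, 10]]
frame = [[10, 200], [10, 12]]
print(foreground_mask(frame, background, threshold=50))
# [[False, True], [False, False]]
```

The "extremely easy to implement" advantage Tamersoy notes is visible here: the method is a single pixel-wise comparison, with no training step.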
Allowable Subject Matter
Claims 11-18 are rejected under 35 U.S.C. 112(b), but would be allowable if the rejection under 35 U.S.C. 112(b) is addressed.
The closest prior art references for Claim 11 are:
Apo (Apostolopoulos, Ioannis D. et al., "Industrial object, machine part and defect recognition towards fully automated industrial monitoring employing deep learning. The case of multilevel VGG19." arXiv). Apo teaches a vision-based machine learning system to detect and identify defective industrial parts.
Hass (US-20170091924-A1). Hass teaches a vision-based system to monitor industrial equipment, automatically generate the wear condition of the monitored equipment, and transmit the wear-condition information to a networked dealer to initiate a replacement service.
While both Apo and Hass teach identifying broken industrial equipment, neither Apo nor Hass, nor the combination, teaches “receiving, from the server, response data indicating that a maintenance or replacement action has been scheduled; and
providing a visual or haptic notification identifying the tool and the location and confirming initiation of the maintenance or replacement action”
Conclusion
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to NHUT HUY (JEREMY) PHAM whose telephone number is (703)756-5797. The examiner can normally be reached Mo - Fr. 8:30am - 6pm ET.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, O'Neal Mistry can be reached on (313)446-4912. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
NHUT HUY (JEREMY) PHAM
Examiner
Art Unit 2674
/Ross Varndell/Primary Examiner, Art Unit 2674