Prosecution Insights
Last updated: April 19, 2026
Application No. 18/654,187

WORK MEASURING METHOD AND WELDING SYSTEM

Non-Final OA: §101, §102
Filed: May 03, 2024
Examiner: SAMPLE, JONATHAN L
Art Unit: 3657
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Kabushiki Kaisha Kobe Seiko Sho (Kobe Steel, Ltd.)
OA Round: 1 (Non-Final)
Grant Probability: 83% (Favorable)
OA Rounds: 1-2
To Grant: 2y 11m
With Interview: 94%

Examiner Intelligence

Grants 83% — above average

Career Allow Rate: 83% (786 granted / 951 resolved; +30.6% vs TC avg)
Interview Lift: +11.9% (moderate; from resolved cases with interview)
Avg Prosecution: 2y 11m typical timeline; 28 currently pending
Total Applications: 979 across all art units (career history)
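The headline figures above are simple ratios; a short sketch shows how they fit together. The raw counts come from the card above, but how the dashboard rounds, and whether the interview lift is additive, are assumptions for illustration.

```python
# Reconstruct the examiner stat card from its raw counts.
# 786 granted / 951 resolved comes from the card above; rounding
# behavior and additive interview lift are assumptions.
granted = 786
resolved = 951

career_allow_rate = granted / resolved
print(f"Career allow rate: {career_allow_rate:.1%}")  # ~82.6%, displayed as 83%

interview_lift = 0.119  # the card's +11.9%, read as additive percentage points
with_interview = min(career_allow_rate + interview_lift, 1.0)
print(f"With interview: {int(with_interview * 100)}%")  # truncates to 94%
```

Note that 82.6% + 11.9% is about 94.5%, so the displayed 94% only falls out if the dashboard truncates rather than rounds; that detail is a guess.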

Statute-Specific Performance

§101: 5.5% (-34.5% vs TC avg)
§103: 40.6% (+0.6% vs TC avg)
§102: 29.9% (-10.1% vs TC avg)
§112: 16.6% (-23.4% vs TC avg)
TC avg = estimated Tech Center average • Based on career data from 951 resolved cases
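A quick consistency check on the table: if each "vs TC avg" delta is a simple difference in percentage points (an assumption, since the page only shows the deltas), every row implies the same Tech Center baseline.

```python
# Back out the implied Tech Center average from each statute's
# rejection rate and its "vs TC avg" delta (values from the table above).
rates = {
    "§101": (5.5, -34.5),
    "§103": (40.6, +0.6),
    "§102": (29.9, -10.1),
    "§112": (16.6, -23.4),
}

for statute, (rate, delta) in rates.items():
    tc_avg = rate - delta  # implied baseline
    print(f"{statute}: {rate}% rejection rate, implied TC avg {tc_avg:.0f}%")
```

All four rows back out to roughly a 40% baseline, which suggests the chart compares every statute against a single Tech Center estimate rather than per-statute averages.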

Office Action

Rejections: §101, §102
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. Pursuant to communications filed on 03 May 2024, this is a First Action Non-Final Rejection on the Merits. Claims 1-9 are currently pending in the instant application.

Information Disclosure Statement

The information disclosure statement (IDS) submitted on 03 May 2024 is in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement has been considered by the Examiner.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows: "Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title."

Claims 1-7 are rejected under 35 U.S.C. 101 because the claimed invention is directed to non-statutory subject matter. The claim(s) does/do not fall within at least one of the four categories of patent eligible subject matter because the claimed invention is directed to an abstract idea without significantly more.

Regarding claim 1: a work measuring method for measuring a work including a columnar member and a diaphragm and held by a positioner, the work measuring method comprising: an acquiring step of acquiring point cloud data on a surface of the work held by the positioner and photographed in a predetermined direction; a detecting step of identifying a portion where distribution of the point cloud data changes in a three-dimensional coordinate system of the positioner and detecting the portion as a boundary position between the columnar member and the diaphragm; and a deriving step of deriving the number of columnar members and diaphragms included in the work on the basis of the boundary position detected in the detecting step.

Step 1: Statutory Category – Yes. The claim(s) recite(s) a work measuring method for measuring a work (i.e. a process); therefore the claim(s) fall within one of the four statutory categories. MPEP 2106.03.

Step 2A, Prong One evaluation: Judicial Exception – Yes. The Office submits that the foregoing limitation(s) constitute judicial exceptions in terms of "mental processes" because, under the broadest reasonable interpretation, the claim covers performance using mental processes. The claim recites the limitation of "acquiring point cloud data on a surface of the work held by the positioner and photographed in a predetermined direction", which in the context of this claim is an abstract idea, wherein a human acquires (i.e. receives, obtains, etc.) data from one or more sensors (i.e. cameras, imaging devices, etc.) taking a photograph. Humans have the ability to obtain, recognize and interpret data from multiple sources including other humans and machines (an implicit imaging device in this case), and therefore the Examiner submits that this action can be done within the human mind. The claim additionally recites the limitation of "identifying a portion where distribution of the point cloud data changes in a three-dimensional coordinate system of the positioner and detecting the portion as a boundary position between the columnar member and the diaphragm", which in the context of this claim is an abstract idea, wherein a human evaluates/analyzes the image data (i.e. point cloud data) and further detects/identifies a portion as a boundary position (i.e. seam location) between a columnar member and a diaphragm. The claim additionally recites the limitation of "deriving the number of columnar members and diaphragms included in the work on the basis of the boundary position detected in the detecting step", which in the context of this claim is an abstract idea, wherein a human evaluates/analyzes how many column(s) and/or diaphragm(s) are included based on the detected boundary position(s) in the previously evaluated detecting step.

Step 2A, Prong Two evaluation: Practical Application – No. Claim 1 is evaluated as to whether, as a whole, it integrates the recited judicial exception into a practical application. As noted in the 2019 PEG, it must be determined whether any additional elements in the claim beyond the abstract idea integrate the exception into a practical application in a manner that imposes a meaningful limit on the judicial exception. In the present case, there are no additional elements currently provided in the claim limitations to integrate the abstract idea into a practical application because there are no additional elements that would impose any meaningful limit on practicing the abstract idea. Therefore, the claim is ineligible.

Step 2B evaluation: Inventive Concept – No. Claim 1 is evaluated as to whether the claims as a whole amount to significantly more than the recited exception, i.e., whether any additional element, or combination of additional elements, adds an inventive concept to the claim. As discussed with respect to Step 2A Prong Two, there are no additional elements currently provided in the claim limitation(s). The same analysis applies here in Step 2B, i.e., since there are no additional elements currently provided in the claim limitation(s), the judicial exception cannot be integrated into a practical application at Step 2A or provide an inventive concept in Step 2B.

Thus, since independent claim 1 (a) is directed toward an abstract idea, (b) does not recite additional elements that integrate the judicial exception into a practical application, and (c) does not recite additional elements that amount to significantly more than the judicial exception, independent claim 1 is directed towards non-statutory subject matter.

Regarding claims 2-7: these claims do not recite any further limitations that cause the claim(s) to be directed towards statutory subject matter. The claims merely recite an abstract idea. Each of the further limitations expounds upon the abstract idea and does not recite additional elements that are not well understood, routine or conventional. Therefore, claims 2-7 are similarly rejected as being directed towards non-statutory subject matter.

Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action: "A person shall be entitled to a patent unless – (a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention."

Claim(s) 1-9 is/are rejected under 35 U.S.C. 102(a)(2) as being anticipated by Schwenker et al (US 12,521,884 B2, hereinafter Schwenker).
Regarding claim 1, Schwenker teaches a work measuring method for measuring a work including a columnar member (Figure 1, part 135; Figure 6, part 602) and a diaphragm (Figure 1, part 136; Figure 6, part 604) and held by a positioner (Figure 1, fixture 127), the work measuring method comprising:

an acquiring step of acquiring point cloud data on a surface of the work held by the positioner and photographed in a predetermined direction (Figures 1, 5 & 6; at least as in column 25, lines 9-26, wherein "controller 152 is configured to receive information, such as images or image data, audio data, EM data, or a combination thereof, from sensor 109. Controller 152 may generate a 3D representation, such as a point cloud, of one or more structures associated with the received information. For example, the one or more structures may be depicted in the images. A point cloud can be a set of points each of which represents a location in 3D space of a point on a surface of a part (e.g., 135 or 136) and/or fixture 127");

a detecting step of identifying a portion where distribution of the point cloud data changes in a three-dimensional coordinate system of the positioner and detecting the portion as a boundary position between the columnar member and the diaphragm (Figures 1, 5 & 6; at least as in column 11, lines 19-30 and column 26, line 35-column 27, line 11, wherein "Controller 152 may then use the point cloud 500 or 600, image data, or a combination thereof, to identify and locate a seam, such as the seam 506 or 606, to plan a welding path along the seam 506 or 606, and to lay a weld material along seam 506 or 606 according to the path plan and using robot 120. In some implementations, controller 152 may execute instructions 103 (e.g., path planning logic 105, machine learning logic 107, or multipass logic 111), executable code 113, or a combination thereof, to perform one or more operations, such as seam identification, path planning, model training or updating, or a combination thereof" and further wherein "controller 152 may perform the pixel-wise classification and/or the point-wise classification to identify one or more imaged structures within workspace 130 as a part (e.g., 135 or 136), as a seam on the part or at an interface between multiple parts (referred to herein collectively as a candidate seam), as fixture 127, as robot 120, etc."); and

a deriving step of deriving the number of columnar members and diaphragms included in the work on the basis of the boundary position detected in the detecting step (Figures 1, 5 & 6; at least as in column 26, lines 47-60 and column 29, lines 10-23, wherein "After identifying a candidate seam that is an actual seam, controller 152 may perform additional processing referred to herein as registration … controller may perform registration using a priori information, such as a CAD model (or a point cloud version of the CAD model). For example, there may exist a difference between seam dimensions associated with a part (e.g., 135 or 136) and seam dimensions in the CAD model. In some implementations, the CAD model (or a copy of the CAD model) may be deformed (e.g., updated) to account for any such differences. It is noted that the updated CAD model may be used to perform path planning", and further as shown in at least Figure 6, wherein at least one column and one diaphragm is shown).

Regarding claim 2, Schwenker further teaches wherein in the detecting step, the point cloud data is projected onto a plane parallel to the predetermined direction and defined by a first axis corresponding to the predetermined direction and a second axis orthogonal to the first axis, and a change point of the point cloud data is detected as a boundary position between the columnar member and the diaphragm (Figures 1 & 5-7; at least as in column 24, lines 6-42, column 26, line 35-column 27, line 38 and column 29, line 58-column 30, line 36).

Regarding claim 3, Schwenker teaches the work measuring method further comprising a calculating step of calculating a dimension and position of the work on the basis of the boundary position detected in the detecting step (Figures 1 & 5-7; at least as in column 24, lines 6-42, column 26, line 35-column 27, line 38 and column 29, line 58-column 30, line 36).

Regarding claim 4, Schwenker further teaches wherein in the calculating step, at least one of a diameter of the columnar member, a length of the columnar member, and a plate thickness of the diaphragm is calculated (Figures 1 & 5-7; at least as in column 24, lines 6-42, column 26, line 35-column 27, line 38 and column 29, line 58-column 30, line 36).

Regarding claim 5, Schwenker further teaches wherein in the acquiring step, a plurality of pieces of point cloud data are acquired by rotating the work a predetermined rotation angle about the second axis with the positioner, and in the calculating step, the dimension and position of the work are calculated on the basis of the plurality of pieces of point cloud data (Figures 1 & 5-7; at least as in column 24, lines 6-42, column 26, line 35-column 27, line 38 and column 29, line 58-column 30, line 36).

Regarding claim 6, Schwenker further teaches wherein in the detecting step, the boundary position is detected by using a part of point cloud data projected onto a plane orthogonal to the predetermined direction and defined by the second axis and a third axis orthogonal to the first axis and the second axis, the part being in a predetermined range from a center in a direction of the third axis (Figures 1 & 5-7; at least as in column 24, lines 6-42, column 26, line 35-column 27, line 38 and column 29, line 58-column 30, line 36).

Regarding claim 7, Schwenker further teaches a correcting step of correcting, through sensing, at least one of the dimension and position of the work calculated in the calculating step (Figure 7; at least as in column 29, line 58-column 30, line 36 and column 31, lines 31-62).

Regarding claim 8, Schwenker teaches a welding system (Figure 1, system 100) comprising: a welding robot (Figure 1, robot 120) including a welding torch (Figure 1, tool 121 (e.g., a welding tool); at least as in column 8, line 57-column 9, line 10 and lines 37-40, wherein "Robot 120 may include any suitable tool 121, such as a manufacturing tool" and further wherein "tool 121 may include a manufacturing tool (e.g., a welding tool)"); a welding control device (Figure 1, control system 110, controller 152) configured to control the welding robot (Figure 1; at least as in column 9, lines 1-9, wherein "Robot 120 (e.g., a weld head of robot 120) may be configured to move within the workspace 130 according to a path plan and/or weld plan received from control system 110 or a controller 152"); a positioner (Figure 1, fixture 127) configured to hold a work including a columnar member (Figure 1, part 135; Figure 6, part 602) and a diaphragm (Figure 1, part 136; Figure 6, part 604) (Figures 1 & 6; at least as in column 10, lines 12-46, wherein "Fixture 127 may be configured to hold, position, and/or manipulate one or more parts (135, 136). In some implementations, fixture 127 may include or correspond to tool 121 or manufacturing tool 126. Fixture 127 may include a clamp, a platform, a positioner, or other types of fixture, as illustrative, non-limiting examples. In some examples, fixture 127 is adjustable, either manually by a user or automatically by a motor. For example, fixture 127 may dynamically adjust its position, orientation, or other physical configuration prior to or during a welding process"); and a sensor (Figure 1, sensor 109) configured to photograph the work in a predetermined direction to acquire point cloud data (Figure 1; at least as in column 23, lines 51-55 and column 24, lines 43-59, wherein "sensor 109 may collect or generate information, such as images or image data, about one or more physical structures in workspace 130. In some instances, sensor 109 may be configured to image or monitor a weld laid by robot 120, before, during, or after weld deposition. Stated another way, the information may include or correspond to a geometric configuration of a seam, the weld laid by robot 120, or a combination thereof. The geometric configuration may include 3D point cloud information, mesh, image of a slice of the weld, point cloud of the slice of the weld, or a combination thereof, as illustrative, non-limiting examples"),

wherein the welding control device includes an acquiring unit configured to acquire point cloud data on a surface of the work held by the positioner and photographed by the sensor in the predetermined direction (Figures 1, 5 & 6; at least as in column 25, lines 9-26, wherein "controller 152 is configured to receive information, such as images or image data, audio data, EM data, or a combination thereof, from sensor 109. Controller 152 may generate a 3D representation, such as a point cloud, of one or more structures associated with the received information. For example, the one or more structures may be depicted in the images. A point cloud can be a set of points each of which represents a location in 3D space of a point on a surface of a part (e.g., 135 or 136) and/or fixture 127"), a detecting unit configured to identify a portion where distribution of the point cloud data changes in a three-dimensional coordinate system of the positioner and detect the portion as a boundary position between the columnar member and the diaphragm (Figures 1, 5 & 6; at least as in column 11, lines 19-30 and column 26, line 35-column 27, line 11, wherein "Controller 152 may then use the point cloud 500 or 600, image data, or a combination thereof, to identify and locate a seam, such as the seam 506 or 606, to plan a welding path along the seam 506 or 606, and to lay a weld material along seam 506 or 606 according to the path plan and using robot 120. In some implementations, controller 152 may execute instructions 103 (e.g., path planning logic 105, machine learning logic 107, or multipass logic 111), executable code 113, or a combination thereof, to perform one or more operations, such as seam identification, path planning, model training or updating, or a combination thereof" and further wherein "controller 152 may perform the pixel-wise classification and/or the point-wise classification to identify one or more imaged structures within workspace 130 as a part (e.g., 135 or 136), as a seam on the part or at an interface between multiple parts (referred to herein collectively as a candidate seam), as fixture 127, as robot 120, etc."), and a calculating unit configured to calculate a dimension and position of the work on the basis of the boundary position detected by the detecting unit (Figures 1, 5 & 6; at least as in column 26, lines 47-60 and column 29, lines 10-23, wherein "After identifying a candidate seam that is an actual seam, controller 152 may perform additional processing referred to herein as registration … controller may perform registration using a priori information, such as a CAD model (or a point cloud version of the CAD model). For example, there may exist a difference between seam dimensions associated with a part (e.g., 135 or 136) and seam dimensions in the CAD model. In some implementations, the CAD model (or a copy of the CAD model) may be deformed (e.g., updated) to account for any such differences. It is noted that the updated CAD model may be used to perform path planning").

Regarding claim 9, Schwenker further teaches wherein the welding torch includes a second sensor; and the welding control device includes a correcting unit configured to correct, through sensing by the second sensor, at least one of the dimension and position of the work calculated by the calculating unit (Figure 7; at least as in column 29, line 58-column 30, line 36 and column 31, lines 31-62).

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. See attached PTO-892 – Notice of References Cited form. The Examiner additionally notes the following prior art references, which are in the same field of endeavor as the instant invention and also appear to read on some of the currently provided claim limitations above:

US 2023/0390934 A1, issued to Christy, which is directed towards a method of correcting angles of a welding torch positioned by a user while training a robot of a robotic welding system.

WO 2022/182894 A1, issued to Lonsberry et al, which is directed towards an autonomous welding robot that identifies a candidate seam between two parts to be welded based on modelling and imaging data and generates welding instructions based on said data.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to JONATHAN L SAMPLE, whose telephone number is (571)270-5925. The examiner can normally be reached Monday-Friday, 7:00am-4:00pm. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Adam Mott, can be reached at (571)270-5376. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/JONATHAN L SAMPLE/
Primary Examiner, Art Unit 3657

Prosecution Timeline

May 03, 2024
Application Filed
Feb 07, 2026
Non-Final Rejection — §101, §102 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12599445
SURGICAL ROBOTIC SYSTEM AND METHOD FOR CART POWER SWITCHOVER
Granted Apr 14, 2026 (2y 5m to grant)
Patent 12594675
SURGICAL ROBOT, METHOD FOR GUIDING SURGICAL ARM TO MOVE THEREOF, AND COMPUTER READABLE STORAGE MEDIUM THEREOF
Granted Apr 07, 2026 (2y 5m to grant)
Patent 12589755
SYSTEM FOR PROVIDING AN OUTPUT SIGNAL BASED ON A GENERATED SURROUNDINGS MODEL OF SURROUNDINGS OF A MOBILE PLATFORM
Granted Mar 31, 2026 (2y 5m to grant)
Patent 12589728
CHARGING CONTROL SYSTEM FOR IN-VEHICLE BATTERY
Granted Mar 31, 2026 (2y 5m to grant)
Patent 12583102
METHOD AND SYSTEM FOR HANDLING A LOAD ARRANGEMENT WITH A ROBOT GRIPPER
Granted Mar 24, 2026 (2y 5m to grant)
Study what changed to get past this examiner, based on the 5 most recent grants.

Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 83%
With Interview: 94% (+11.9%)
Median Time to Grant: 2y 11m
PTA Risk: Low
Based on 951 resolved cases by this examiner. Grant probability derived from career allow rate.
