DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Information Disclosure Statement

The information disclosure statements (IDS) submitted on 1/29/2026, 2/20/2024, and 10/26/2023 were filed. The submissions are in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statements are being considered by the examiner.

Drawings

The drawings are objected to under 37 CFR 1.83(a). The drawings must show every feature of the invention specified in the claims. Therefore, the robot arm must be shown or the feature(s) canceled from the claim(s). No new matter should be entered. Corrected drawing sheets in compliance with 37 CFR 1.121(d) are required in reply to the Office action to avoid abandonment of the application. Any amended replacement drawing sheet should include all of the figures appearing on the immediate prior version of the sheet, even if only one figure is being amended. The figure or figure number of an amended drawing should not be labeled as “amended.” If a drawing figure is to be canceled, the appropriate figure must be removed from the replacement sheet, and where necessary, the remaining figures must be renumbered and appropriate changes made to the brief description of the several views of the drawings for consistency. Additional replacement sheets may be necessary to show the renumbering of the remaining figures. Each drawing sheet submitted after the filing date of an application must be labeled in the top margin as either “Replacement Sheet” or “New Sheet” pursuant to 37 CFR 1.121(d). If the changes are not accepted by the examiner, the applicant will be notified and informed of any required corrective action in the next Office action. The objection to the drawings will not be held in abeyance.
Color photographs and color drawings are not accepted in utility applications unless a petition filed under 37 CFR 1.84(a)(2) is granted. Any such petition must be accompanied by the appropriate fee set forth in 37 CFR 1.17(h); one set of color drawings or color photographs, as appropriate, if submitted via the USPTO patent electronic filing system, or three sets of color drawings or color photographs, as appropriate, if not submitted via the USPTO patent electronic filing system; and, unless already present, an amendment to include the following language as the first paragraph of the brief description of the drawings section of the specification: The patent or application file contains at least one drawing executed in color. Copies of this patent or patent application publication with color drawing(s) will be provided by the Office upon request and payment of the necessary fee. Color photographs will be accepted if the conditions for accepting color drawings and black and white photographs have been satisfied. See 37 CFR 1.84(b)(2).

Claim Objections

Claim 15 is objected to because of the following informalities: Claim 15 contains a typo and appears to be unfinished. Appropriate correction is required. The examiner assumes the end of the claim is “determined based on the expected position.”

Claim Interpretation

The following is a quotation of 35 U.S.C. 112(f): (f) Element in Claim for a Combination. – An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof. The following is a quotation of pre-AIA 35 U.S.C.
112, sixth paragraph: An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof. The claims in this application are given their broadest reasonable interpretation using the plain meaning of the claim language in light of the specification as it would be understood by one of ordinary skill in the art. The broadest reasonable interpretation of a claim element (also commonly referred to as a claim limitation) is limited by the description in the specification when 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is invoked. As explained in MPEP § 2181, subsection I, claim limitations that meet the following three-prong test will be interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph: (A) the claim limitation uses the term “means” or “step” or a term used as a substitute for “means” that is a generic placeholder (also called a nonce term or a non-structural term having no specific structural meaning) for performing the claimed function; (B) the term “means” or “step” or the generic placeholder is modified by functional language, typically, but not always linked by the transition word “for” (e.g., “means for”) or another linking word or phrase, such as “configured to” or “so that”; and (C) the term “means” or “step” or the generic placeholder is not modified by sufficient structure, material, or acts for performing the claimed function. Use of the word “means” (or “step”) in a claim with functional language creates a rebuttable presumption that the claim limitation is to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 
112, sixth paragraph, is rebutted when the claim limitation recites sufficient structure, material, or acts to entirely perform the recited function. Absence of the word “means” (or “step”) in a claim creates a rebuttable presumption that the claim limitation is not to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is not interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites function without reciting sufficient structure, material, or acts to entirely perform the recited function. Claim limitations in this application that use the word “means” (or “step”) are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action. Conversely, claim limitations in this application that do not use the word “means” (or “step”) are not being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action. This application includes one or more claim limitations that do not use the word “means,” but are nonetheless being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, because the claim limitation(s) uses a generic placeholder that is coupled with functional language without reciting sufficient structure to perform the recited function and the generic placeholder is not preceded by a structural modifier. Such claim limitation(s) is/are: “robot controller” in claims 1 and 9. Because this/these claim limitation(s) is/are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, it/they is/are being interpreted to cover the corresponding structure described in the specification as performing the claimed function, and equivalents thereof.
The specification describes the robot controller as an application-specific integrated circuit (ASIC) (paragraph 0026, lines 5-8). If applicant does not intend to have this/these limitation(s) interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, applicant may: (1) amend the claim limitation(s) to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph (e.g., by reciting sufficient structure to perform the claimed function); or (2) present a sufficient showing that the claim limitation(s) recite(s) sufficient structure to perform the claimed function so as to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status. The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 2-6, 8-12, and 14-20 are rejected under 35 U.S.C. 103 as being unpatentable over Wanner et al. (US20100152870A1) in view of Ge et al. (US20180117701A1).
With regards to claim 2, Wanner et al. discloses a welding robotic system (means of a robot being a bent arm robot for welding, paragraph 0017, lines 1-2), comprising: a robot arm positioned in a workspace (bent arm robot, paragraph 0017, lines 1-2), the robot arm coupled to a welding tool configured to weld two objects together along a gap between the two objects positioned in the workspace (the profiles 3 and the steel plate 2, are welded by means of a robot, for example a bent arm robot, the welding being intended to be controlled in an automated manner, paragraph 0017, lines 1-2); a robot controller (the control unit of the welding robot, paragraph 0027, lines 1-3) configured to: receive one or more images from one or more sensors, each sensor configured to generate sensor data associated with the workspace (the image detection region 10 of the laser scanner 11 is prescribed by the frame 8 or by the balls 7 situated externally thereon. The reference values of the centres of the reference balls 7 are known in any coordinate system, i.e. in a coordinate system prescribed by the arrangement according to FIG. 3, paragraph 0033, lines 3-5). Wanner et al. does not disclose a robot controller configured to: determine, based on a computer aided design (CAD) model depicting the two objects, an expected position of the gap within the workspace; determine a position of the gap in the workspace based on: the expected position, the one or more images, or a combination thereof; identify, based on the one or more images, at least one former weld along the gap; and from the at least one former weld, generate a welding path for the robotic arm to follow to weld at least a portion of the gap. Ge et al. teaches to determine, based on a computer aided design (CAD) model depicting the two objects (the virtual environment 200 contains a three-dimensional model 210 for a welding robot and another three-dimensional model 220 for a weld object.
The models 210 and 220 are both CAD models, paragraph 0034, lines 3-4), an expected position of the gap within the workspace (two vertical plates 222, 223 are arranged in the same plane with a gap therebetween and main surfaces thereof are parallel to two opposite side edges of the base plate 224, paragraph 0035, lines 4-5); determine a position of the gap in the workspace based on: the expected position (a new seam can be also added if the seam has not been identified because of assembly gap or a gap resulting mismatch in the CAD model, paragraph 0051, lines 3-4); identify, based on the one or more images, at least one former weld along the gap (the remaining one 221 is arranged immediately before the two vertical plates covers the gap between the two vertical plates 222, 223, paragraph 0035, lines 5-6); and from the at least one former weld, generate a welding path for the robotic arm to follow to weld at least a portion of the gap (a new seam can be also added if the seam has not been identified because of assembly gap or a gap resulting mismatch in the CAD model, paragraph 0051, lines 3-4). Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art, having the teachings of Wanner et al. and Ge et al. before him or her, to modify the control unit of Wanner et al. with the process of Ge et al. in order to provide a combination for a welding system that provides support for both three dimensional and CAD environments.

With regards to claim 3, Ge et al. teaches wherein the expected position is determined in accordance with one or more annotations associated with the gap in the CAD model (a new seam can be also added if the seam has not been identified because of assembly gap or a gap resulting mismatch in the CAD model wherein seams are tagged by annotations S1-15, paragraph 0051, lines 3-4, Fig. 6).
With regards to claim 4, Ge et al. teaches wherein the one or more annotations are provided by a user (seam editor 700 can select and edit welding seams 701 and the associated annotations, Fig. 7).

With regards to claim 5, Wanner et al. discloses wherein the robot controller is configured to identify the at least one former weld in the one or more images using pixel-wise classification (the image detection system directly delivers the point clouds (three-dimensional pixels) required for further processing of a measured object, paragraph 0009, lines 1-2).

With regards to claim 6, Wanner et al. discloses wherein: the robot controller is configured to transform the one or more images into a point cloud (an image detection system which comprises an image detection module which receives the data of the image detection sensor and combines the point clouds of the individual images to form a total image of the working space, paragraph 0030, lines 1-3); and the robot controller is configured to identify a representation of the at least one former weld in the point cloud using point-wise classification (a geometrical detection module determines, as explained above, the geometrical data of the workpieces in the working space from the image data of the recorded scene and the necessary weld seam data and a control device containing a control algorithm implements the control/regulation of the welding process by using the control unit of the robot, paragraph 0030, lines 3-5).

With regards to claim 8, Wanner et al. discloses wherein the at least one former weld includes a tack weld (two micro panels 1 which normally comprise steel plates 2 which are typical for shipbuilding and have steel profiles 3 tack-welded thereon, paragraph 0016, lines 1-2).
With regards to claim 9, Wanner et al. discloses a welding robotic system (means of a robot being a bent arm robot for welding, paragraph 0017, lines 1-2), comprising: a robot arm positioned in a workspace (bent arm robot, paragraph 0017, lines 1-2), the robot arm coupled to a welding tool configured to weld two objects together along a gap between the two objects positioned in the workspace (the profiles 3 and the steel plate 2, are welded by means of a robot, for example a bent arm robot, the welding being intended to be controlled in an automated manner, paragraph 0017, lines 1-2); and a robot controller (the control unit of the welding robot, paragraph 0027, lines 1-3) configured to: determine a position of the gap in the workspace (unequivocal information relating to the precise course of the subsequent weld seams can be obtained, the intersection points 4, 5 (see FIG. 2) of the profiles 3 with each other and also the inner and outer sides thereof must be determined since the intersecting edges on which the ascending seams connecting the profiles 3 to each other extend are determined by this, paragraph 0024, lines 1-4) based on: one or more images associated with the workspace (the device according to the invention is provided with an image detection system which comprises an image detection module which receives the data of the image detection sensor and combines the point clouds of the individual images to form a total image of the working space, paragraph 0030, lines 1-3); identify, based on the one or more images, a former weld positioned along the gap (after the coordinates of the weld seams which are horizontal seams and ascending seams have been found, these are delivered to the control unit of the welding robot, paragraph 0027, lines 1-2); and generate, based on a position of the gap in the workspace and based on the former weld, a welding path for the robotic arm to follow to weld at least a portion of the gap (this control unit
allocates to the robot assigned, parameterisable movement patterns, so-called macros, the respective weld seams being assigned to the stored macros with the help of data delivered as parameters, i.e. the macros are adapted by means of the parameters, starting and end coordinates, profile height and material thickness inter alia to the conditions of the concrete weld seam, paragraph 0027, lines 2-8). Wanner et al. does not disclose to determine a position of the gap in the workspace based on: a computer aided design (CAD) model depicting the two objects. Ge et al. teaches to determine a position of the gap (two vertical plates 222, 223 are arranged in the same plane with a gap therebetween and main surfaces thereof are parallel to two opposite side edges of the base plate 224, paragraph 0035, lines 4-5) in the workspace based on: a computer aided design (CAD) model depicting the two objects (the virtual environment 200 contains a three-dimensional model 210 for a welding robot and another three-dimensional model 220 for a weld object. The models 210 and 220 are both CAD models, paragraph 0034, lines 3-4). Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art, having the teachings of Wanner et al. and Ge et al. before him or her, to modify the control unit of Wanner et al. with the process of Ge et al. in order to provide a combination for a welding system that provides support for both three dimensional and CAD environments.

With regards to claim 10, Wanner et al. discloses wherein, to identify the former weld, the robot controller is further configured to perform a finding operation (the image detection system directly delivers the point clouds (three-dimensional pixels) required for further processing of a measured object, paragraph 0009, lines 1-2).
With regards to claim 11, Wanner et al. discloses wherein, to perform the finding operation, the robot controller is further configured to: perform a pixel-wise classification operation on the one or more images; or perform a point-wise classification operation on a point cloud, the point cloud generated based on the one or more images (the image detection system directly delivers the point clouds (three-dimensional pixels) required for further processing of a measured object, paragraph 0009, lines 1-2).

With regards to claim 12, Wanner et al. discloses wherein the former weld includes a tack weld (two micro panels 1 which normally comprise steel plates 2 which are typical for shipbuilding and have steel profiles 3 tack-welded thereon, paragraph 0016, lines 1-2).

With regards to claim 14, Ge et al. teaches wherein: the robot controller is further configured to determine, based on the CAD model, an expected position of the gap within the workspace; and the position of the gap in the workspace is determined based on the expected position (a new seam can be also added if the seam has not been identified because of assembly gap or a gap resulting mismatch in the CAD model, paragraph 0051, lines 3-4).

With regards to claim 15, Ge et al. teaches wherein: the robot controller is further configured to identify an annotation included in the CAD model, the annotation associated with the gap (welding seams S1-S16 are annotations associated with gaps between workpieces, Fig. 6, 7); and the position of the gap within the workspace is determined based on the expected position (a new seam can be also added if the seam has not been identified because of assembly gap or a gap resulting mismatch in the CAD model, paragraph 0051, lines 3-4).
With regards to claim 16, Wanner et al. discloses wherein: the robot controller is further configured to receive the one or more images from one or more sensors (the image detection region 10 of the laser scanner 11 is prescribed by the frame 8 or by the balls 7 situated externally thereon, paragraph 0033, lines 3-4); and each sensor of the one or more sensors is configured to generate sensor data associated with the workspace (the image detection system directly delivers the point clouds (three-dimensional pixels) required for further processing of a measured object, paragraph 0009, lines 1-2).

With regards to claim 17, Wanner et al. discloses wherein: the robot controller is further configured to: generate welding instructions based on the welding path (the intersection points 4, 5 (see FIG. 2) of the profiles 3 with each other and also the inner and outer sides thereof must be determined since the intersecting edges on which the ascending seams connecting the profiles 3 to each other extend are determined by this, paragraph 0024, lines 2-4); and transmit the welding instructions to the robot arm coupled to the welding tool (after the coordinates of the weld seams which are horizontal seams and ascending seams have been found, these are delivered to the control unit of the welding robot, paragraph 0027, lines 1-2); and the robot arm is configured to operate the welding tool to weld the two objects together along at least a portion of the gap based on the welding instructions (the specific commands for controlling the welding process, such as switching the gas supply on and off, igniting the arc, activating the seam tracking, end crater filling and back burning of the wire end are already stored in the macros, paragraph 0027, lines 5-7).
With regards to claim 18, Ge et al. teaches wherein the portion of the gap is adjacent to the former weld (a new seam can be also added if the seam has not been identified because of assembly gap or a gap resulting mismatch in the CAD model, paragraph 0051, lines 3-4).

With regards to claim 19, Wanner et al. discloses a computer-implemented method of generating welding instructions for a welding robot (method for controlling robots for welding workpieces, Title), the computer-implemented method comprising: determining a position of a gap between multiple objects positioned in a workspace (information relating to the precise course of the subsequent weld seams can be obtained, the intersection points 4, 5 (see FIG. 2) of the profiles 3 with each other and also the inner and outer sides thereof must be determined since the intersecting edges on which the ascending seams connecting the profiles 3 to each other extend are determined by this, paragraph 0024, lines 1-4), the position of the gap determined based on: one or more images associated with the workspace (a geometrical detection module determines, as explained above, the geometrical data of the workpieces in the working space from the image data of the recorded scene and the necessary weld seam data and a control device containing a control algorithm implements the control/regulation of the welding process by using the control unit of the robot, paragraph 0030, lines 3-8); identifying, based on the one or more images, a former weld positioned along the gap (the remaining one 221 is arranged immediately before the two vertical plates covers the gap between the two vertical plates 222, 223, paragraph 0035, lines 5-6); and generating, based on a position of the gap in the workspace and based on the former weld, welding instructions for the welding robot to weld at least a portion of the gap (a new seam can be also added if the seam has not been identified because of assembly gap or a gap resulting
mismatch in the CAD model, paragraph 0051, lines 3-4). Wanner et al. does not disclose the position of the gap determined based on: a computer aided design (CAD) model depicting the multiple objects and generating, based on a position of the gap in the workspace and based on the former weld, welding instructions for the welding robot to weld at least a portion of the gap. Ge et al. teaches the position of the gap determined based on: a computer aided design (CAD) model depicting the multiple objects (models 210 and 220 are both CAD models, paragraph 0034, lines 4-5); and generating, based on a position of the gap in the workspace and based on the former weld, welding instructions for the welding robot to weld at least a portion of the gap (a new seam can be also added if the seam has not been identified because of assembly gap or a gap resulting mismatch in the CAD model, paragraph 0051, lines 3-4). Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art, having the teachings of Wanner et al. and Ge et al. before him or her, to modify the control unit of Wanner et al. with the process of Ge et al. in order to provide a combination for a welding system that provides support for both three dimensional and CAD environments.

With regards to claim 20, Wanner et al. discloses wherein: the welding robot includes a robot arm positioned in the workspace; the robot arm is coupled to a welding tool configured to weld two objects together along the gap (the profiles 3 and the steel plate 2, are welded by means of a robot, for example a bent arm robot, paragraph 0017, lines 1-2); and the former weld includes a tack weld (the profiles 3 are tack-welded onto the plate 2 by their standing surface, paragraph 0023, lines 1-2).

Claim 7 is rejected under 35 U.S.C. 103 as being unpatentable over Wanner et al. and Ge et al. as applied to claim 2 above, and further in view of Meess et al. (US20180130226A1).
With regards to claim 7, Wanner et al. and Ge et al. do not teach wherein the one or more sensors are coupled to the robot arm. Meess et al. teaches wherein the one or more sensors are coupled to the robot arm (position sensors on the welding gun may be used to calibrate the depth of the image. Such sensors can include, but are not limited to, magnetic sensors, optical sensors, acoustics sensors, and the like, which are sensed using an appropriate sensing system to allow for the positioning of the welding gun to be determined, paragraph 0045, lines 6-8). Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art, having the teachings of Wanner et al., Ge et al., and Meess et al. before him or her, to modify the robot arm and welding tool of Wanner et al. and Ge et al. with the sensors of Meess et al. in order to provide a combination for a welding system that precisely senses the location of the welding tool in space.

Claims 13 and 21 are rejected under 35 U.S.C. 103 as being unpatentable over Wanner et al. and Ge et al. as applied to claims 9 and 19 above, and further in view of Tateno et al. (EP2639766).

With regards to claim 13, Wanner et al. and Ge et al. do not teach wherein the position of the gap in the workspace is determined based on the CAD model and the one or more images. Tateno et al. teaches wherein the position of the gap in the workspace is determined based on the CAD model and the one or more images (a three-dimensional shape model, data obtained by capturing an image of a target object may be used in addition to a CAD model, paragraph 0041, lines 1-3).
Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art, having the teachings of Wanner et al., Ge et al., and Tateno et al. before him or her, to modify the robot controller of Wanner et al. and Ge et al. with the imaging of Tateno et al. in order to provide a more well-defined image model for a welding environment.

With regards to claim 21, Wanner et al. and Ge et al. teach wherein the welding instructions cause the welding robot to weld the portion of the gap in a direction away from the former weld (a new seam can be also added if the seam has not been identified because of assembly gap or a gap resulting mismatch in the CAD model, paragraph 0051, lines 3-4). Wanner et al. and Ge et al. do not teach identifying an expected position of the gap based on the CAD model; and receiving, from a sensor, the one or more images associated with the workspace; and wherein the position is determined based on the expected position and the one or more images. Tateno et al. teaches identifying an expected position of the gap based on the CAD model; and receiving, from a sensor, the one or more images associated with the workspace; and wherein the position is determined based on the expected position and the one or more images (three-dimensional shape model, data obtained by capturing an image of a target object may be used in addition to a CAD model. Note that in this embodiment, "geometric features" generally indicates both plane and line geometric features, paragraph 0041, lines 1-4). Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art, having the teachings of Wanner et al., Ge et al., and Tateno et al. before him or her, to modify the robot controller of Wanner et al. and Ge et al. with the imaging of Tateno et al. in order to provide a more well-defined image model for a welding environment.
Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to THOMAS JOHN WARD whose telephone number is (571) 270-1786. The examiner can normally be reached Monday - Friday, 7am - 4pm. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, STEVEN CRABB, can be reached at (571) 270-5095. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/THOMAS J WARD/
Examiner, Art Unit 3761

/EDWARD F LANDRUM/
Supervisory Patent Examiner, Art Unit 3761