DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-5, 8-10, 18, and 20 are rejected under 35 U.S.C. 103 as being unpatentable over Elyan et al., “Deep learning for symbols detection and classification in engineering drawings” (hereinafter Elyan) in view of Auh et al. (US 2021/0150081) (hereinafter Auh).
Examiner’s Note: The Elyan reference is best viewed in color, as originally published, in order to clearly see the generated equipment tag text and location data. Unfortunately, the USPTO’s computer systems convert documents to grayscale. Applicants’ representative(s) are invited to contact the examiner via email to obtain a color version of the Elyan reference in the interest of compact prosecution.
Regarding claims 1 and 20, Elyan teaches a line diagram conversion (LDC) platform and method comprising:
facilitate generation of a graphical user interface to interact with a user of a client device (fig. 9, GUI displaying diagram with various recognized symbols);
import single line diagram image data (pg. 93, fig. 1, input images; fig. 2, annotated P&ID diagrams);
generate, via an artificial intelligence (AI) equipment detection function and based on the single line diagram image data, equipment type and location data associated with one or more pieces of equipment detected in the single line diagram image data (pg. 93, fig. 2, deep learning equipment detection function that used the input single line diagram image data; pg. 95, fig. 4, on right side, red box location of sensor “103 TW 1097” with equipment type label of “sensor”; pg. 99, fig. 9, color coded location boxes around all the equipment, each with equipment type labels);
generate equipment tag text and location data associated with one or more equipment tags detected in the single line diagram image data (pg. 95, fig. 4, on right side, red box location of sensor “103 TW 1097” with equipment type label of “sensor”; pg. 99, fig. 9, color coded location boxes around all the equipment, each with equipment type labels);
generate, based on the equipment tag text and location data and the equipment type and location data, single line diagram display data having one or more equipment tags associated with the one or more equipment tags detected in the single line diagram image data and further having visual equipment indications associated with the one or more pieces of equipment detected in the single line diagram image data (fig. 9, color coded location boxes around all the equipment, each with equipment type labels); and
send the interactive single line diagram display data to the client device for display and interaction via the graphical user interface (fig. 9).
Elyan does not explicitly teach that the single line diagram display data is interactive. However, Auh teaches a platform comprising:
a network interface configured to communicate via a network (fig. 1, I/O interface 112); and
a processing system that includes a memory that stores operational instructions and at least one processor configured to execute the operational instructions (fig. 1, processors 126 and memory devices 128), wherein the operational instructions cause the at least one processor to:
facilitate generation of a graphical user interface to interact with a user of a client device (ph. [0075], “The visual representation may be rendered at the client device via a web browser or native application executing thereon.”);
import, via the network interface, single line diagram image data associated with an electrical wiring diagram (ph. [0102], “With this in mind, in some embodiments, the customer may provide the load list remotely via a network to supervisors, subordinates, clients, customers, vendors, suppliers, contractors, etc. As such, a server computing system may receive and analyze the load list to identify various electrical equipment that are part of the load list.”; ph. [0125], “the processor 126 receives client data from a user via the GUI (block 302). The client data may include an electrical load list (e.g., upload a spreadsheet indicative of the electrical load list via the GUI),”);
generate, based on the equipment tag text and location data and the equipment type and location data, interactive single line diagram display data having one or more interactive equipment tags associated with the one or more equipment tags detected in the single line diagram image data and further having visual equipment indications associated with the one or more pieces of equipment detected in the single line diagram image data (ph. [0148], “Visual and textual representation within the single-line drawing, the elevation drawing, and/or the engineering proposal document may be linked to each other or configured as interactive objects that cause the processor 126 to provide additional information regarding the object that corresponds to the visual and/or textual representation.”; ph. [0149], “Further, because features of the layout, the single-line drawing, the elevation drawing, the engineering proposal document are linked to each other, the processor 126 may dynamically update any of the features represented on each document or drawings in an efficient manner. For example, if the processor 126 receives an input of modification to an electrical component from the electrical load list, the processor 126 may update the electrical component within the base design, the layout, the single-line drawing, the elevation drawing, the engineering proposal document, and the like without independently changing inputs for each respective document. Furthermore, a change to an electrical component within the layout, the single-line drawing, the elevation drawing, the engineering proposal document also be automatically applied to other visualizations. For example, a change to an electrical component within the layout is automatically reflected as a similar change to the electrical component in the single-line drawing, the elevation drawing, and the engineering proposal document.”); and
send, via the network interface, the interactive single line diagram display data to the client device for display and interaction via the graphical user interface (ph. [0075], “The visual representation may be rendered at the client device via a web browser or native application executing thereon.”).
One of ordinary skill in the art before the effective filing date would have been motivated to modify Elyan in the manner taught by Auh to increase user engagement with the data and improve data exploration and refinement.
Regarding claim 2, the Elyan/Auh combination teaches the platform of claim 1. Elyan further teaches the AI equipment detection function is implemented via a computer vision model (pg. 94, “For locating and recognising symbols in the P&IDs, we propose to use YOLO method”).
Regarding claim 3, the Elyan/Auh combination teaches the platform of claim 2. Elyan further teaches the computer vision model is trained based on a training dataset including line diagram image data with annotations (pg. 93, fig. 2, Annotated P&ID Diagrams split into training set and testing set). Elyan does not explicitly teach the line diagram image data is SLD image data. However, Auh teaches the line diagram image data is SLD image data (ph. [0061], “a single-line diagram representing the motor control lineup. The single-line diagram may be a visual representation of the electrical system(s) of the industrial automation project 130. The electrical system may be complex and the single-line diagram provides a simplified visual representation of the electrical system in two-dimensions.”). One of ordinary skill in the art before the effective filing date would have been motivated to modify the P&ID system of Elyan to analyze SLD data of breakers, transformers, etc. in a similar manner to increase user engagement with the data and improve data exploration and refinement of SLD data.
Regarding claim 4, the Elyan/Auh combination teaches the platform of claim 3. Elyan further teaches the computer vision model is retrained in response to at least one of: when images of new electrical symbols are added to the training dataset; or when errors are found in the AI equipment detection function (pg. 98, “The experiment aims first at generating more symbols using MFC-GAN model. Then these synthesised samples will be used to augment the training set aiming at improving classification results.”).
Regarding claim 5, the Elyan/Auh combination teaches the platform of claim 1. Elyan further teaches the equipment tag text and location data indicates the text of the one or more equipment tags and location information indicating a position of each of the one or more equipment tags in the diagram (pg. 95, fig. 4, right diagram, red box sensor, pink box DBBPV, purple box DB&BBV; see also pg. 99, fig 9).
Regarding claim 8, the Elyan/Auh combination teaches the platform of claim 4. Elyan further teaches the equipment tag text and location data includes: a tag name and tag location; and a tag description and equipment specification with a corresponding location (pg. 95, fig. 4, right diagram, red box sensor, pink box DBBPV, purple box DB&BBV; see also pg. 99, fig 9).
Regarding claim 9, the Elyan/Auh combination teaches the platform of claim 1. Auh further teaches the one or more interactive equipment tags includes links corresponding to the one or more equipment tags (ph. [0148], “Visual and textual representation within the single-line drawing, the elevation drawing, and/or the engineering proposal document may be linked to each other or configured as interactive objects that cause the processor 126 to provide additional information regarding the object that corresponds to the visual and/or textual representation.”; ph. [0149], “Further, because features of the layout, the single-line drawing, the elevation drawing, the engineering proposal document are linked to each other, the processor 126 may dynamically update any of the features represented on each document or drawings in an efficient manner. For example, if the processor 126 receives an input of modification to an electrical component from the electrical load list, the processor 126 may update the electrical component within the base design, the layout, the single-line drawing, the elevation drawing, the engineering proposal document, and the like without independently changing inputs for each respective document. Furthermore, a change to an electrical component within the layout, the single-line drawing, the elevation drawing, the engineering proposal document also be automatically applied to other visualizations. For example, a change to an electrical component within the layout is automatically reflected as a similar change to the electrical component in the single-line drawing, the elevation drawing, and the engineering proposal document.”). One of ordinary skill in the art before the effective filing date would have been motivated to modify Elyan in the manner taught by Auh to increase user engagement with the data and improve data exploration and refinement.
Regarding claim 10, the Elyan/Auh combination teaches the platform of claim 1. Auh further teaches the one or more interactive equipment tags includes an interactive button presented in conjunction with the graphical user interface and associated with a corresponding one of the one or more equipment tags (ph. [0095], “The GUI 175 may also include various functional buttons that can be used during design of the motor control lineup or after a final layout is determined. For example, a first button 258 may be used to save any changes made to the motor control lineup. A second button 260 may generate and submit a request for proposal to the manufacturer of the motor control lineup. A third button 262 may generate one or more documents related to the motor control lineup. For example, the documents related to the motor control lineup may include a request for proposal, a request for quotation, a corresponding single-line diagram, a formal elevation view, and the like.”). One of ordinary skill in the art before the effective filing date would have been motivated to modify Elyan in the manner taught by Auh to increase user engagement with the data and improve data exploration and refinement.
Regarding claim 18, the Elyan/Auh combination teaches the platform of claim 1. Auh further teaches the operational instructions cause the at least one processor to: generate recommendations data via a recommendations engine that is coupled to a database that includes equipment information, electrical specifications and project data (ph. [0086], “Each recommended motor control lineup 232, 242, 244 may include a title 234, a list of one or more features 236, and an estimated cost 238. The title 234 and the list of features 236 may be generated by the computing system based on the input data provided by the customer.”);
wherein the interactive single line diagram display data is generated further to indicate the recommendations data (fig. 4, recommendation 232).
Claim 6 is rejected under 35 U.S.C. 103 as being unpatentable over the Elyan/Auh combination as applied to claim 1 above, and further in view of Sowell et al. (US 2021/0174077) (hereinafter Sowell).
Regarding claim 6, the Elyan/Auh combination teaches the platform of claim 1. The combination does not explicitly teach using a text extraction and location function that operates via text recognition. However, Sowell teaches a text extraction and location function that operates via text recognition (figs. 2A-2B; ph. [0059], “In some embodiments, the first result comprises separating the recognized text and saving the text locations in CSV format for use in text NLP based search and display of piping and instrumentation diagrams.”).
Allowable Subject Matter
Claims 7, 11-17, and 19 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure:
Xu et al. (US 2022/0156418) teaches progress tracking with automatic symbol detection.
Jahjah et al. (US 2022/0043547) teaches techniques for labeling, reviewing and correcting label predictions for P&IDs.
Frey et al. (US 2021/0103686) teaches generating a digital model of a building.
Powles et al. (US 2022/0391627) teaches rapid and accurate modeling of a building construction structure using AI.
Tyulyaev et al. (US 2020/0387553) teaches digitization of technical documentation driven by machine learning.
Austern et al. (US 2021/0256180) teaches automatic extraction of data from 2D floor plans for retention in building information models.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to BRIAN W WATHEN whose telephone number is (571)270-5570. The examiner can normally be reached M-F 9:00am-5:30pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, James Trujillo can be reached at 571-272-3677. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
BRIAN W. WATHEN
Primary Examiner
Art Unit 2151