DETAILED ACTION
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claims 1-20 are pending.
This Action is Non-Final.
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claims 1-3, 9-11, 17, and 18 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Hoban (Pulumi AI GitHub repository from 11 April 2023, pages 3-4 showing the date for the relied-upon “9f5cf31” commit).
As per claims 1, 9, and 17, Hoban discloses a medium with instructions and a system with memory storing instructions executed by a processor (see pages 6-7, where the hardware must be present to install and run the software) to execute a computer-implemented method for configuring a computing infrastructure using a cloud platform, the computer-implemented method comprising: storing a plurality of schemas, each schema describing application programming interfaces for interacting with a cloud platform of a plurality of cloud platforms (see page 7, where the different cloud environments are configured, and pages 17-28, showing a required library that includes specific schemas for the cloud providers – AWS and GCP are shown as exemplary cloud schemas);
receiving, via a user interface, a natural language request for generating infrastructure-as-code for configuring a computing infrastructure using a target cloud platform (see pages 7-9 showing natural language requests, with examples for different AWS environments);
generating a prompt for providing to a machine learning based language model, the prompt requesting the machine learning based language model to generate infrastructure-as-code using a configuration language to configure the computing infrastructure for the target cloud platform in accordance with the natural language request (see pages 7-9 and pages 11-13 showing the code for generating a prompt that will be sent to the AI model requesting the IaC in a specific “langcode” language);
providing, to the machine learning based language model, the prompt with a request for executing the machine learning based language model; receiving, from the machine learning based language model, infrastructure-as-code specified using the configuration language (see pages 7-9 and 14-15 where the code shows the prompt is sent to the AI model and the response is received); and
displaying via the user interface, the infrastructure-as-code specified using the configuration language (see pages 7-8 where the “!program” command displays the code).
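For reference, the claimed pipeline mapped above can be sketched as follows. This is an illustrative sketch of the claim language only, not Hoban's actual code: the language-model call is stubbed out, and all names (build_prompt, call_language_model, generate_iac) are hypothetical.

```python
# Illustrative sketch of the method of claims 1, 9, and 17; not Hoban's code.
# The model call is a stub standing in for a pretrained language model.

def build_prompt(request: str, platform: str, langcode: str) -> str:
    """Generate a prompt asking the model for IaC in a given configuration language."""
    return (
        f"Write a {langcode} program that targets {platform} "
        f"and does the following: {request}"
    )

def call_language_model(prompt: str) -> str:
    """Stub standing in for the machine learning based language model."""
    return "# infrastructure-as-code returned by the model\n" + f"# for prompt: {prompt}"

def generate_iac(request: str, platform: str = "aws", langcode: str = "typescript") -> str:
    prompt = build_prompt(request, platform, langcode)   # generate the prompt
    iac = call_language_model(prompt)                    # provide prompt, receive IaC
    print(iac)                                           # display via the user interface
    return iac

generate_iac("create an S3 bucket with versioning enabled")
```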
As per claims 2, 10, and 18, Hoban discloses configuring the target cloud platform using the infrastructure-as-code specified using the configuration language (see pages 7-9 where the generated code is deployed).
As per claims 3 and 11, Hoban discloses the configuration language is one of following: JavaScript, TypeScript, Python, Go, C#, F#, or HCL (see page 13).
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 4-6, 12-14, 19, and 20 are rejected under 35 U.S.C. 103 as being unpatentable over Hoban as applied to claims 1, 9, and 17 above, in view of Havewala et al. (US 20250123810).
As per claims 4-6, 12-14, 19, and 20, Hoban discloses the use of schemas as part of generating IaC using a pretrained machine learning model (see pages 8-9, where the prompt will include the specific portion/type of container involved, e.g. VPC v. S3, and uses OpenAI models, i.e. pretrained models), where the schemas are stored in a structured index (see pages 17-28, where the schemas are stored in specific structures), but fails to explicitly disclose the prompt including a portion of the schema so that the machine learning model uses the prompt and the schema to generate the IaC, and further training the model based on the schemas.
However, Havewala et al. teaches a prompt including a portion of the schema so that the machine learning model uses the prompt and the schema to generate the IaC, and further training the model based on the schemas (see paragraphs [0036]-[0039]).
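The schema-in-prompt teaching attributed to the Havewala combination can be sketched as below. This is an illustrative sketch only; the index contents, resource names, and function names are hypothetical, not taken from either reference.

```python
# Illustrative sketch of including a portion of a stored schema in the prompt,
# so the model generates IaC consistent with that schema. All names are hypothetical.

SCHEMA_INDEX = {
    ("aws", "s3.Bucket"): {"properties": {"bucket": "string", "versioning": "object"}},
    ("gcp", "storage.Bucket"): {"properties": {"name": "string", "location": "string"}},
}

def build_prompt_with_schema(request: str, platform: str, resource: str) -> str:
    # Look up only the relevant portion of the stored schema index.
    schema = SCHEMA_INDEX[(platform, resource)]
    return (
        f"Using the following resource schema: {schema}\n"
        f"generate infrastructure-as-code for {platform} that: {request}"
    )

print(build_prompt_with_schema("creates a versioned bucket", "aws", "s3.Bucket"))
```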
Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to include the schema details of Havewala et al. in the Hoban system.
The motivation to do so, as would have been recognized by one of ordinary skill in the art, would have been to obtain better and more precise results from the model.
Claims 7, 8, 15, and 16 are rejected under 35 U.S.C. 103 as being unpatentable over Hoban as applied to claims 1 and 9 above, in view of Bathula (US 20240311087).
As per claims 7, 8, 15, and 16, Hoban discloses limitations substantially similar to those in claims 1 and 9: receiving, via a user interface, a natural language request for generating infrastructure-as-code for configuring a computing infrastructure using a target cloud platform (see pages 7-9 showing natural language requests, with examples for different AWS environments); generating a prompt for providing to a machine learning based language model, the prompt requesting the machine learning based language model to generate infrastructure-as-code using a configuration language to configure the computing infrastructure for the target cloud platform in accordance with the natural language request (see pages 7-9 and pages 11-13 showing the code for generating a prompt that will be sent to the AI model requesting the IaC in a specific “langcode” language); providing, to the machine learning based language model, the prompt with a request for executing the machine learning based language model; receiving, from the machine learning based language model, infrastructure-as-code specified using the configuration language (see pages 7-9 and 14-15 where the code shows the prompt is sent to the AI model and the response is received); and displaying, via the user interface, the infrastructure-as-code specified using the configuration language (see pages 7-8 where the “!program” command displays the code); and Hoban furthermore has the ability to construct prompts for different cloud platforms and different configuration languages, but fails to explicitly disclose performing these steps.
However, Bathula teaches the ability to select different cloud platforms and different configuration languages for generating code (see paragraph [0114] and Fig. 19).
Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to allow the user to generate code for different cloud platforms and/or using different configuration languages in the Hoban system.
Motivation to do so would have been to allow the user to generate multiple sets of source code in different programming languages from a single project design codebase (see Bathula paragraph [0114]).
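The multi-platform, multi-language capability addressed by the Bathula combination can be sketched as follows. This is an illustrative sketch only: the model call is stubbed, and the function name and target pairs are hypothetical, not drawn from either reference.

```python
# Illustrative sketch of generating IaC for user-selected cloud platforms and
# configuration languages from a single natural language request.
# The model output is stubbed; all names are hypothetical.

def generate_for_targets(request: str, targets: list[tuple[str, str]]) -> dict:
    """Produce one IaC program per (platform, configuration language) pair."""
    results = {}
    for platform, langcode in targets:
        prompt = f"Generate {langcode} infrastructure-as-code for {platform}: {request}"
        # Stub standing in for the language-model call.
        results[(platform, langcode)] = f"# stubbed model output for: {prompt}"
    return results

out = generate_for_targets(
    "deploy a static website",
    [("aws", "python"), ("gcp", "typescript")],
)
```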
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure: the remaining references put forth on the PTO-892 form are directed towards the generation of IaC.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to MICHAEL J PYZOCHA whose telephone number is (571)272-3875. The examiner can normally be reached Monday-Thursday 7:30am-5:00pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Hadi Armouche can be reached at (571) 270-3618. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/Michael Pyzocha/ Primary Examiner, Art Unit 2409