Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
This action is in response to the amendment filed on 11/10/2025.
Claims 1-20 are pending in the application.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.
Claims 1, 9, 10 and 19 are rejected under 35 U.S.C. 103 as being unpatentable over Wilson et al. (US20230146421, hereafter Wilson, cited) in view of Sayre et al. (US20220043971, hereafter Sayre) and Stone et al. (US20220269738, hereafter Stone).
Per claim 1:
Wilson teaches:
A computer-implemented method, comprising: receiving, by a processor set and through a back-end user interface, input data comprising at least one code snippet and data types which define aspects of a front-end user interface (Wilson, see at least [0003], receive, by way of a platform UI builder, selection of a UI component definition from the plurality of UI component definitions; bind, by way of input entered into the platform UI builder, data to the UI component definition, wherein the data is from a data source, and wherein the input is a programmatic statement that references the data source or a set of values that references the data source; generate, by way of the platform UI builder, metadata representing the input; [0039], the developer may define the data model, which specifies the types of data that the application uses and the relationships therebetween; [0060] Server devices 202 may be configured to transmit data to and receive data from data storage 204. This transmission and retrieval may take the form of SQL queries or other types of database queries, and the output of such queries, respectively. Additional text, images, video, and/or audio may be included as well. Furthermore, server devices 202 may organize the received data into web page or web application representations; [0125]; [0172]; Fig. 9 and associated text; Note that the UI component definitions specify data types);
generating, by the processor set, rendering instructions based on the input data (Wilson, see at least [0160] The generation of JSON file 912 from no-code graphical user interface 900 need not rely on context-free grammars. Instead, the hierarchy of blocks and elements of JSON file 912 can be generated directly from what is specified in no-code graphical user interface 900. To that point, each operation specified in no-code graphical user interface 900 is associated with a number of operands also specified in no-code graphical user interface 900; [0174] Block 1104 may involve generating, by way of the platform UI builder, metadata representing the input; [0128]; [0159]; [0175]; Note that a custom UI component is created that incorporates the data into the UI definition based on the metadata).
receiving, by the processor set and from a front-end platform, a request to view the front- end user interface (Wilson, see at least [0163] this graphical user interface updates the display of the UI components in real time based on changes made to their configurations. Thus, the user is presented with an accurate representation of how these UI components would actually appear when integrated into an application;
[0169] When the resulting custom UI components are incorporated into a graphical user interface, platform runtime 708 may populate these custom UI components with data obtained from data sources specified in the metadata and possibly transformed as specified in the metadata. Further, when a data source is updated (e.g., an application state change) or a custom UI component is modified by an end user (e.g., by selection of an option), platform runtime 708 may update the custom UI component as displayed to be consistent with the update and/or the modification; [0179]; [0188] wherein the component selector section displays the plurality of UI component definitions and allows selection of the UI component definition, wherein the component display workspace section displays a live representation of the custom UI component populated with the data, and wherein the component configuration section allows entry of the input. In some embodiments, entry of the input causes regeneration of the live representation of the custom UI component populated with the data; Note that the platform runtime is displaying the front-end UI with the generated UI components based on a request to display (view) the interface).
in response to receiving the request to view the front-end user interface, communicating, by the processor set, the rendering instructions to the front-end platform, wherein the rendering instructions cause the front-end platform to dynamically render the front-end user interface (Wilson, see at least [0151] the string displayed in text box 812 is dynamically determined based on application state (e.g., state stored in a server device or provided by a client device). This means that as the state changes, the displayed string may also change accordingly; [0166] This text box contains a low-code statement being used to dynamically control the content displayed at the top left of the list shown in section 1002; [0129]; [0163]; [0175]; [0177]; Note that the platform runtime 708 facilitates the display of custom graphical user interfaces and updates the display of the UI components in real time based on changes made to their configurations).
Wilson does not explicitly teach input data comprising at least one prompt type and at least one associated answer type. However, a particular UI component specific to a prompt/answer, such as a questionnaire or survey, is a mere data type variance or selection. Nonetheless, Sayre particularly teaches input data comprising at least one prompt type and at least one associated answer type (Sayre, see at least [0015] the reflexive questionnaire with the set of questions based on the version identifier and the set of answers based on the set of metadata responsive to the request). It would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to have modified Wilson’s custom GUI generation to incorporate the questionnaire prompt and answer types taught by Sayre’s dynamically rendered online questionnaire, with a reasonable expectation of success, since the references are analogous art from the same field of endeavor, namely user interface generation. Combining Sayre’s functionality with that of Wilson results in a system that incorporates survey or questionnaire UI components. The modification would be obvious because one having ordinary skill in the art would be motivated to make this combination to allow dynamic rendering of a reflexive questionnaire as the UI components if desired as a design option (Sayre, see at least [0015]).
Wilson and Sayre do not explicitly teach determining, by the processor set, that the front-end user interface is displayed on a web-browser by parsing the at least one code snippet; creating, by the processor set, a customized widget based on a determination that the front-end user interface is displayed on the web browser. However, Stone teaches these limitations (Stone, see at least [0028] When a user of a client device 106 navigates a browser or similar client application 108 to a uniform resource locator (URL) address associated with the web application that includes a reference to the configured process flow web component 124, 200, the client application 108 retrieves or otherwise obtains the configured process flow web component 124, 200 via the page generator application 150 and processes or otherwise executes the process flow HTML or other presentation code 202 and the process flow JavaScript or other behavioral code 204 to generate a web page GUI display within the client application 108 at the client device 106. …based on the process flow behavioral code 204 and/or the behavioral code 214 associated with the currently displayed web component(s) 124, 210, the client application 108 and/or the page generator application 150 responds to user actions or other events to dynamically update the web page GUI display at the client application 108 to advance or progress through the process flow associated with the web application to obtain and render configured web component(s) 124, 210 in accordance with the configured component sequence metadata 206; [0030] customizable web components; [0046] when a developer user selects or otherwise manipulates an activate button or similar GUI element within the process flow builder GUI display to deploy or otherwise implement the process flow defined within the process flow editing region within a web application, … the browser application 108 retrieves or otherwise obtains the HTML file for the web page associated with the web application at that address from the network 110 and then parses or otherwise executes the HTML code to generate the web page GUI display associated with the web application within the browser application 108 – note that a user interface is displayed on a web browser and when a user navigates to a URL, the browser acts as an HTTP client to fetch, parse and render HTML and web components into the interface). It would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to have combined Stone’s customization based on a user interface displayed on a web browser with Wilson’s custom GUI generation and Sayre’s dynamically rendering online questionnaire, with a reasonable expectation of success, since the references are analogous art from the same field of endeavor, namely user interfaces. Combining Stone’s functionality with that of Wilson and Sayre results in a system that incorporates user interface customization associated with a web browser. The modification would be obvious because one having ordinary skill in the art would be motivated to make this combination to provide a customized web page GUI display at a client device (Stone, see at least [0028], [0030], [0046]).
9. The computer-implemented method of claim 1, wherein the rendering instructions cause the front-end platform to dynamically render the front-end user interface by causing the front-end platform to dynamically render a plurality of different types of user interaction components configured to cause at least one action selected from a group consisting of receive at least one input and generate at least one output (Wilson, see at least [0043]; [0163]; [0118] In order to make graphical user interfaces interactive, each UI component can be programmed with code that specifies one or more data sources as well as transformations to be performed on data from these data sources that specify how the data is to be displayed within the component; [0019] FIGS. 8C and 8D depict dynamic UI components that can be generated from interpreting the metadata with associated application state; [0151] In this manner, the string displayed in text box 812 is dynamically determined based on application state (e.g., state stored in a server device or provided by a client device). This means that as the state changes, the displayed string may also change accordingly; [0163] this graphical user interface updates the display of the UI components in real time based on changes made to their configurations. Thus, the user is presented with an accurate representation of how these UI components would actually appear when integrated into an application; [0166] This text box contains a low-code statement being used to dynamically control the content displayed at the top left of the list shown in section 1002; Note that different types of user interaction components can be specified).
Per claims 10 and 19, they are the product and system versions of claim 1, respectively, and are rejected for the same reasons set forth in connection with the rejection of claim 1 above.
Claims 2-4 and 11 are rejected under 35 U.S.C. 103 as being unpatentable over Wilson in view of Sayre, Stone and Procopio et al. (US 20230185544, hereafter Procopio).
Per claim 2:
Wilson further teaches:
compiling the at least one code snippet; and rendering the customized widget in the front-end platform, wherein generating the rendering instructions comprises converting the at least one prompt type and the at least one associated answer type into a JavaScript Object Notation (JSON) formatted input data (Wilson, see at least [0128] metadata 706 may take the form of structured data, such as XML or JavaScript Object Notation (JSON); [0142] FIG. 8A depicts translation between low-code statement 800 (which is the IF statement given above) and JSON file … low-code module 704A uses a compiler-like function to translate low-code statement 800 into the syntax tree of JSON file 802; [0125] In FIG. 7, data source 700 and UI component definition 702 serve as input to platform UI builder 704, which in turn produces metadata that can be used to customize UI component definition 702; [0159]; Note that the no-code module is triggered to create a JSON file).
Wilson does not explicitly teach determining credentials of a superuser by performing back-end service checks of the at least one code snippet. Procopio teaches determining credentials of a superuser by performing back-end service checks of the at least one code snippet (Procopio, see at least [0046] the no-code backend 160 stores credentials from the user 12a for providing authorization (e.g., via OAuth or other authentication standards) when calling the function 164 during execution of the application 190. The permissions 196b of the function 164 may be a subset of the permissions 196a of the application 190. The permissions 196 required by the function 164 may be dependent upon the values or expressions provided by the user 12a for the parameters 430 associated with the function 164 (e.g., based on the data required to be accessed to evaluate the expressions). In some implementations, the no-code backend 160 (e.g., via the GUI 14a) may require re-authorization when the required permissions 196 for the script 162 and/or function 164 change (e.g., in response to changing the values for the parameters 430). The GUI 14a may provide a list of authorizations needed for each selected function 164; [0045], The no-code backend 160 requires the user 12a to provide authorization for execution of the function 164 (e.g., via user interaction with the authorization user input 510 using cursor 210) prior to publishing or deploying the application 190).
It would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to have combined Procopio’s credential check with Stone’s customization based on a user interface displayed on a web browser, Wilson’s custom GUI generation, and Sayre’s dynamically rendering online questionnaire, with a reasonable expectation of success, since the references are analogous art from the same field of endeavor, namely user interface and code development. Combining Procopio’s functionality with that of Wilson, Stone, and Sayre results in a system that incorporates the credential check. The modification would be obvious because one having ordinary skill in the art would be motivated to make this combination to provide access control for security purposes (Procopio, see at least [0045]-[0046]).
3. The computer-implemented method of claim 2, wherein generating the rendering instructions comprises parsing the JSON formatted input data into at least one pre-built widget (Wilson, see at least [0142] FIG. 8A depicts translation between low-code statement 800 (which is the IF statement given above) and JSON file; [0033] The aPaaS system may support standardized application components, such as a standardized set of widgets for graphical user interface (GUI) development; [0043] Further, user interaction with GUI elements, such as buttons, menus, tabs, sliders, checkboxes, toggles, etc. may be referred to as “selection”, “activation”, or “actuation” thereof. These terms may be used regardless of whether the GUI elements are interacted with by way of keyboard, pointing device, touchscreen, or another mechanism; [0082]; [0118] These UI components could be, but are not limited to, avatars, badges, buttons, calendars, cards, checkboxes, containers, forms, icons, images, lists, menus, messages, notifications, pickers, progress bars, sidebars, sliders, tabs, text boxes, toggles, and so on. In order to make graphical user interfaces interactive; [0129]; [0159]; Note that a widget is a specific interactive type of UI component and can be elements such as buttons, icons, a calendar, or a music player).
4. The computer-implemented method of claim 3, wherein communicating the rendering instructions comprises communicating the at least one pre-built widget to the front-end platform, wherein the at least one pre-built widget causes the front-end platform to dynamically render the front-end user interface (Wilson, see at least [0151] the string displayed in text box 812 is dynamically determined based on application state (e.g., state stored in a server device or provided by a client device). This means that as the state changes, the displayed string may also change accordingly; [0166] This text box contains a low-code statement being used to dynamically control the content displayed at the top left of the list shown in section 1002; [0129]; [0163]; [0175]; [0177]; Note that the platform runtime 708 facilitates the display of custom graphical user interfaces and updates the display of the UI components in real time based on changes made to their configurations).
Per claim 11, it is the product version of claim 2 and is rejected for the same reasons set forth in connection with the rejection of claim 2 above.
Claims 5-8 are rejected under 35 U.S.C. 103 as being unpatentable over Wilson in view of Sayre, Stone and Edelblut et al. (US20230418442, hereafter Edelblut).
Per claim 5:
Wilson further teaches: detecting credentials of a superuser at the back-end user interface; and detecting whether the front-end user is using a web-based interface that supports at least one language selected from a group consisting of Cascading Style Sheets (CSS), JavaScript, and Hypertext Markup Language (HTML), or a mobile interface (Wilson, see at least [0042] Such an aPaaS system may represent a GUI in various ways. For example, a server device of the aPaaS system may generate a representation of a GUI using a combination of HTML and JAVASCRIPT®; [0060]; [0111] In order for discovery to take place in the manner described above, proxy servers 312, CMDB 500, and/or one or more credential stores may be configured with credentials for one or more of the devices to be discovered. Credentials may include any type of information needed in order to access the devices. These may include userid/password pairs, certificates, and so on. In some embodiments, these credentials may be stored in encrypted fields of CMDB 500. Proxy servers 312 may contain the decryption key for the credentials so that proxy servers 312 can use these credentials to log on to or otherwise access devices being discovered; [0159]; [0162]; [0166], Note that a JSON file is generated and the UI is displayed on the user device and JavaScript, CSS, or HTML can be used for the representation).
Wilson teaches displaying a customized widget (Wilson, see at least [0154] No-code module 704B may display a graphical user interface through which certain options relating to data sources and UI component definitions can be selected, so that specific data from a data source may be used in a custom UI component; [0165]; [0163] this graphical user interface updates the display of the UI components in real time based on changes made to their configurations). Wilson does not explicitly teach displaying a widget on a virtual web browser instance. Edelblut teaches displaying a widget on a virtual web browser instance (Edelblut, see at least [0060] the VR control can be disposed alongside content displayed by the browser or embedded in the general UI provided by the virtual web browser 604; [0064], the user can select a VR control associated with the tab and displayed as part of the UI of the virtual web browser 604. This way, the user can become further educated about the advertisement first seen in the current virtual world and learn about additional information and experiences associated with the advertiser). It would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to have combined Edelblut’s displaying a UI on a virtual web browser with Wilson’s custom GUI generation, Stone’s web-browser-based customization, and Sayre’s dynamically rendering online questionnaire, with a reasonable expectation of success, since the references are analogous art from the same field of endeavor, namely user interfaces. Combining Edelblut’s functionality with that of Wilson, Stone, and Sayre results in a system that incorporates user interface display on a virtual web browser.
The modification would be obvious because one having ordinary skill in the art would be motivated to make this combination to provide a secure and isolated environment (Edelblut, see at least [0060], [0064]).
6. The computer-implemented method of claim 5 further comprising, in response to detecting that the web-based interface supports at least one language selected from a group consisting of CSS, JavaScript, and HTML: receiving at least one code snippet as user input; parsing the at least one code snippet; modifying the at least one code snippet into JSON format; and wherein communicating the rendering instructions to the front-end platform comprises communicating a modified code snippet to the front-end platform, wherein the modified code snippet causes the front-end platform to dynamically render the front-end user interface (Wilson, see at least [0042] Such an aPaaS system may represent a GUI in various ways. For example, a server device of the aPaaS system may generate a representation of a GUI using a combination of HTML and JAVASCRIPT®; [0060]; [0142] FIG. 8A depicts translation between low-code statement 800 (which is the IF statement given above) and JSON file 802; [0151] the string displayed in text box 812 is dynamically determined based on application state (e.g., state stored in a server device or provided by a client device). This means that as the state changes, the displayed string may also change accordingly; [0166] This text box contains a low-code statement being used to dynamically control the content displayed at the top left of the list shown in section 1002; [0129]; [0163]; [0175]; [0136]; [0146]; [0164] Each of these UI component definitions may be an off-the-shelf definition or may be customized in some fashion. [0169] platform runtime 708 may populate these custom UI components with data obtained from data sources specified in the metadata and possibly transformed as specified in the metadata. 
Further, when a data source is updated (e.g., an application state change) or a custom UI component is modified by an end user (e.g., by selection of an option), platform runtime 708 may update the custom UI component as displayed to be consistent with the update and/or the modification. In this manner, platform runtime 708 can be thought of as “executing” the metadata each time the custom UI component needs to be updated; [0177]; [0159]; [0162]; [0166], Note that a JSON file is generated and the UI is displayed on the user device and JavaScript, CSS, or HTML can be used for the representation; Note that the platform runtime 708 facilitates the display of custom graphical user interfaces and updates the display of the UI components in real time based on changes made to their configurations).
7. The computer-implemented method of claim 5, further comprising, in response to detecting that the web-based interface supports a mobile interface: performing at least one back-end service check of at least one code snippet; wherein communicating the rendering instructions comprises communicating the at least one code snippet to the front-end platform; and compiling the at least one code snippet wherein the at least one code snippet causes the front-end platform to dynamically render the front-end user interface (Wilson, see at least [0145] Thus, low-code statement 800 can be “compiled” into JSON file 802, and JSON file 802 can be “reverse-compiled” into low-code statement 800. Likewise, the arrow between low-code statement 800 and JSON tree 804 is also bidirectional; [0172]; [0177]; [0041] The aPaaS system may also support a rich set of pre-defined functionality that can be added to applications. These features include support for searching, email, templating, workflow design, reporting, analytics, social media, scripting, mobile-friendly output, and customized GUIs; [0170] the process could be carried out by a computational instance of … a portable computer, such as… a tablet device; Note that the user interface is displayed on the portable computer such as a tablet device with a mobile-friendly output).
8. The computer-implemented method of claim 5, further comprising receiving at least one code snippet in at least one of JavaScript or CSS via a cross-platform development framework (Wilson, see at least [0060]; [0078]; [0042] Such an aPaaS system may represent a GUI in various ways. For example, a server device of the aPaaS system may generate a representation of a GUI using a combination of HTML and JAVASCRIPT®; [0081] These servers may be virtualized (i.e., the servers may be virtual machines). Examples of public cloud networks 340 may include AMAZON WEB SERVICES® and MICROSOFT® AZURE®; [0128]; [0131] platform UI builder 704 may incorporate data from multiple data sources and/or multiple UI component definitions in order to generate metadata 706. Further, platform UI builder 704 could be on the same or a different platform as that of platform runtime 708; Note that AWS is cross-platform, and the references to different or multiple platforms correspond to a cross-platform development framework).
Claims 12-18 and 20 are rejected under 35 U.S.C. 103 as being unpatentable over Wilson in view of Sayre, Stone, Procopio et al. (US 20230185544, hereafter Procopio) and Edelblut et al. (US20230418442, hereafter Edelblut).
Per claim 12:
wherein generating the rendering instructions comprises parsing the JSON formatted input data into at least one pre-built widget (Wilson, see at least [0142] FIG. 8A depicts translation between low-code statement 800 (which is the IF statement given above) and JSON file; [0033] The aPaaS system may support standardized application components, such as a standardized set of widgets for graphical user interface (GUI) development; [0043] Further, user interaction with GUI elements, such as buttons, menus, tabs, sliders, checkboxes, toggles, etc. may be referred to as “selection”, “activation”, or “actuation” thereof. These terms may be used regardless of whether the GUI elements are interacted with by way of keyboard, pointing device, touchscreen, or another mechanism; [0082]; [0118] These UI components could be, but are not limited to, avatars, badges, buttons, calendars, cards, checkboxes, containers, forms, icons, images, lists, menus, messages, notifications, pickers, progress bars, sidebars, sliders, tabs, text boxes, toggles, and so on. In order to make graphical user interfaces interactive; [0033]; [0129]; [0159]; Note that a widget is a specific interactive type of UI component and can be an element such as a button, an icon, a calendar, or a music player).
Wilson teaches displaying a customized widget (Wilson, see at least [0154] No-code module 704B may display a graphical user interface through which certain options relating to data sources and UI component definitions can be selected, so that specific data from a data source may be used in a custom UI component; [0165]; [0163] this graphical user interface updates the display of the UI components in real time based on changes made to their configurations).
Wilson does not explicitly teach displaying a widget on a virtual web browser instance. Edelblut teaches displaying a widget on a virtual web browser instance (Edelblut, see at least [0060] the VR control can be disposed alongside content displayed by the browser or embedded in the general UI provided by the virtual web browser 604; [0064] the user can select a VR control associated with the tab and displayed as part of the UI of the virtual web browser 604. This way, the user can become further educated about the advertisement first seen in the current virtual world and learn about additional information and experiences associated with the advertiser). It would have been obvious for one having ordinary skill in the art before the effective filing date of the claimed invention to have combined Edelblut’s display of a widget on a virtual web browser with Wilson’s custom GUI generation and Sayre’s dynamic rendering of an online questionnaire, with a reasonable expectation of success, since the references are analogous art from the same field of endeavor relating to user interfaces. Combining Edelblut’s functionality with that of Wilson, Stone, and Sayre results in a system that incorporates user interface display on a virtual web browser. The modification would have been obvious because one having ordinary skill in the art would have been motivated to make this combination to provide a secure and isolated environment (Edelblut, see at least [0060] the VR control can be disposed alongside content displayed by the browser or embedded in the general UI provided by the virtual web browser 604; [0064] the user can select a VR control associated with the tab and displayed as part of the UI of the virtual web browser 604. This way, the user can become further educated about the advertisement first seen in the current virtual world and learn about additional information and experiences associated with the advertiser).
Per claims 13-18, they are the product versions of claims 4-9, respectively, and are rejected for the same reasons set forth in connection with the rejection of claims 4-9 above.
Per claim 20, it is the system version of claim 12 and is rejected for the same reasons set forth in connection with the rejection of claim 12 above.
Examiner’s Note
The Examiner has pointed out particular references contained in the prior art of record within the body of this action for the convenience of the Applicant. Although the specified citations are representative of the teachings in the art and are applied to the specific limitations within the individual claim, other passages and figures may apply. Applicant, in preparing the response, should consider fully the entire reference as potentially teaching all or part of the claimed invention, as well as the context of the passage as taught by the prior art or disclosed by the Examiner.
Response to Arguments
Applicant’s arguments with respect to claim(s) 1-20 have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
US 20230186376 relates to a system front-end 104 that includes different types of front-end clients.
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to INSUN KANG whose telephone number is (571)272-3724. The examiner can normally be reached M-Th 8-5pm; week 2: Tu-F 8-5pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Chat Do can be reached at 571-272-3721. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/INSUN KANG/ Primary Examiner, Art Unit 2193