Prosecution Insights
Last updated: April 19, 2026
Application No. 18/929,363

DETERMINING AND APPLYING ATTRIBUTE DEFINITIONS TO DIGITAL SURVEY DATA TO GENERATE SURVEY ANALYSES

Non-Final OA (§101, §103)
Filed: Oct 28, 2024
Examiner: WALTON, CHESIREE A
Art Unit: 3624
Tech Center: 3600 (Transportation & Electronic Commerce)
Assignee: Qualtrics LLC
OA Round: 1 (Non-Final)
Grant Probability: 30% (At Risk)
Expected OA Rounds: 1-2
Time to Grant: 3y 5m
Grant Probability With Interview: 58%

Examiner Intelligence

Career Allow Rate: 30% (63 granted / 211 resolved; -22.1% vs TC avg)
Interview Lift: +28.6% across resolved cases with interview
Avg Prosecution: 3y 5m (52 applications currently pending)
Total Applications: 263 across all art units
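The headline figures above can be reproduced from the raw counts. A minimal Python sketch, assuming the lift is the simple difference between the with-interview and without-interview allow rates; the without-interview rate of 29.4% is a hypothetical value chosen to match the displayed +28.6%, and the function names are illustrative only:

```python
# Sketch of how the examiner metrics above could be derived from raw counts.
# Only 63 granted / 211 resolved and the 58% / +28.6% figures appear in the
# source; the without-interview rate below is an assumption for illustration.

def allow_rate(granted: int, resolved: int) -> float:
    """Career allow rate as a percentage of resolved cases."""
    return 100.0 * granted / resolved

def interview_lift(with_rate: float, without_rate: float) -> float:
    """Percentage-point lift from conducting an examiner interview."""
    return with_rate - without_rate

career = allow_rate(63, 211)        # about 29.9%, displayed as "30%"
lift = interview_lift(58.0, 29.4)   # hypothetical without-interview rate
print(f"career allow rate: {career:.1f}%")
print(f"interview lift: {lift:+.1f} pts")
```
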

Statute-Specific Performance

§101: 38.8% (-1.2% vs TC avg)
§103: 48.9% (+8.9% vs TC avg)
§102: 4.7% (-35.3% vs TC avg)
§112: 5.6% (-34.4% vs TC avg)
Tech Center averages are estimates. Based on career data from 211 resolved cases.
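The four "vs TC avg" deltas above are mutually consistent with a single estimated Tech Center average of 40.0%; that figure is inferred from the deltas themselves, not stated in the source. A quick arithmetic check:

```python
# Verify that each displayed delta equals the statute rate minus a single
# inferred Tech Center average of 40.0% (e.g., 38.8 - (-1.2) = 48.9 - 8.9).
TC_AVG = 40.0  # inferred estimate, not stated in the source data

statute_rates = {"101": 38.8, "103": 48.9, "102": 4.7, "112": 5.6}

for statute, rate in statute_rates.items():
    delta = rate - TC_AVG
    print(f"§{statute}: {rate:.1f}% ({delta:+.1f}% vs TC avg)")
```
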

Office Action

§101, §103
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Notice to Applicant

Claims 21-40 have been examined in this application. This communication is the first action on the merits. The Information Disclosure Statements (IDS) filed 5/29/2025 and 1/29/2026 are acknowledged.

Priority

The Examiner has noted this application is a continuation of Application 16/928,897, filed July 14, 2020.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows: "Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title."

Claims 21-40 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more. Claims 21-29 are directed to a method for survey analysis, Claims 30-36 are directed to an article of manufacture for generating a survey analysis, and Claims 37-40 are directed to a system for generating a survey analysis.
Claim 21 recites a method for generating a survey analysis, Claim 30 recites an article of manufacture for generating a survey analysis, and Claim 37 recites a system for generating a survey analysis, which include: identifying a digital survey question corresponding to a digital survey; aligning data associated with the digital survey based on a plurality of recommended attribute definitions comprising an organization of attribute definitions; applying, based on a user interaction with a selectable element provided for display at the administrator client device associated with administration of the digital survey, an attribute definition of the organization of attribute definitions to the digital survey; identifying a digital survey response submitted by a respondent device based on the administration of the digital survey; automatically applying, based on determining that the attribute definition is applied to the digital survey, the attribute definition corresponding to the selectable element to the digital survey response; and generating a digital survey analysis based on the digital survey response and the attribute definition. As drafted, this is, under its broadest reasonable interpretation, within the abstract idea grouping of certain "Methods of Organizing Human Activity" (sales and marketing activity; business relations). The recitation of "graphical user interface," "administrator client device," "schema module," "respondent device," "computer-readable medium," "processor," "system," and "computer system" does not take the claims out of the certain methods of organizing human activity grouping. Accordingly, the claim recites an abstract idea. This judicial exception is not integrated into a practical application. The claims primarily recite the additional element of using computer components to perform each step.
The "graphical user interface," "administrator client device," "schema module," "respondent device," "computer-readable medium," "processor," "system," and "computer system" are recited at a high level of generality, such that they amount to no more than mere instructions to apply the exception using a computer component. See MPEP 2106.05(f). Accordingly, the additional elements do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea. The claims also fail to recite any improvements to another technology or technical field, improvements to the functioning of the computer itself, use of a particular machine, effecting a transformation or reduction of a particular article to a different state or thing, and/or an additional element that applies or uses the judicial exception in some other meaningful way beyond generally linking the use of the judicial exception to a particular technological environment, such that the claim as a whole is more than a drafting effort designed to monopolize the exception. See 84 Fed. Reg. 55. In particular, there is a lack of improvement to a computer or to the technical field of surveying. The claims do not include additional elements that are sufficient to amount to significantly more than the judicial exception because the additional elements, when considered both individually and as an ordered combination, do not amount to significantly more than the abstract idea. As discussed above with respect to integration of the abstract idea into a practical application, the additional elements of "graphical user interface," "administrator client device," "schema module," "respondent device," "computer-readable medium," "processor," "system," and "computer system" are insufficient to amount to significantly more.
(See MPEP 2106.05(f), Mere Instructions to Apply an Exception: "Thus, for example, claims that amount to nothing more than an instruction to apply the abstract idea using a generic computer do not render an abstract idea eligible." Alice Corp., 134 S. Ct. at 2358.) Mere instructions to apply an exception using a generic computer component cannot provide an inventive concept. The claim fails to recite any improvements to another technology or technical field, improvements to the functioning of the computer itself, use of a particular machine, effecting a transformation or reduction of a particular article to a different state or thing, adding unconventional steps that confine the claim to a particular useful application, and/or meaningful limitations beyond generally linking the use of an abstract idea to a particular environment. See 84 Fed. Reg. 55. Viewed individually or as a whole, these additional claim elements do not provide meaningful limitations to transform the abstract idea into a patent-eligible application of the abstract idea such that the claims amount to significantly more than the abstract idea itself. With regard to receiving, identifying, and analyzing data under Step 2B, this falls under MPEP 2106.05(d), receiving or transmitting data over a network, e.g., using the Internet to gather data, Symantec, 838 F.3d at 1321, 120 USPQ2d at 1362 (utilizing an intermediary computer to forward information). Regarding the machine learning analysis under Step 2B, it is a tool to perform the instructions of the abstract idea. The Examiner concludes that the additional elements in combination fail to amount to significantly more than the abstract idea based on findings that each element merely performs the same function(s) in combination as each element performs separately. The claim is not patent eligible. Thus, taken alone, the additional elements do not amount to significantly more than the above-identified judicial exception (the abstract idea).
Looking at the limitations as an ordered combination adds nothing that is not already present when looking at the elements taken individually. Dependent Claims 22-29, 31-36, and 38-40 recite the additional elements of: identifying digital survey responses collected across a plurality of distribution channels, and generating the organization of attribute definitions using the digital survey responses collected across the plurality of distribution channels; identifying a prior digital survey question of a prior digital survey aligned to the global schema module, and aligning the data associated with the digital survey to the global schema module based on a determined similarity to the prior digital survey question; generating the organization of the attribute definitions by applying attribute tags to the attribute definitions, wherein the attribute tags comprise attribute definition categories, and wherein providing the selectable element for display at the administrator client device associated with the administration of the digital survey further comprises providing, for display at the administrator client device associated with administration of the digital survey, the selectable element corresponding to an attribute tag of the attribute tags; identifying an entity associated with the digital survey, and generating attribute tags utilizing a trained neural network to categorize one or more of the plurality of recommended attribute definitions into entity-specific attribute tags associated with the entity; restricting access to at least a portion of the digital survey analysis based on a sensitivity attribute definition applied to the digital survey response; indexing digital survey responses according to the attribute definition of the digital survey question, and generating a searchable data structure that associates indexed digital survey responses with corresponding attribute definitions from the global schema module; receiving an attribute definition threshold, and selecting the attribute definition based on a confidence score for the attribute definition satisfying the attribute definition threshold; and identifying a plurality of digital survey responses submitted by a respondent device based on the administration of the digital survey, and automatically applying, based on determining that the attribute definition is applied to the digital survey, the attribute definition to the plurality of digital survey responses. These limitations further narrow the abstract idea. The recited limitations in the dependent claims are mere instructions for applying the abstract idea on a computerized system and do not amount to significantly more than the above-identified judicial exceptions in Claims 21, 30, and 37. Regarding Claims 22, 24, 28, 31, 33, 35, and 40 and the additional elements of "database" and "client device": these fall under MPEP 2106.05(d), receiving or transmitting data over a network, e.g., Symantec, 838 F.3d at 1321, 120 USPQ2d at 1362 (utilizing an intermediary computer to forward information), using the Internet to gather data, and storing and retrieving information in memory, Versata Dev. Group, Inc. v. SAP Am., Inc., 793 F.3d 1306, 1334, 115 USPQ2d 1681, 1701 (Fed. Cir. 2015); the additional element of "client device" also falls under MPEP 2106.05(h), field of use.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status. The following is a quotation of 35 U.S.C.
103, which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 21-23, 26-32, and 37-40 are rejected under 35 U.S.C. 103 as being unpatentable over Doyle, US Publication No. 20200019561A1 [hereinafter Doyle], in view of Noter et al., US Publication No. 20180060890A1 [hereinafter Noter].

Regarding Claim 21, A computer-implemented method comprising: identifying, …, a digital survey question corresponding to a digital survey; (Doyle Par. 13-15: "In some of these embodiments where it is determined that the inquiry can be classified into the at least one recognized class with the first additional input, classifying the inquiry into the one or more classes with at least the first additional input. In some embodiments where it is determined that the inquiry cannot be classified into the at least one recognized class with the first additional input, one or more second links between the inquiry and the one or more classes may be iteratively determined all at once or in separate instances; a second custom question that seeks a second additional input from the user may be formulated with at least one of the one or more second links or information therefor; and the second custom question may be presented in the user interface of the user computing or communication device to the user.")
aligning data associated with the digital survey to a global schema module based on a plurality of recommended attribute definitions corresponding to the global schema module comprising an organization of attribute definitions (Doyle Par. 149-150: "wherein the plurality of recommended attribute definitions correspond to a schema module comprising an organization of attribute definitions"); applying, based on a user interaction with a selectable element provided for display at the administrator client device associated with administration of the digital survey, an attribute definition of the organization of attribute definitions to the digital survey; (Doyle Par. 224-225; Par. 315: "This process may repeat for every class in the hierarchical data structure until each recognized class at a class hierarchy is associated with a corresponding set of rules at the corresponding rule hierarchy. With these sets of rules corresponding to a plurality of classes determined, a real-time data model may be generated at 732B in the selected target programming language by expressing the rules in the syntactical requirements of the target programming language and arranging these rules according to the relations (e.g., hierarchical relations between rules or classes) using, for example, constructs and/or nested conditional statements."); identifying a digital survey response submitted by a respondent device based on the administration of the digital survey; (Doyle Par. 87; Par. 123-124: "One or more recommended actions may be determined or identified at 206A for the inquiry based at least in part upon the one or more classes determined at 204A by the data model. These one or more recommended actions may include, for example, presentation of one or more media files (e.g., video, pictures, screen shots, help documentation, frequently asked questions (FAQs), etc.)
in one or more presentation formats, initiation of one or more guided software application flows with custom flow nodes that address or respond to the inquiry, invocation of live technical or support personnel through online chat sessions, telephone sessions, email communications, or any other suitable actions that may fulfill the inquiry from the user, etc. in some embodiments. It shall be noted that these one or more recommended actions may be determined based on the inquiry alone in some embodiments or based on the inquiry and one or more additional inputs from the user in response to one or more automatically generated questions in one or more automated chat sessions in some other embodiments….At least one recommended action of the one or more recommended actions may be presented at 208A to the user in response to the inquiry. In some embodiments, these techniques may also automatically generate a survey or questionnaire to collect feedback from the user to determine whether the user is satisfied with the presented at least one recommended action. This user feedback may be collected and used in further tweaking or adjusting the data model.”) automatically applying, based on determining that the attribute definition is applied to the digital survey, the attribute definition corresponding to the selectable element to the digital survey response; (Doyle Par. 20-22-“Some embodiments are directed to methods for classifying inquiries in real-time or nearly real-time. These techniques identify or generate a data model that receives and determine one or more classes for the inquiry in real-time or nearly real-time at least by applying a hierarchical set of rules in the data model to the inquiry. A hierarchical class data structure at least by storing and indexing the one or more classes based in part or in whole upon a hierarchical structure of the one or more classes in a non-transitory computer memory. 
In some of these embodiments, a data set comprising a plurality of inquiries may be identified; the plurality of inquiries may be normalized into a plurality of normalized inquiries; and the plurality of normalized inquiries in the data set may be transformed into a plurality of inquiry vectors in a vector space at least by applying a term embedding process to the plurality of normalized inquiries. In addition or in the alternative, the plurality of inquiries may be classified into a plurality of classes at least by grouping the plurality of inquiry vectors in the vector space based in part or in whole upon vector distances among the plurality of inquiry vectors; and the plurality of classes may be stored in a hierarchical class data structure at least by referencing parent-child relations among the plurality of classes in the hierarchical class data structure.”; Par. 123-124) and generating a digital survey analysis based on the digital survey response and the attribute definition. (Doyle Par. 118-121-“ For example, a context analysis may determine the meaning of a particular word or a particular symbol based on the preceding and/or the subsequent words, symbols, or expressions. For example, an exclamation mark “!” has different meaning depending on the context in which the exclamation mark is used. In a literal construction, the exclamation mark may indicate a sharp or sudden utterance expressive of strong feeling of the user. On the other hand, the exclamation mark in a relational operator means “not equal to” when the exclamation mark is followed by “=”. ) Doyle teaches feedback analysis and the feature is expounded upon by Noter: …via a digital survey creation graphical user interface of an administrator client device… (Noter Par. 8-9; Par. 31- FIG. 3 illustrates an example environment 340 according to the disclosure. 
The environment 340 may include an ITSM entity workflow process 342, a definition 344, an ITSM entity survey 346, collected data 348, an analytical database storage table 350, common field column 352, response field column 354, non-linked context field column 356, and/or linked entity field column 358. The definition 344 may be a number of definitions and/or configurations provided from a process owner, assignee, and/or responsible party for the IT process underlying the ITSM entity being surveyed. The definition 344 may define the customized data model of the ITSM entity. That is, the definition 344 may define the attributes and properties of the display, title, instructions, and inquiries included in a survey of the ITSM survey. For example, the process owner may define a new survey including the survey metadata and the questions to be included in the questionnaire portion of the survey. The definition 344 may also include a definition of the ITSM entity type and an associated ITSM entity model that the survey is defined on (e.g., a support request).) Doyle and Noter are directed to feedback analysis. It would have been obvious for one of ordinary skill in the art before the effective filing date of the claimed invention to have improved upon the analysis of Doyle, as taught by Noter, by utilizing classification techniques with a reasonable expectation of success of arriving at the claimed invention. One of ordinary skill in the art would have been motivated to make the modification to the teachings of Doyle with the motivation of measuring end-user perspectives on a variety of aspects (Noter Par. 3).
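The claim 21 flow the rejection paraphrases (apply an attribute definition to a survey, automatically apply it to each incoming response, then generate an analysis) can be sketched as follows. All class, function, and field names here are hypothetical illustrations, not the applicant's or the cited references' actual implementations:

```python
# Minimal sketch of the claim 21 flow as paraphrased in the rejection.
# Every name below is a hypothetical illustration only.
from dataclasses import dataclass, field

@dataclass
class Survey:
    question: str
    applied_definitions: set = field(default_factory=set)
    responses: list = field(default_factory=list)

def apply_definition(survey: Survey, definition: str) -> None:
    """Administrator applies a definition via a selectable UI element."""
    survey.applied_definitions.add(definition)

def record_response(survey: Survey, respondent: str, answer: str) -> dict:
    """Auto-apply every definition already applied to the survey."""
    response = {
        "respondent": respondent,
        "answer": answer,
        "definitions": set(survey.applied_definitions),
    }
    survey.responses.append(response)
    return response

def analyze(survey: Survey, definition: str) -> list:
    """Survey analysis: answers tagged with a given attribute definition."""
    return [r["answer"] for r in survey.responses if definition in r["definitions"]]

survey = Survey("How satisfied are you with support?")
apply_definition(survey, "csat")
record_response(survey, "r1", "Very satisfied")
print(analyze(survey, "csat"))  # ['Very satisfied']
```
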
Regarding Claims 22 and 31, Doyle in view of Noter teach The computer-implemented method of claim 21, further comprising: …; and The non-transitory computer-readable medium of claim 30, further comprising instructions that, when executed by the at least one processor, cause the computer system to: … Doyle teaches feedback analysis and the feature is expounded upon by Noter: identify a survey response database comprising digital survey responses collected across a plurality of distribution channels (Noter Par. 11: "FIG. 1 illustrates an example of a survey system 100, according to examples of the present disclosure. As illustrated in FIG. 1, the survey system 100 may include a database 102, a survey manager 104, and a number of engines 106, 108, 110, and 112. The survey manager 104 may include the number of engines 106, 108, 110, and 112 in communication with the database 102 via a communication link. The system may include additional or fewer engines than illustrated to perform various tasks described herein. The survey system 100 may represent instructions and/or hardware of an ITSM software/software-as-a-service tool."; Par. 18: "The survey results included in the storage table may include the responses and context collected for each corresponding survey. The response and context of each survey may be standardized (e.g., transformed via a normalization defined by a survey specific mapping) into a genericized structure adapted to the storage table. The storage table may contain a plurality of records, each record corresponding to the standardized response and context of a particular survey of the plurality of surveys. A portion of the plurality of surveys represented within the storage table may have customized data models that are distinct from one another."; Par. 21) and generate the organization of attribute definitions using the digital survey responses collected across the plurality of distribution channels. (Noter Par.
26: Configuring the data model may further include receiving a definition of display attributes, a title, instructions, and/or inquiries to be included in the survey. Configuring the data model may include receiving and configuring the data collection for and calculation of key performance indicators (KPI) from the survey.) Doyle and Noter are directed to feedback analysis. It would have been obvious for one of ordinary skill in the art before the effective filing date of the claimed invention to have improved upon the analysis of Doyle, as taught by Noter, by utilizing classification techniques with a reasonable expectation of success of arriving at the claimed invention. One of ordinary skill in the art would have been motivated to make the modification to the teachings of Doyle with the motivation of measuring end-user perspectives on a variety of aspects (Noter Par. 3).

Regarding Claims 23 and 32, Doyle in view of Noter teach The computer-implemented method of claim 21, further comprising: …; and The non-transitory computer-readable medium of claim 30, wherein the digital survey comprises a survey template question corresponding to the attribute definition, and further comprising instructions that, when executed by the at least one processor, cause the computer system to: … Doyle teaches feedback analysis and the feature is expounded upon by Noter: identifying a prior digital survey question of a prior digital survey aligned to the global schema module; and aligning the data associated with the digital survey to the global schema module based on a determined similarity to the prior digital survey question (Noter Par. 33-35: The definition 344 may include a configuration of survey execution. For example, the definition 344 may include a definition of automatically executable rules in the ITSM entity workflow process 342.
The rules may define the exact changes in the ITSM entity workflow process 342 that trigger sending the ITSM entity survey 346 and to whom the ITSM entity survey 346 is sent in response to those triggers.; Par. 45) Doyle and Noter are directed to feedback analysis. It would have been obvious for one of ordinary skill in the art before the effective filing date of the claimed invention to have improved upon the analysis of Doyle, as taught by Noter, by utilizing classification techniques with a reasonable expectation of success of arriving at the claimed invention. One of ordinary skill in the art would have been motivated to make the modification to the teachings of Doyle with the motivation of measuring end-user perspectives on a variety of aspects (Noter Par. 3).

Regarding Claims 26 and 34, Doyle in view of Noter teach The computer-implemented method of claim 21, further comprising …; and The non-transitory computer-readable medium of claim 30, further comprising instructions that, when executed by the at least one processor, cause the computer system to … Doyle teaches survey analysis and the feature is expounded upon by Noter: restricting access to at least a portion of the digital survey analysis based on a sensitivity attribute definition applied to the digital survey response (Noter Par. 9: Surveys may interface with an ITSM tool in a limited manner. For example, an ITSM tool may support sending of an IT related survey relating to one of a limited closed list of a few ITSM entity types (e.g., incident or change). While most ITSM tools do not support survey sending trigger events, some may support a closed list of very few events (e.g., the closing of a support request) over a limited amount of ITSM entity types to act as a survey trigger. Surveys may be limited in the amount of data they can collect. For example, a survey may be limited to collecting the response of four total inquiries.
A survey may be unable to integrate into the workflow of an IT entity lifecycle.) Doyle and Noter are directed to feedback analysis. It would have been obvious for one of ordinary skill in the art before the effective filing date of the claimed invention to have improved upon the analysis of Doyle, as taught by Noter, by utilizing classification techniques with a reasonable expectation of success of arriving at the claimed invention. One of ordinary skill in the art would have been motivated to make the modification to the teachings of Doyle with the motivation of measuring end-user perspectives on a variety of aspects (Noter Par. 3).

Regarding Claims 27 and 39, Doyle in view of Noter teach The computer-implemented method of claim 21, further comprising: …; and The system of claim 37, further comprising instructions that, when executed by the at least one processor, cause the system to: … indexing digital survey responses according to the attribute definition of the digital survey question; and generating a searchable data structure that associates indexed digital survey responses with corresponding attribute definitions from the global schema module (Doyle Par. 24: In some embodiments, a hierarchy traversing module may be identified for the hierarchical set of rules; an indexing data structure may also be identified for the hierarchical set of rules or a hierarchical set of classes; and a determination may be made to decide whether the inquiry satisfies one or more rules in the hierarchical set of rules using at least the traversing scheme and the indexing data structure.; Par. 106: Indexing employs an indexing data structure that stores the indices and may be a separate data structure or a part (e.g., a column) of an existing data structure. The indices in the indexing data structure may be sorted in a particular order to facilitate rapid random lookups and queries as well as efficient access of ordered information in the hierarchical data structure.
A hierarchical data structure described herein may also contain references to the physical disk block addresses or links thereto to further improve access to the data and/or functions (e.g., recommended actions for a predetermined class, applications of rules to data for clustering and/or classification purposes, etc.) stored in the hierarchical data structure.)

Regarding Claims 28, 35, and 38, Doyle in view of Noter teach The computer-implemented method of claim 21, wherein automatically applying the attribute definition further comprises: …; and The non-transitory computer-readable medium of claim 30, wherein automatically applying the attribute definition further comprises: …; and The system of claim 37, further comprising instructions that, when executed by the at least one processor, cause the system to: … receiving, from the administrator client device, an attribute definition threshold; and selecting the attribute definition based on a confidence score for the attribute definition satisfying an attribute definition threshold. (Doyle Par. 136-137: In some embodiments, these sub-processes 228B through 232B may be iteratively performed until the one or more actions may be deterministically determined.
In some other embodiments, these sub-processes 228B through 232B may be iteratively performed subject to a threshold limit beyond which the inquiry, its classification results, and/or other pertinent information may be referred to domain expert review… In some other embodiments where the determination result at 232B is affirmative, these one or more actions may be determined at 236B; and these one or more actions may also be optionally ranked at 238B into one or more ranked actions based in part or in whole upon, for example, popularity of these one or more actions among a plurality of users, other users' feedback on these one or more actions, relative confidence levels, complexity levels of these one or more actions, any other suitable ranking measures, or any combinations thereof.)

Regarding Claims 29, 36, and 40, Doyle in view of Noter teach The computer-implemented method of claim 21, further comprising: …; and The non-transitory computer-readable medium of claim 30, further comprising instructions that, when executed by the at least one processor, cause the computer system to: …; and The system of claim 37, further comprising instructions that, when executed by the at least one processor, cause the system to: … identifying a plurality of digital survey responses submitted by a respondent device based on the administration of the digital survey; and automatically applying, based on determining that the attribute definition is applied to the digital survey, the attribute definition to the plurality of digital survey responses. (Doyle Par. 20-22; Par. 87; Par. 123-124: "One or more recommended actions may be determined or identified at 206A for the inquiry based at least in part upon the one or more classes determined at 204A by the data model. These one or more recommended actions may include, for example, presentation of one or more media files (e.g., video, pictures, screen shots, help documentation, frequently asked questions (FAQs), etc.)
in one or more presentation formats, initiation of one or more guided software application flows with custom flow nodes that address or respond to the inquiry, invocation of live technical or support personnel through online chat sessions, telephone sessions, email communications, or any other suitable actions that may fulfill the inquiry from the user, etc. in some embodiments. It shall be noted that these one or more recommended actions may be determined based on the inquiry alone in some embodiments or based on the inquiry and one or more additional inputs from the user in response to one or more automatically generated questions in one or more automated chat sessions in some other embodiments… At least one recommended action of the one or more recommended actions may be presented at 208A to the user in response to the inquiry. In some embodiments, these techniques may also automatically generate a survey or questionnaire to collect feedback from the user to determine whether the user is satisfied with the presented at least one recommended action. This user feedback may be collected and used in further tweaking or adjusting the data model.")

Regarding Claim 30, A non-transitory computer-readable medium storing instructions that, when executed by at least one processor, cause a computer system to: identify, …, a digital survey question corresponding to a digital survey; (Doyle Par. 41; Par. 13-15: "In some of these embodiments where it is determined that the inquiry can be classified into the at least one recognized class with the first additional input, classifying the inquiry into the one or more classes with at least the first additional input.
In some embodiments where it is determined that the inquiry cannot be classified into the at least one recognized class with the first additional input, one or more second links between the inquiry and the one or more classes may be iteratively determined all at once or in separate instances; a second custom question that seeks a second additional input from the user may be formulated with at least one of the one or more second links or information therefor; and the second custom question may be presented in the user interface of the user computing or communication device to the user.”) align data associated with the digital survey to a global schema module based on a plurality of recommended attribute definitions corresponding to the global schema module comprising an organization of attribute definitions; (Doyle Par. 149-150-“wherein the plurality of recommended attribute definitions correspond to a schema module comprising an organization of attribute definitions”) apply, based on a user interaction with a selectable element provided for display at the administrator client device associated with administration of the digital survey, an attribute definition of the organization of attribute definitions to the digital survey; (Doyle Par. 224-225; Par. 315-“This process may repeat for every class in the hierarchical data structure until each recognized class at a class hierarchy is associated with a corresponding set of rules at the corresponding rule hierarchy.
With these sets of rules corresponding to a plurality of classes determined, a real-time data model may be generated at 732B in the selected target programming language by expressing the rules in the syntactical requirements of the target programming language and arranging these rules according to the relations (e.g., hierarchical relations between rules or classes) using, for example, constructs and/or nested conditional statements.”); identify a digital survey response submitted by a respondent device based on the administration of the digital survey; (Doyle Par. 87; Par. 123-124-“ One or more recommended actions may be determined or identified at 206A for the inquiry based at least in part upon the one or more classes determined at 204A by the data model. These one or more recommended actions may include, for example, presentation of one or more media files (e.g., video, pictures, screen shots, help documentation, frequently asked questions (FAQs), etc.) in one or more presentation formats, initiation of one or more guided software application flows with custom flow nodes that address or respond to the inquiry, invocation of live technical or support personnel through online chat sessions, telephone sessions, email communications, or any other suitable actions that may fulfill the inquiry from the user, etc. in some embodiments. It shall be noted that these one or more recommended actions may be determined based on the inquiry alone in some embodiments or based on the inquiry and one or more additional inputs from the user in response to one or more automatically generated questions in one or more automated chat sessions in some other embodiments….At least one recommended action of the one or more recommended actions may be presented at 208A to the user in response to the inquiry. 
In some embodiments, these techniques may also automatically generate a survey or questionnaire to collect feedback from the user to determine whether the user is satisfied with the presented at least one recommended action. This user feedback may be collected and used in further tweaking or adjusting the data model.”) automatically apply, based on determining that the attribute definition is applied to the digital survey, the attribute definition corresponding to the selectable element to the digital survey response; (Doyle Par. 20-22-“Some embodiments are directed to methods for classifying inquiries in real-time or nearly real-time. These techniques identify or generate a data model that receives and determine one or more classes for the inquiry in real-time or nearly real-time at least by applying a hierarchical set of rules in the data model to the inquiry. A hierarchical class data structure at least by storing and indexing the one or more classes based in part or in whole upon a hierarchical structure of the one or more classes in a non-transitory computer memory. In some of these embodiments, a data set comprising a plurality of inquiries may be identified; the plurality of inquiries may be normalized into a plurality of normalized inquiries; and the plurality of normalized inquiries in the data set may be transformed into a plurality of inquiry vectors in a vector space at least by applying a term embedding process to the plurality of normalized inquiries. In addition or in the alternative, the plurality of inquiries may be classified into a plurality of classes at least by grouping the plurality of inquiry vectors in the vector space based in part or in whole upon vector distances among the plurality of inquiry vectors; and the plurality of classes may be stored in a hierarchical class data structure at least by referencing parent-child relations among the plurality of classes in the hierarchical class data structure.”; Par. 
123-124) and generate a digital survey analysis based on the digital survey response and the attribute definition. (Doyle Par. 118-121-“For example, a context analysis may determine the meaning of a particular word or a particular symbol based on the preceding and/or the subsequent words, symbols, or expressions. For example, an exclamation mark “!” has different meaning depending on the context in which the exclamation mark is used. In a literal construction, the exclamation mark may indicate a sharp or sudden utterance expressive of strong feeling of the user. On the other hand, the exclamation mark in a relational operator means “not equal to” when the exclamation mark is followed by “=”.”) Doyle teaches feedback analysis and the feature is expounded upon by Noter: …via a digital survey creation graphical user interface of an administrator client device… (Noter Par. 8-9; Par. 31- FIG. 3 illustrates an example environment 340 according to the disclosure. The environment 340 may include an ITSM entity workflow process 342, a definition 344, an ITSM entity survey 346, collected data 348, an analytical database storage table 350, common field column 352, response field column 354, non-linked context field column 356, and/or linked entity field column 358. The definition 344 may be a number of definitions and/or configurations provided from a process owner, assignee, and/or responsible party for the IT process underlying the ITSM entity being surveyed. The definition 344 may define the customized data model of the ITSM entity. That is, the definition 344 may define the attributes and properties of the display, title, instructions, and inquiries included in a survey of the ITSM survey. For example, the process owner may define a new survey including the survey metadata and the questions to be included in the questionnaire portion of the survey.
The definition 344 may also include a definition of the ITSM entity type and an associated ITSM entity model that the survey is defined on (e.g., a support request).) Doyle and Noter are directed to feedback analysis. It would have been obvious for one of ordinary skill in the art before the effective filing date of the claimed invention to have improved upon the analysis of Doyle, as taught by Noter, by utilizing classification techniques with a reasonable expectation of success of arriving at the claimed invention. One of ordinary skill in the art would have been motivated to make the modification to the teachings of Doyle with the motivation of measuring end-user perspectives on a variety of aspects (Noter Par. 3).

Regarding Claim 37, A system comprising: at least one processor; and at least one non-transitory computer-readable storage medium storing instructions that, when executed by the at least one processor, cause the system to: identify, …, a digital survey question corresponding to a digital survey; (Doyle Par. 41; Par. 13-15-“In some of these embodiments where it is determined that the inquiry can be classified into the at least one recognized class with the first additional input, classifying the inquiry into the one or more classes with at least the first additional input. In some embodiments where it is determined that the inquiry cannot be classified into the at least one recognized class with the first additional input, one or more second links between the inquiry and the one or more classes may be iteratively determined all at once or in separate instances; a second custom question that seeks a second additional input from the user may be formulated with at least one of the one or more second links or information therefor; and the second custom question may be presented in the user interface of the user computing or communication device to the user.”)
align data associated with the digital survey to a global schema module based on a plurality of recommended attribute definitions corresponding to the global schema module comprising an organization of attribute definitions; (Doyle Par. 149-150-“wherein the plurality of recommended attribute definitions correspond to a schema module comprising an organization of attribute definitions”) apply, based on a user interaction with a selectable element provided for display at the administrator client device associated with administration of the digital survey, an attribute definition of the organization of attribute definitions to the digital survey; (Doyle Par. 224-225; Par. 315-“This process may repeat for every class in the hierarchical data structure until each recognized class at a class hierarchy is associated with a corresponding set of rules at the corresponding rule hierarchy. With these sets of rules corresponding to a plurality of classes determined, a real-time data model may be generated at 732B in the selected target programming language by expressing the rules in the syntactical requirements of the target programming language and arranging these rules according to the relations (e.g., hierarchical relations between rules or classes) using, for example, constructs and/or nested conditional statements.”); identify a digital survey response submitted by a respondent device based on the administration of the digital survey; (Doyle Par. 87; Par. 123-124-“One or more recommended actions may be determined or identified at 206A for the inquiry based at least in part upon the one or more classes determined at 204A by the data model. These one or more recommended actions may include, for example, presentation of one or more media files (e.g., video, pictures, screen shots, help documentation, frequently asked questions (FAQs), etc.)
in one or more presentation formats, initiation of one or more guided software application flows with custom flow nodes that address or respond to the inquiry, invocation of live technical or support personnel through online chat sessions, telephone sessions, email communications, or any other suitable actions that may fulfill the inquiry from the user, etc. in some embodiments. It shall be noted that these one or more recommended actions may be determined based on the inquiry alone in some embodiments or based on the inquiry and one or more additional inputs from the user in response to one or more automatically generated questions in one or more automated chat sessions in some other embodiments….At least one recommended action of the one or more recommended actions may be presented at 208A to the user in response to the inquiry. In some embodiments, these techniques may also automatically generate a survey or questionnaire to collect feedback from the user to determine whether the user is satisfied with the presented at least one recommended action. This user feedback may be collected and used in further tweaking or adjusting the data model.”) automatically apply, based on determining that the attribute definition is applied to the digital survey, the attribute definition corresponding to the selectable element to the digital survey response; (Doyle Par. 20-22-“Some embodiments are directed to methods for classifying inquiries in real-time or nearly real-time. These techniques identify or generate a data model that receives and determine one or more classes for the inquiry in real-time or nearly real-time at least by applying a hierarchical set of rules in the data model to the inquiry. A hierarchical class data structure at least by storing and indexing the one or more classes based in part or in whole upon a hierarchical structure of the one or more classes in a non-transitory computer memory. 
In some of these embodiments, a data set comprising a plurality of inquiries may be identified; the plurality of inquiries may be normalized into a plurality of normalized inquiries; and the plurality of normalized inquiries in the data set may be transformed into a plurality of inquiry vectors in a vector space at least by applying a term embedding process to the plurality of normalized inquiries. In addition or in the alternative, the plurality of inquiries may be classified into a plurality of classes at least by grouping the plurality of inquiry vectors in the vector space based in part or in whole upon vector distances among the plurality of inquiry vectors; and the plurality of classes may be stored in a hierarchical class data structure at least by referencing parent-child relations among the plurality of classes in the hierarchical class data structure.”; Par. 123-124) and generate a digital survey analysis based on the digital survey response and the attribute definition. (Doyle Par. 118-121-“For example, a context analysis may determine the meaning of a particular word or a particular symbol based on the preceding and/or the subsequent words, symbols, or expressions. For example, an exclamation mark “!” has different meaning depending on the context in which the exclamation mark is used. In a literal construction, the exclamation mark may indicate a sharp or sudden utterance expressive of strong feeling of the user. On the other hand, the exclamation mark in a relational operator means “not equal to” when the exclamation mark is followed by “=”.”) Doyle teaches feedback analysis and the feature is expounded upon by Noter: …via a digital survey creation graphical user interface of an administrator client device… (Noter Par. 8-9; Par. 31- FIG. 3 illustrates an example environment 340 according to the disclosure.
The environment 340 may include an ITSM entity workflow process 342, a definition 344, an ITSM entity survey 346, collected data 348, an analytical database storage table 350, common field column 352, response field column 354, non-linked context field column 356, and/or linked entity field column 358. The definition 344 may be a number of definitions and/or configurations provided from a process owner, assignee, and/or responsible party for the IT process underlying the ITSM entity being surveyed. The definition 344 may define the customized data model of the ITSM entity. That is, the definition 344 may define the attributes and properties of the display, title, instructions, and inquiries included in a survey of the ITSM survey. For example, the process owner may define a new survey including the survey metadata and the questions to be included in the questionnaire portion of the survey. The definition 344 may also include a definition of the ITSM entity type and an associated ITSM entity model that the survey is defined on (e.g., a support request).) Doyle and Noter are directed to feedback analysis. It would have been obvious for one of ordinary skill in the art before the effective filing date of the claimed invention to have improved upon the analysis of Doyle, as taught by Noter, by utilizing classification techniques with a reasonable expectation of success of arriving at the claimed invention. One of ordinary skill in the art would have been motivated to make the modification to the teachings of Doyle with the motivation of measuring end-user perspectives on a variety of aspects (Noter Par. 3).

Claims 24-25 and 33 are rejected under 35 U.S.C. 103 as being unpatentable over Doyle, US Publication No. 20200019561A1 [hereinafter Doyle], in view of Noter et al., US Publication No. 20180060890A1 [hereinafter Noter], in further view of Silverstein et al., US Publication No.
20210182282 A1 [hereinafter Silverstein].

Regarding Claim 24, Doyle in view of Noter teach The method of claim 1, wherein providing the plurality of selectable elements comprises:… Doyle in view of Noter teach feedback analysis and the feature is expounded upon by Silverstein: generating the organization of the attribute definitions by applying attribute tags to the attribute definitions, wherein the attribute tags comprise attribute definition categories; (Silverstein Par. 210-“The interrogator in the selection of the one or more questions in the selection step preferably selects the construct(s) only from this group of primary constructs. This allows the querying to be focused on only these more relevant or important constructs, thereby reducing the amount of question signals that need to be sent and answer signals to be received for the completion of the survey.”; Par. 272-“In one embodiment, the system enables visual comparison and evaluation of entities and/or users (or parts of the data Persona that belongs to these entities and/or users) by generating side-by-side or overlay-layer comparison visualizations of the data associated with these entities and/or users. An evaluator can score and/or sort these entities via button clicks or added tags. The infographic templates and/or variants to be applied to the data can be specified, and one or more master styles (such as colors and fonts) can optionally be assigned to those visualizations. A subset from a full set of available data (in the data Persona or Data Store) can be selected for inclusion in the comparison visualizations. The system may also provide methods for aiding comparison techniques, such as displaying means and/or benchmark lines for scored values and interval scales, totals for collections of objects, and equivalent time segments for time-based data. Job applicants or slates of candidates (such as potential contractors) can be compared via this system.
The precise parts of the résumé/CV or pre-existing system data to include in an applicant comparison can be specified.”); and wherein providing the selectable element for display at the administrator client device associated with the administration of the digital survey further comprises providing for display at the administrator client device associated with administration of the digital survey, the selectable element corresponding to an attribute tag of the attribute tags. (Silverstein Par. 157; Par. 166; Par. 301-“End users of the system may browse infographic templates in the system randomly, through the items cataloged by the schema elements they depict, by tags, or using another method. For example, the user may see a list of schema elements to be depicted, click on the name of that element, and see a list of infographic templates that contain the ability to render that component. The user may also type in or click from a list of tags to see all infographic templates that are assigned to a given tag. This process is facilitated by basic selection techniques from the database.”; Par. 354) Doyle, Noter and Silverstein are directed to data evaluation and result analysis. It would have been obvious for one of ordinary skill in the art before the effective filing date of the claimed invention to have improved upon the analysis of Doyle in view of Noter, as taught by Silverstein, by utilizing tags in the class analysis with a reasonable expectation of success of arriving at the claimed invention. One of ordinary skill in the art would have been motivated to make the modification to the teachings of Doyle in view of Noter with the motivation of visual comparison and evaluation of entities and/or users (Silverstein Par. 272).

Regarding Claim 25, The computer-implemented method of claim 24, further comprising: identifying an entity associated with the digital survey; and generating attribute tags utilizing a trained neural network to categorize one or more of the plurality of recommended attribute definitions into entity-specific attribute tags associated with the entity (Doyle Par. 146-“The machine learning module 304A may use one or more different techniques including, for example, supervised learning techniques (e.g., support vector machines, decision trees, artificial neural network, inductive logic programming, statistical relational leaning, etc.) and/or unsupervised learning techniques (e.g., clustering, expectation-maximization techniques, multivariate analysis, etc.) for various pattern recognition and identification of terms of interest as well as helper items.”).

Regarding Claim 33, The non-transitory computer-readable medium of claim 30, further comprising instructions that, when executed by the at least one processor, cause the computer system to: identify an entity associated with the digital survey; and generate attribute tags utilizing a trained neural network to categorize one or more of the plurality of recommended attribute definitions into entity-specific attribute tags associated with the entity; (Doyle Par. 146-“The machine learning module 304A may use one or more different techniques including, for example, supervised learning techniques (e.g., support vector machines, decision trees, artificial neural network, inductive logic programming, statistical relational leaning, etc.) and/or unsupervised learning techniques (e.g., clustering, expectation-maximization techniques, multivariate analysis, etc.) for various pattern recognition and identification of terms of interest as well as helper items.”).
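As an illustrative aside only (not part of the prosecution record): the Doyle paragraphs cited above describe a pipeline of normalizing inquiries, transforming them into vectors, and grouping them into classes by vector distance. The sketch below uses a toy bag-of-words count as a stand-in for Doyle's term-embedding step, and a greedy cosine-distance grouping with an assumed 0.5 threshold as a stand-in for its clustering; none of these specifics come from the cited references.

```python
# Illustrative sketch of the cited classification approach:
# normalize inquiries, vectorize them, and group by vector distance.
import math
from collections import Counter

def normalize(inquiry: str) -> list[str]:
    """Lowercase and tokenize an inquiry (stand-in for the normalization step)."""
    return inquiry.lower().split()

def embed(tokens: list[str]) -> Counter:
    """Bag-of-words term counts as a toy stand-in for a term-embedding vector."""
    return Counter(tokens)

def cosine_distance(a: Counter, b: Counter) -> float:
    """1 - cosine similarity between two sparse count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return 1.0 - dot / (na * nb) if na and nb else 1.0

def group_by_distance(inquiries: list[str], threshold: float = 0.5) -> list[list[str]]:
    """Greedy grouping: an inquiry joins the first class within the threshold."""
    classes: list[list[str]] = []
    representatives: list[Counter] = []
    for inquiry in inquiries:
        vec = embed(normalize(inquiry))
        for cls, rep in zip(classes, representatives):
            if cosine_distance(vec, rep) < threshold:
                cls.append(inquiry)
                break
        else:  # no existing class is close enough; start a new one
            classes.append([inquiry])
            representatives.append(vec)
    return classes

groups = group_by_distance([
    "How do I reset my password",
    "how do i reset my password today",
    "Where is my invoice",
])
print(len(groups))  # two classes: password resets vs. invoice questions
```

A production version of what the reference describes would substitute a learned embedding and a real clustering method, and would store the resulting classes in a hierarchical structure rather than a flat list.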
Doyle in view of Noter teaches evaluation analysis and the feature is expounded upon by Silverstein: and wherein provide the selectable element for display at the administrator client device associated with the administration of the digital survey further comprises provide for display at the administrator client device associated with administration of the digital survey, …the selectable element corresponding to an attribute tag of the entity-specific attribute tags associated with the entity. (Silverstein Par. 157; Par. 166; Par. 301-“End users of the system may browse infographic templates in the system randomly, through the items cataloged by the schema elements they depict, by tags, or using another method. For example, the user may see a list of schema elements to be depicted, click on the name of that element, and see a list of infographic templates that contain the ability to render that component. The user may also type in or click from a list of tags to see all infographic templates that are assigned to a given tag. This process is facilitated by basic selection techniques from the database.”; Par. 354) Doyle, Noter and Silverstein are directed to data evaluation and result analysis. It would have been obvious for one of ordinary skill in the art before the effective filing date of the claimed invention to have improved upon the analysis of Doyle in view of Noter, as taught by Silverstein, by utilizing tags in the class analysis with a reasonable expectation of success of arriving at the claimed invention. One of ordinary skill in the art would have been motivated to make the modification to the teachings of Doyle in view of Noter with the motivation of visual comparison and evaluation of entities and/or users (Silverstein Par. 272).

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure: US Publication No. US 20190164182 A1 to Abdullah et al.- Par.
21-“In one or more embodiments, the electronic survey system further facilitates a search of the electronic survey responses to identify information contained therein. For example, in one or more embodiments, the electronic survey system receives a search query from a client device (e.g., administrator device). In response to receiving the search query, the electronic survey system identifies one or more electronic survey questions having a question classification that corresponds to information requested by the search query and/or a topic identified by the search query.”; Par. 33-“As mentioned above, and as shown in FIG. 1, the administrator device 112 includes a survey application 114 shown thereon. In one or more embodiments, the survey application 114 refers to a software application associated with the electronic survey system 104 that facilitates receiving a search query, analyzing electronic survey responses, and providing results of the analysis to the user 116 via a graphical user interface on the administrator device 112.”

Any inquiry concerning this communication or earlier communications from the examiner should be directed to Chesiree Walton, whose telephone number is (571) 272-5219. The examiner can normally be reached from Monday to Friday between 8 AM and 5 PM. If any attempt to reach the examiner by telephone is unsuccessful, the examiner’s supervisor, Patricia Munson, can be reached at (571) 270-5396. The fax telephone numbers for this group are either (571) 273-8300 or (703) 872-9326 (for official communications including After Final communications labeled “Box AF”). Another resource that is available to applicants is the Patent Application Information Retrieval (PAIR) system. Information regarding the status of an application can be obtained from the PAIR system. Status information for published applications may be obtained from either Private PAIR or Public PAIR.
Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, please feel free to contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). Applicants are invited to contact the Office to schedule an in-person interview to discuss and resolve the issues set forth in this Office Action. Although an interview is not required, the Office believes that an interview can be of use to resolve any issues related to a patent application in an efficient and prompt manner.

Sincerely,

/CHESIREE A WALTON/
Examiner, Art Unit 3624

Prosecution Timeline

Oct 28, 2024
Application Filed
May 12, 2025
Response after Non-Final Action
Feb 17, 2026
Non-Final Rejection — §101, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12591903
SELF-SUPERVISED SYSTEM GENERATING EMBEDDINGS REPRESENTING SEQUENCED ACTIVITY
2y 5m to grant · Granted Mar 31, 2026
Patent 12561640
METHOD AND SYSTEM TO STREAMLINE RETURN DECISION AND OPTIMIZE COSTS
2y 5m to grant · Granted Feb 24, 2026
Patent 12555047
SYSTEMS AND METHODS FOR FORMULATING OR EVALUATING A CONSTRUCTION COMPOSITION
2y 5m to grant · Granted Feb 17, 2026
Patent 12518292
HIERARCHY AWARE GRAPH REPRESENTATION LEARNING
2y 5m to grant · Granted Jan 06, 2026
Patent 12333460
DISPLAY OF MULTI-MODAL VEHICLE INDICATORS ON A MAP
2y 5m to grant · Granted Jun 17, 2025
Based on the 5 most recent grants.


Prosecution Projections

1-2
Expected OA Rounds
30%
Grant Probability
58%
With Interview (+28.6%)
3y 5m
Median Time to Grant
Low
PTA Risk
Based on 211 resolved cases by this examiner. Grant probability derived from career allow rate.
