Prosecution Insights
Last updated: April 19, 2026
Application No. 18/506,678

COMPUTER IMPLEMENTED METHOD OF EXTRACTING DATA FROM SURVEYS

Non-Final OA: §101, §102, §103
Filed: Nov 10, 2023
Examiner: WASHINGTON, JAMARES
Art Unit: 2681
Tech Center: 2600 — Communications
Assignee: Research Grid Ltd.
OA Round: 1 (Non-Final)
Grant Probability: 81% (Favorable)
Expected OA Rounds: 1-2
Time to Grant: 2y 6m
With Interview: 93%

Examiner Intelligence

Career Allow Rate: 81% (above average; 545 granted / 671 resolved; +19.2% vs TC avg)
Interview Lift: +12.1% (moderate lift, measured on resolved cases with interview)
Avg Prosecution: 2y 6m typical timeline; 32 applications currently pending
Total Applications: 703 across all art units
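The headline figures above follow directly from the reported counts. A minimal sketch of the arithmetic, assuming (as the projections footnote states) that grant probability is simply the career allow rate and that the interview figure adds the stated lift on top of it:

```python
# Career allow rate from the reported counts: 545 granted of 671 resolved.
granted, resolved = 545, 671
allow_rate = granted / resolved               # ~0.812, shown as 81%

# "With interview" probability: career rate plus the stated +12.1% lift.
# Whether the lift is purely additive is an assumption, but it matches
# the displayed 93% figure.
interview_lift = 0.121
with_interview = allow_rate + interview_lift  # ~0.933, shown as 93%

print(f"Allow rate: {allow_rate:.1%}")
print(f"With interview: {with_interview:.1%}")
```

This also explains why the two displayed percentages differ by exactly the interview lift.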

Statute-Specific Performance

§101: 10.9% (-29.1% vs TC avg)
§103: 54.4% (+14.4% vs TC avg)
§102: 24.5% (-15.5% vs TC avg)
§112: 8.0% (-32.0% vs TC avg)
Black line = Tech Center average estimate • Based on career data from 671 resolved cases
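The per-statute deltas let you back out the Tech Center average that the chart's black line estimates. A quick consistency check (rates and deltas taken from the table above; the single-baseline interpretation is an assumption):

```python
# Examiner rate per statute and its delta vs the Tech Center average.
stats = {
    "101": (10.9, -29.1),
    "103": (54.4, +14.4),
    "102": (24.5, -15.5),
    "112": (8.0, -32.0),
}
for statute, (rate, delta) in stats.items():
    tc_avg = rate - delta  # back out the TC-average estimate
    print(f"§{statute}: TC avg ≈ {tc_avg:.1f}%")
# Every statute backs out to 40.0%, consistent with one TC-average baseline.
```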

Office Action

DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Information Disclosure Statement

The information disclosure statement (IDS) submitted on 05/01/2025 is in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement is being considered by the examiner.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows: Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 1-21 are rejected under 35 U.S.C. 101 because the claimed invention is directed to a judicial exception (i.e., a law of nature, a natural phenomenon, or an abstract idea) without significantly more. The claims recite extracting data from surveys, including questions and answers, identifying answers of previously filled surveys to predict answers to subsequent surveys without significantly more. The abstract idea further classifies regions of the survey to improve accuracy of predicted answers correlating to predefined questions. This judicial exception is not integrated into a practical application because there are no meaningful limitations beyond generally linking the use of an abstract idea to a particular technical environment. Furthermore, the process or method steps performed are not enough to qualify as "significantly more" than the abstract idea itself, as the steps may be performed in the human mind and/or with pen and paper. The claims do not include additional elements that are sufficient to amount to significantly more than the judicial exception because the additional elements, when considered both individually and as an ordered combination, do not amount to significantly more than the abstract idea.
The claims recite applying a machine learning model to predict answers. Machine learning is known in the art as simply applying mathematical algorithms using generic computer components to determine an output given a series of inputs. Generic computer components recited as performing generic functions that are well-understood, routine and conventional amount to no more than implementing the abstract idea with a computerized system. Thus, taken alone, the additional element does not amount to significantly more than the above-identified abstract idea. There is no indication that the elements improve the functioning of a computer or improve any other technology.

Claim Rejections - 35 USC § 102

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action: A person shall be entitled to a patent unless – (a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.

Claims 1, 2, 7 and 8 are rejected under 35 U.S.C. 102(a)(2) as being anticipated by Dontá Lamar Wilson et al (US 20240037585 A1).
Regarding claim 1, Wilson et al discloses a computer implemented method of extracting data from surveys (¶ [88]), the method comprising: obtaining completed surveys (¶ [88-89]), each completed survey comprising answers to preconfigured questions (¶ [88] and ¶ [91]); identifying portions of each completed survey corresponding to answers (¶ [131-132]); applying a machine learning model to the portions identified as answers to predict answers (¶ [132] and ¶ [135]); and accumulating a response to the survey based on the predicted answers (¶ [143]).

Regarding claim 2, Wilson et al discloses the computer implemented method of claim 1 (see rejection of claim 1), wherein accumulating a response to the survey based on the identified answers comprises matching the identified answers to the preconfigured questions (¶ [98]).

Regarding claim 7, Wilson et al discloses the computer implemented method of claim 1 (see rejection of claim 1), wherein obtaining completed surveys comprises obtaining an image of completed surveys (¶ [66] image of the completed survey obtained for machine learning algorithm; ¶ [88] learning program receives completed surveys).

Regarding claim 8, Wilson et al discloses the computer implemented method of claim 1 (see rejection of claim 1), further comprising obtaining from a user an indication from a list of preconfigured surveys of the survey being obtained (¶ [83-84] user initiates/selects survey to be input).

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C.
103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 3, 4, 10, 14, 15 are rejected under 35 U.S.C. 103 as being unpatentable over Wilson et al in view of Lance Parker et al (US 20020052774 A1).

Regarding claim 3, Wilson et al discloses the computer implemented method of claim 1 (see rejection of claim 1). Wilson et al fails to explicitly disclose outputting a display of the answers to each question in a graphical user interface to the user, along with an indication of a location of the answer in the survey and a confidence level that the answer has been correctly predicted by a second machine learning model. Parker et al, in the same field of endeavor of formulating questions, obtaining responses and analyzing the responses mathematically to obtain desired information (¶ [30]), teaches outputting a display of the answers to each question in a graphical user interface to the user (¶ [72]), along with an indication of a location of the answer in the survey (¶ [41]) and a confidence level that the answer has been correctly predicted by a second machine learning model (¶ [52] and ¶ [65]).
It would have been obvious to one of ordinary skill in the art before the invention was effectively filed for the method as disclosed by Wilson et al comprising obtaining completed surveys, identifying portions of each completed survey corresponding to answers, and applying a machine learning model to the portions identified as answers to predict answers to utilize the teachings of Parker et al which teaches outputting a display of the answers to each question in a graphical user interface to the user, along with an indication of a location of the answer in the survey and a confidence level that the answer has been correctly predicted by a second machine learning model to conduct surveys more quickly and efficiently than conventional manual methods.

Regarding claim 4, Wilson et al discloses the computer implemented method of claim 1 (see rejection of claim 1). Wilson et al fails to explicitly disclose classifying sections of the survey into one of a selected number of different types; wherein identifying portions of the survey corresponding to answers comprises identifying a portion corresponding to answers for each classified section of the survey. Parker et al teaches classifying sections of the survey into one of a selected number of different types (¶ [29]); wherein identifying portions of the survey corresponding to answers comprises identifying a portion corresponding to answers for each classified section of the survey (¶ [29]).
It would have been obvious to one of ordinary skill in the art before the invention was effectively filed for the method as disclosed by Wilson et al comprising obtaining completed surveys, identifying portions of each completed survey corresponding to answers, and applying a machine learning model to the portions identified as answers to predict answers to utilize the teachings of Parker et al which teaches classifying sections of the survey into one of a selected number of different types; wherein identifying portions of the survey corresponding to answers comprises identifying a portion corresponding to answers for each classified section of the survey to elicit attitude, behavior and demographic of a respondent thus providing a more in-depth understanding of a user’s profile.

Regarding claim 10, Wilson et al discloses the computer implemented method of claim 1 (see rejection of claim 1). Wilson fails to explicitly disclose further comprising identifying portions of the survey corresponding to questions. Parker et al teaches identifying portions of the survey corresponding to questions (¶ [80-83]).

It would have been obvious to one of ordinary skill in the art before the invention was effectively filed for the method as disclosed by Wilson et al comprising obtaining completed surveys, identifying portions of each completed survey corresponding to answers, and applying a machine learning model to the portions identified as answers to predict answers to utilize the teachings of Parker et al which teaches identifying portions of the survey corresponding to questions to conduct surveys more quickly and efficiently than conventional manual methods.
Regarding claim 14, Wilson et al discloses a computer-implemented method of extracting data from surveys (see rejection of claim 1), the method comprising: obtaining completed surveys, each completed survey comprising answers to preconfigured questions (see rejection of claim 1); applying a question machine learning model to identify and predict instances of questions in each completed survey (see rejection of claim 10); applying a separate answer machine learning model to identify and predict answers in each completed survey (see rejection of claim 3); and accumulating a response to the survey based on the predicted answers (see rejection of claim 1).

Regarding claim 15, Wilson et al discloses the computer implemented method of claim 14 (see rejection of claim 14), wherein obtaining completed surveys comprises obtaining an image of each completed survey (see rejection of claim 7).

Claim 5 is rejected under 35 U.S.C. 103 as being unpatentable over Wilson et al in view of Parker et al as applied to claim 4 above, and further in view of Adam Votava et al (US 20200402080 A1).

Regarding claim 5, Wilson et al discloses the computer implemented method of claim 4 (see rejection of claim 4). Wilson et al fails to explicitly disclose wherein the selected number of different types comprise: option, table, scale, image, and text. Votava et al, in the same field of endeavor of gathering evaluation information from a user in the form of a survey (¶ [6]), teaches the selected number of different types comprise: option, table, scale, image, and text (¶ [197-199]).
It would have been obvious to one of ordinary skill in the art before the invention was effectively filed for the method as disclosed by Wilson et al comprising obtaining completed surveys, identifying portions of each completed survey corresponding to answers, and applying a machine learning model to the portions identified as answers to predict answers to utilize the teachings of Votava et al which teaches the selected number of different types comprise: option, table, scale, image, and text as providing predetermined response options affords linkage with a value of an evaluation score, thereby determining comparative evaluation scores easily without extensive analyses or computations.

Claim 6 is rejected under 35 U.S.C. 103 as being unpatentable over Wilson et al in view of Parker et al as applied to claim 4 above, and further in view of Milind Kopikare et al (US 20210035132 A1).

Regarding claim 6, Wilson et al discloses the computer implemented method of claim 4 (see rejection of claim 4). Wilson et al fails to explicitly disclose wherein applying a machine learning model to the portions identified as answers to predict answers comprises applying a machine learning model to the portions identified as answers to predict answers for each classified section based on the classified type of the section. Kopikare et al, in the same field of endeavor of response prediction system to optimize responses to surveys (Abstract), teaches applying a machine learning model to the portions identified as answers to predict answers comprises applying a machine learning model to the portions identified as answers to predict answers for each classified section based on the classified type of the section (¶ [33-34] and ¶ [65] wherein the prediction model predicts answers based on question characteristics).
It would have been obvious to one of ordinary skill in the art before the invention was effectively filed for the method as disclosed by Wilson et al comprising obtaining completed surveys, identifying portions of each completed survey corresponding to answers, and applying a machine learning model to the portions identified as answers to predict answers to utilize the teachings of Kopikare et al which teaches applying a machine learning model to the portions identified as answers to predict answers comprises applying a machine learning model to the portions identified as answers to predict answers for each classified section based on the classified type of the section to improve the accuracy of survey response quality predictions.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to JAMARES Q WASHINGTON whose telephone number is (571)270-1585. The examiner can normally be reached Mon-Fri 8:30am-4:30pm.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Akwasi M. Sarpong, can be reached at (571) 270-3438. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format.
For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/JAMARES Q WASHINGTON/
Primary Examiner, Art Unit 2681
November 1, 2025
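The claim-1 method mapped above describes a four-step pipeline: obtain completed surveys, identify the portions corresponding to answers, apply a machine learning model to predict each answer, and accumulate a response. A minimal, purely illustrative sketch of that pipeline; every name and type here is hypothetical, not taken from the application or the cited art, and the model is a stand-in callable:

```python
# Hypothetical sketch of the claim-1 pipeline as characterized in the
# Office Action. All identifiers are illustrative assumptions.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Region:
    """One identified answer portion of a completed survey."""
    question_id: str
    section_type: str   # e.g. "option", "table", "scale", "image", "text" (claim 5)
    raw_content: str

def extract_survey_response(
    survey: list[Region],
    predict: Callable[[Region], tuple[str, float]],  # stand-in for the ML model
) -> dict[str, tuple[str, float]]:
    """Map each preconfigured question to a (predicted answer, confidence) pair."""
    response = {}
    for region in survey:                        # identified answer portions
        answer, confidence = predict(region)     # apply the ML model
        response[region.question_id] = (answer, confidence)  # accumulate response
    return response

# Toy stand-in model: echoes the region content with a fixed confidence.
demo = [Region("q1", "option", "B"), Region("q2", "scale", "4")]
print(extract_survey_response(demo, lambda r: (r.raw_content, 0.9)))
```

The `section_type` field hints at the claim-6 variant, where a different model (or model configuration) would be dispatched per classified section type.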

Prosecution Timeline

Nov 10, 2023
Application Filed
Nov 01, 2025
Non-Final Rejection — §101, §102, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602832: SIGNAL PROCESSING DEVICE, CONTROL CIRCUIT, STORAGE MEDIUM, AND SIGNAL PROCESSING METHOD (granted Apr 14, 2026; 2y 5m to grant)
Patent 12602937: SYSTEMS, METHODS, AND INTERFACES FOR IDENTIFYING COATING SURFACES (granted Apr 14, 2026; 2y 5m to grant)
Patent 12602741: SYSTEMS AND METHODS REGULATING FILTER STRENGTH FOR TEMPORAL FILTERING (granted Apr 14, 2026; 2y 5m to grant)
Patent 12603966: IMAGE PROCESSING APPARATUS, CONTROL METHOD, AND STORAGE MEDIUM CAPABLE OF SUPPRESSING IMAGE DEGRADATION (granted Apr 14, 2026; 2y 5m to grant)
Patent 12561779: PREDICTING RAILROAD BALLAST FOULING CONDITIONS BASED ON BALLAST IMAGE (granted Feb 24, 2026; 2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 81%
With Interview: 93% (+12.1%)
Median Time to Grant: 2y 6m
PTA Risk: Low
Based on 671 resolved cases by this examiner. Grant probability derived from career allow rate.
