Prosecution Insights
Last updated: April 19, 2026
Application No. 17/876,042

INFORMATION PROCESSING APPARATUS, FLOW GENERATION METHOD, AND COMPUTER PROGRAM PRODUCT

Final Rejection §103
Filed: Jul 28, 2022
Examiner: KHUU, HIEN DIEU THI
Art Unit: 2116
Tech Center: 2100 — Computer Architecture & Software
Assignee: Ricoh Company Ltd.
OA Round: 4 (Final)
Grant Probability: 87% (Favorable)
Expected OA Rounds: 5-6
Time to Grant: 2y 9m
With Interview: 99%

Examiner Intelligence

Career Allow Rate: 87% (392 granted / 451 resolved; +31.9% vs TC avg, above average)
Interview Lift: +15.3% across resolved cases with interview
Avg Prosecution: 2y 9m (28 applications currently pending)
Total Applications: 479 across all art units
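The card's headline figures are simple derived values. A quick sanity check of the arithmetic, assuming the 87% career allow rate is just granted divided by resolved, rounded to the nearest point:

```python
# Sanity-check the dashboard's derived figures from its raw counts.
granted, resolved = 392, 451

allow_rate = granted / resolved
print(f"Career allow rate: {allow_rate:.1%}")    # 86.9%, displayed as 87%

# The "+31.9% vs TC avg" delta implies a Tech Center average near 55%
# (assuming the delta is in absolute percentage points).
implied_tc_avg = allow_rate * 100 - 31.9
print(f"Implied TC average: {implied_tc_avg:.1f}%")
```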

Statute-Specific Performance

§101: 17.2% (-22.8% vs TC avg)
§103: 24.7% (-15.3% vs TC avg)
§102: 31.6% (-8.4% vs TC avg)
§112: 19.0% (-21.0% vs TC avg)
Tech Center averages are estimates. Based on career data from 451 resolved cases.

Office Action

§103
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Status of Claims

Claims 1-20 are currently pending in this application in response to the amendment and remarks filed on 11/10/2025.

Response to Applicant's Remarks

With respect to the claim objections: Applicant's claim amendments and remarks filed on 11/10/2025 overcame the claim objection as presented in the Non-Final Office Action mailed on 08/20/2025.

With respect to the 35 U.S.C. § 103 rejections: Applicant's claim amendments and remarks filed on 11/10/2025 have been fully considered but are not persuasive. Applicant argues that Hayashi fails to teach "determine a plurality of candidate character strings within a range of the entry field" and "determine a title of the entry field as: a closest candidate character string to the entry field of the plurality of candidate character strings". See Remarks at 9-13. Examiner disagrees.
Hayashi teaches: determine a plurality of candidate character strings within a range of the entry field (such as extracting character strings from an entry field [based on information specified by a user, p.6] and outputting character strings based on a recognition result, p.12); and determine a title of the entry field as: a closest candidate character string to the entry field of the plurality of candidate character strings (such as: extracting a character string that is a form title candidate…the start position and the width are determined as form title information based on the character string information, p.10; extracting a title based on conditions that are form title candidates and evaluation values, previously stored, that apply when the respective conditions are satisfied, p.13; evaluating values determined for all character strings after character recognition to determine whether or not a character string can be a form title, p.13; and comparing the largest total value to the total values of the other character strings to determine the form title, p.13). Examiner interprets that Hayashi teaches determining "a closest candidate character string to the entry field" based on comparing the values, when the respective conditions are satisfied, against previously stored values to determine the title candidates. Thus, Mayer et al. (US 11,314,531) in view of Hayashi et al. (JP-H10-11531-A) teaches the combination of limitations as recited in each of claims 1, 9 and 17, as follows.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries set forth in Graham v. John Deere Co., 383 U.S. 1, 148 USPQ 459 (1966), that are applied for establishing a background for determining obviousness under 35 U.S.C. 103, are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claims 1-20 are rejected under 35 U.S.C. 103 as being unpatentable over Mayer et al. (US 11,314,531) in view of Hayashi et al. (JP-H10-11531-A).
With respect to claims 1, 9 and 17, Mayer teaches an information processing apparatus, method, and computer program product for execution on an information processing apparatus, comprising instructions which, when executed by the information processing apparatus, cause the information processing apparatus to carry out a flow generation method (robot process automation 20 carries out user interface automation activities, figs.1-23, col.8 lines 50-67 and col.9 lines 1-40), comprising circuitry configured to:

display one or more screens that receive an operation by a user (receive user input via robot design interface GUI, col.9 lines 22-26);

record operation information including an operation log (record a sequence of user activities, col.18 lines 8-10) and target information indicating a position being operated (identifies the respective element among the plurality of UI elements of UI interface 38, indicating a position of the respective UI element in an object hierarchy of UI interface 38, col.9 lines 58-67);

identify an operation item corresponding to the operation based on display information of the one or more screens, according to the operation log and the target information (robot design interface GUI responds to the respective input, col.9 lines 22-39; identifies the respective element among the plurality of UI elements of UI interface 38, indicating a position of the respective UI element in an object hierarchy of UI interface 38, col.9 lines 58-67; identifying the respective user-selected UI element among the plurality of UI elements…selector is automatically filled [or manually edited] in response to a positive identification of the respective target element, col.15 lines 1-7);

determine an entry field operated by the user based on the operation log (an operand is the UI element that is acted upon by a current activity such as a click or a keyboard event, col.9 lines 48-56; respective UI element is a form field of a specific form displayed within a specific UI window, col.9 line 67 and col.10 line 1; identifying which UI element [form field] is displayed, col.14 lines 21-23);

generate an operation component associated with a condition corresponding to the operation item based on the title of the entry field and an analysis result of user data entered in the entry field (in response to the user selecting the respective form field as the target of the activity, the user is presented with a popup activity configuration window to enable the user to indicate values of various parameters specific to the respective activity…such parameters include text [form fields "Test User", "Send Text" or "Username" are interpreted as equivalent to the recited feature "title", see col.14 lines 58-67 and col.15 lines 1-16] to be written to the respective target form field, col.14 lines 61-67); and

generate a flow of operations based on the operation component according to an order of operations including the operation by the user (generate a set of activities to be executed by RPA robots 12, col.15 lines 21-23), the flow of operations being a flow to be processed by a computer that executes an application (displays a diagram of a set of activities to be carried out by the robot, the activities effectively mimicking the flow of a business process being automated, col.15 lines 58-67).

With respect to claims 1, 9 and 17, Mayer does not appear to teach: determine a plurality of candidate character strings within a range of the entry field; determine a title of the entry field as: a closest candidate character string to the entry field of the plurality of candidate character strings, or a highest priority candidate character string of the plurality of candidate character strings based on a priority criteria [1].
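To make the claimed flow-generation steps concrete, here is a minimal sketch of recording an operation log and turning it into an ordered flow of operation components. This is an illustration only, not code from Mayer or the application; every name and field is hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Operation:
    """One recorded user operation: the log entry plus target position."""
    action: str   # e.g. "click" or "set_text"
    x: int        # position being operated (the claimed target information)
    y: int
    value: str = ""

@dataclass
class OperationComponent:
    """An identified operation item paired with its input condition."""
    item: str
    condition: str

def generate_flow(log, identify_item, derive_condition):
    """Map each logged operation to a component, preserving operation order."""
    flow = []
    for op in log:
        item = identify_item(op)  # map the operated position to a UI element title
        flow.append(OperationComponent(item, derive_condition(item, op.value)))
    return flow
```

A runner (the "software robot" of claims 8 and 16) would then replay the resulting components in order against the target application.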
However, Hayashi teaches a form reading device that creates form format information (Hayashi: abstract). In particular, Hayashi teaches: determine a plurality of candidate character strings within a range of the entry field (extracting character strings from an entry field, p.12; outputting character strings based on a recognition result, p.12); and determine a title of the entry field as: a closest candidate character string to the entry field of the plurality of candidate character strings (extracting a character string that is a form title candidate…the start position and the width are determined as form title information based on the character string information, p.10; title extraction based on conditions that are form title candidates and previously stored evaluation values that apply when the respective conditions are satisfied, p.13; evaluation values determined for all character strings after character recognition…to determine whether or not a character string can be a form title, p.13; largest total value compared to the total values of the other character strings to determine the form title, p.13). Examiner interprets that Hayashi teaches determining "a closest candidate character string to the entry field" based on comparing the values, when the respective conditions are satisfied, against previously stored values to determine the title candidates.
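The two claimed alternatives for picking a title, the candidate nearest the entry field versus the candidate with the highest priority score, can be sketched as follows. This illustrates the general technique only; it is not code from Hayashi, and the condition functions, coordinates, and dictionary keys are made up.

```python
import math

def closest_candidate(entry_xy, candidates):
    """Title = the candidate character string positioned nearest the entry field."""
    return min(candidates, key=lambda c: math.dist(entry_xy, c["xy"]))

def best_scored_candidate(candidates, scored_conditions):
    """Hayashi-style scoring: sum pre-stored evaluation values for each
    satisfied condition; the candidate with the largest total becomes the
    form title."""
    def total(c):
        return sum(value for cond, value in scored_conditions if cond(c))
    return max(candidates, key=total)
```

For example, conditions such as "ends with a colon" or "is not numeric" could each carry an evaluation value, so a label like "Name:" outscores recognized entry data like "2024/01/01".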
Because Hayashi is also directed to an information processing apparatus (Hayashi: abstract; Mayer: process automation 20 carries out user interface automation activities, figs.1-23), it would have been obvious to one of ordinary skill in the art before the effective filing date to incorporate the teaching of determining a plurality of candidate character strings within a range of the entry field and determining a title of the entry field as a closest candidate character string to the entry field of the plurality of candidate character strings, as taught by Hayashi, with the information processing apparatus as taught by Mayer, in order to create form format information with character recognition…to automatically extract the structure of the form from the read image and make manual corrections (Hayashi: p.5).

With respect to claims 2 and 10, Mayer further teaches wherein the circuitry is further configured to search an area on the one or more screens for one or more candidates of the operation item corresponding to the operation, based on the target information, to identify the one or more candidates of the operation item (UI interface indicates a position of the respective UI element in an object hierarchy of UI interface, col.9 lines 58-67).

With respect to claims 3/2 and 11/10, Mayer further teaches wherein the circuitry is further configured to display the identified one or more candidates of the operation item (indicate that the respective UI element is a form field of a specific form displayed within a specific UI window, col.9 line 67 and col.10 line 1).
With respect to claims 4 and 12, Mayer further teaches wherein the condition corresponding to the operation item is an input condition for an entry field on the one or more screens corresponding to the operation (selector may indicate that the respective UI element is a form field of a specific form displayed within a specific UI window; the selector of a target UI element may be specified at design time by including an encoding of the respective selector in an RPA script configured to carry out an activity on the respective UI element, col.9 lines 66-67 and col.10 lines 1-5).

With respect to claims 5/4 and 13/12, Mayer further teaches wherein the input condition for the entry field on the one or more screens is inherited [2] (the selector of a target UI element may be specified at design time by including an encoding of the respective selector in an RPA script configured to carry out an activity on the respective UI element, col.9 lines 66-67 and col.10 lines 1-5).

With respect to claims 6 and 14, Mayer further teaches wherein the circuitry is further configured to receive an operation for changing the identified operation item by the user (trigger events may trigger a change in the appearance of the display and warrant fetching the contents of the UI tree of the currently displayed GUI context, where the triggered event is received via a user input, col.14 lines 1-5).

With respect to claims 7 and 15, Mayer further teaches wherein the circuitry is further configured to: display the generated operation component of the flow (displays a set of activities to be carried out by robots, the activities effectively mimicking the flow of a business process being automated, col.15 lines 58-66); and receive an operation of editing the operation component by the user (created activity containers may be displayed to the user for editing, col.18 lines 49-50).
With respect to claims 8 and 16, Mayer further teaches wherein the circuitry is further configured to control a software robot to operate the application based on the flow of operations (displays a set of activities to be carried out by robots, the activities effectively mimicking the flow of a business process being automated, col.15 lines 58-66).

With respect to claim 18, Mayer teaches wherein the circuitry is further configured to: display an option for a recording function (user to select an element among the plurality of UI elements displayed within a specific UI window, col.9 lines 58-67 and col.10 lines 1-10; identifying features of the selected target UI element and displaying them to the user, col.17 lines 52-62); and record the operation information in response to a selection, by the user, of the recording function (record a sequence of user activities…GUI 80 may display a recorded activity window 82 showing a recorded sequence of actions (swipe-tap-set text-tap-set text-tap for the exemplary sequence of user actions described above), col.18 lines 6-22).

With respect to claim 19, Mayer teaches wherein the circuitry is further configured to determine the analysis result by validating the user data according to at least one rule (validating the selection (for instance by clicking "Connect")…to establish a connection with the selected RPA model device having the respective characteristics and executing the respective target application, col.12 lines 56-60).

With respect to claim 20, wherein the priority criteria [3] is based on a direction of a respective candidate character string from the entry field.

Conclusion

The additional prior art made of record and not relied upon is considered pertinent to applicant's disclosure: US-2002/0034328-A1, JP-2011108189-A, and JP-4982587-B2. Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action.
Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any extension fee pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to HIEN (CINDY) D KHUU, whose telephone number is (571) 272-8585. The examiner can normally be reached Monday-Friday, 8a-8p. Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Kamini Shah, can be reached at 571-272-2279. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov.
Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/HIEN D KHUU/
Primary Examiner, Art Unit 2116
December 10, 2025

[1] Optional limitation.
[2] A review of the Specification reveals that it does not provide any special meaning for the claim term "inherited". The Merriam-Webster's Collegiate Dictionary defines "inherited" as "to receive from a parent by genetic transmission". Hence, the BRI for the term "inherited" is taught by Mayer as the target UI element being specified at design time.
[3] Optional limitation as recited in each of claims 1, 9, and 17: "or a highest priority candidate character string of the plurality of candidate character strings based on a priority criteria". This limitation appears to be well-known and taught by Naoi et al. (US 2002/0034328-A1; fig.2B and [0200, 0204, 0211]).

Prosecution Timeline

Jul 28, 2022: Application Filed
Nov 30, 2024: Non-Final Rejection — §103
Jan 31, 2025: Interview Requested
Feb 06, 2025: Examiner Interview Summary
Feb 06, 2025: Applicant Interview (Telephonic)
Feb 26, 2025: Response Filed
May 23, 2025: Final Rejection — §103
Aug 07, 2025: Request for Continued Examination
Aug 14, 2025: Response after Non-Final Action
Aug 18, 2025: Non-Final Rejection — §103
Nov 10, 2025: Response Filed
Dec 10, 2025: Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602027: OPERATION CONTROL DEVICE AND PROGRAM
Granted Apr 14, 2026 (2y 5m to grant)
Patent 12591177: METHOD FOR OBTAINING TRAINING DATA FOR TRAINING A MODEL OF A SEMICONDUCTOR MANUFACTURING PROCESS
Granted Mar 31, 2026 (2y 5m to grant)
Patent 12585253: ASSISTANCE DEVICE AND MECHANICAL SYSTEM
Granted Mar 24, 2026 (2y 5m to grant)
Patent 12585250: SYSTEM AND METHOD FOR CYCLE TIME ANALYSIS AND BOTTLENECK DETECTION IN SMART FACTORY ASSEMBLY LINES
Granted Mar 24, 2026 (2y 5m to grant)
Patent 12578714: Gateway And Method For Transforming A Data Model Of A Manufacturing Process Equipment
Granted Mar 17, 2026 (2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 5-6
Grant Probability: 87%
With Interview: 99% (+15.3%)
Median Time to Grant: 2y 9m
PTA Risk: High
Based on 451 resolved cases by this examiner. Grant probability derived from career allow rate.
