Prosecution Insights
Last updated: April 19, 2026
Application No. 18/655,288

FILE EDITING PROCESSING METHOD AND APPARATUS AND ELECTRONIC DEVICE

Status: Non-Final OA (§102)
Filed: May 05, 2024
Examiner: HAILU, TADESSE
Art Unit: 2174
Tech Center: 2100 — Computer Architecture & Software
Assignee: Vivo Mobile Communication Co., Ltd.
OA Round: 1 (Non-Final)
Grant Probability: 78% (Favorable)
Expected OA Rounds: 1-2
Time to Grant: 3y 4m
With Interview: 82%

Examiner Intelligence

Career Allow Rate: 78% — above average (747 granted / 960 resolved; +22.8% vs TC avg)
Interview Lift: +4.5% — minimal lift, based on resolved cases with interview
Avg Prosecution: 3y 4m typical timeline; 29 applications currently pending
Total Applications: 989 across all art units

Statute-Specific Performance

§101: 5.8% (-34.2% vs TC avg)
§103: 38.1% (-1.9% vs TC avg)
§102: 41.1% (+1.1% vs TC avg)
§112: 9.0% (-31.0% vs TC avg)
Deltas are relative to the Tech Center average estimate • Based on career data from 960 resolved cases
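The per-statute deltas above can be cross-checked. Assuming each "vs TC avg" figure is simply the examiner's rate minus the Tech Center average (an assumption about how the dashboard computes it, not stated in the source), the implied TC baseline can be recovered by subtraction; the dict literals below just restate the table:

```python
# Recover the implied Tech Center average for each statute:
# tc_avg = examiner_rate - delta (assumed relationship).
rates = {"101": 5.8, "103": 38.1, "102": 41.1, "112": 9.0}      # examiner, %
deltas = {"101": -34.2, "103": -1.9, "102": 1.1, "112": -31.0}  # vs TC avg, %

implied_tc_avg = {s: round(rates[s] - deltas[s], 1) for s in rates}
print(implied_tc_avg)  # all four statutes imply the same ~40.0% baseline
```

Under that assumption, every statute implies an identical 40.0% baseline, which suggests the deltas are measured against a single overall Tech Center figure rather than per-statute averages.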

Office Action

§102
DETAILED ACTION

Notice of Pre-AIA or AIA Status

1. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

2. This Office Action is in response to the application filed on 05/05/2024.

3. The IDS filed on 03/10/2025 is considered and entered into the application file.

4. All the pending claims 1-17 are examined herein.

Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action: A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

5. Claims 1-17 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Kotler et al. (US 20130019172 A1).

As per claim 1, Kotler discloses a file editing processing method (FIG. 9 illustrates a logic flow diagram for a process of employing a launcher mechanism for context based menus according to embodiments), comprising: displaying a target interface (text and/or graphic document editor interface of Figs. 2A, 2B, and 2C), wherein the target interface comprises at least one target object (see textual and/or graphic document) and at least one functional control (see textual menu 204 and command icons 206), and the functional control corresponds to at least one function ([0028]: referring to user interface 202, a launcher indicator 214 may be used in vicinity of a selection between selection handles 210 and 212 on the user interface); receiving a first input for a first control in the at least one functional control ([0028]: see selection input received/applied to the displayed document via selection between selection handles 210 and 212 on the user interface, Figs. 2A-2C); in response to the first input (i.e., selection input received/applied via selection handles 210 and 212), displaying a target icon (launcher indicator 214) and updating a display position of the target icon, wherein the target icon corresponds to a same function as the first control ([0032]: "According to further embodiments, a user gesture may be used to move the launcher. For example, if the launcher happens to be over content that the user needs to get to, then they could press and hold on the launcher 'dislodging it' and then drag and drop it elsewhere on the screen"); and in a case that the target icon at least partially overlaps with a first object in the at least one target object, setting the first object according to the function corresponding to the target icon ([0043]: FIG. 6 illustrates an example dynamic location adjustment of a launcher indicator according to other embodiments; various dynamic location and/or size adjustments may be employed in association with a launcher indicator based on selected content, available display area, other content elements, device type, and so on; also see [0044]).

As per claim 2, Kotler further discloses the method according to claim 1, wherein the target object comprises N objects, the target interface further comprises N identifiers corresponding respectively to the N objects, and after the displaying a target interface (as illustrated in at least Figs. 2A-2C, user interface 202 includes a target object comprising one or more objects, such as textual content in which a portion of the text is identified and in a selected state; see the identified text portion highlighted between handles 210 and 212, Figs. 2A-2C; a launcher indicator 214 may be used in vicinity of a selection between selection handles 210 and 212 on the user interface, also see [0028]), the method further comprising: receiving a second input for a target identifier in the N identifiers ([0028]: referring to user interface 202, a launcher indicator 214 may be used in vicinity of a selection between selection handles 210 (first input) and 212 (second input) on the user interface); and in response to the second input, adjusting an object corresponding to the target identifier to be in a target state ([0032]: as illustrated in Figs. 2A-2C, a user drags via indicator 236 to select a target text); wherein in a case that the object is in the target state, the object is editable ([0040]: FIG. 5 illustrates an example disappearance of a launcher indicator according to some embodiments; as shown on user interface 502, a launcher indicator 506 according to embodiments may be invoked in response to selection of a portion of displayed content, tapping action on a fixed indicator on the screen, tapping on a selected object or text (504), tapping on selection handles, or a keyboard combination).

As per claim 3, Kotler further discloses the method according to claim 2, wherein after the adjusting an object corresponding to the target identifier to be in a target state, the method further comprises: receiving a third input for the target identifier ([0034]: for example, indicator 310 including a letter may be used to represent a context based menu that includes text attribute related commands (e.g., font size, font style, font color, etc.)); and in response to the third input, displaying related information of the object corresponding to the target identifier ([0034]-[0035]: as the user applies the related commands (e.g., font size, font style, font color, etc.), corresponding changes will be reflected on the selected text accordingly).

As per claim 4, Kotler further discloses the method according to claim 1, wherein the first control corresponds to a marking function, and after the updating a display position of the target icon, the method further comprises: in a case that the display position of the target icon is at a target position of the target object, displaying a marker at the target position (as illustrated in Figs. 2A-2C, desired target positions are marked by displaying markers; that is, the selected border region between handle 210 and handle 212 is marked, highlighted, or selected, also see [0028]), wherein the marker is used to divide context and the marker is also used to indicate at least one of the following information: time of inserting the marker; a position of inserting the marker; or a position of the marker in the target interface (as illustrated in Figs. 2A-2C, the position of the marker, i.e., the border region selected by handles 210 and 212, is shown positioned in the target interface display).

As per claim 5, Kotler further discloses the method according to claim 1, wherein the first object is an object of a target type (see the portions of the text document displayed in Figs. 2A-2C), and the in a case that the target icon at least partially overlaps with a first object in the at least one target object (see the overlapped icon; indicator 236 overlaps the portions of the text document, Fig. 2C), setting the first object according to the function corresponding to the target icon comprises: displaying a target control in the case that the target icon at least partially overlaps with a first object in the at least one target object (see indicator 236 overlapped on the portion of the text document, Fig. 2C); and in a case that an input for the target control has been received, setting all objects in the at least one target object that are of the same type as the first object according to the function corresponding to the target icon ([0032]: "Upon selection of the graphic content 240 (e.g., by tapping on it), the launcher indicator 242 appears near the selected object and activates a context based menu associated with graphic object related commands. According to further embodiments, a user gesture may be used to move the launcher. For example, if the launcher happens to be over content that the user needs to get to, then they could press and hold on the launcher 'dislodging it' and then drag and drop it elsewhere on the screen.").

As per electronic device claims 6-10, Kotler further discloses an electronic device (computing device 800, Fig. 8); the limitations of the electronic device claims are similar to those of method claims 1-5, respectively, thus claims 6-10 are also rejected under similar citations given to the method claims.

As per non-transitory readable storage medium claims 11-15, Kotler further discloses an electronic device (computing device 800, with system memory 804, Fig. 8); the limitations of the storage medium claims are similar to those of method claims 1-5, respectively, thus claims 11-15 are also rejected under similar citations given to the method claims.

As per claim 16, Kotler further discloses a chip (which includes at least one processing unit 802 and system memory 804, Fig. 8), wherein the chip comprises a processor (processing unit 802) and a communication interface (communication connection(s) 818), the communication interface is coupled to the processor, and the processor is configured to execute a program or instructions to implement the steps of the file editing processing method according to claim 1 (see Fig. 8).
As per claim 17, Kotler further discloses a computer program product (program modules, application, context menu module, and detection module, stored in system memory of Fig. 8), wherein the computer program product is stored in a non-transient storage medium (Fig. 8), and the computer program product is executed by at least one processor to implement the steps of the file editing processing method according to claim 1.

Conclusion

6. The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.

US 20190073095 A1 discloses a method that involves displaying an application program of a navigation bar by a user interface. Virtual button display of the navigation bar is provided and controlled by an operating system. A determination is made to check whether editable objects are in the user interface. A navigation pane is controlled corresponding to the editable objects. A first signal is received and generated according to a first specified operation. Edit control is triggered by the generated signal. An editable-object editing operation is performed corresponding to the first signal. A second signal is received according to a second specified operation.

US 10534502 B1 discloses a device, method, and graphical user interface for positioning a selection and selecting text on a mobile computing device with a touch-sensitive display. This includes: displaying a selection having a selection start point and a selection end point within text content; displaying a control icon; detecting a contact on the touch-sensitive display; and in response to detecting a change in a horizontal and vertical position of the contact beginning anywhere on the control icon: changing a selection position, wherein a horizontal position of the selection start point is changed by an amount proportional to the change in a horizontal position of the contact and a vertical position of the selection start point is changed by an amount proportional to the change in a vertical position of the contact, and wherein the horizontal position of the selection start point with respect to the control icon is changed and a vertical position of the control icon is changed.

7. Any inquiry concerning this communication or earlier communications from the examiner should be directed to TADESSE HAILU, whose telephone number is (571) 272-4051; the email address is Tadesse.hailu@USPTO.GOV. The examiner can normally be reached Monday-Friday, 9:30-5:30 (Eastern time).

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Bashore, William L., can be reached at (571) 272-4088. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/TADESSE HAILU/
Primary Examiner, Art Unit 2174

Prosecution Timeline

May 05, 2024
Application Filed
Jan 14, 2026
Non-Final Rejection — §102 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12596435
CONTACT OR CONTACTLESS INTERFACE WITH TEMPERATURE HAPTIC FEEDBACK
2y 5m to grant • Granted Apr 07, 2026
Patent 12578976
SYSTEMS AND METHODS FOR AFFINITY-DRIVEN INTERFACE GENERATION
2y 5m to grant • Granted Mar 17, 2026
Patent 12578849
METHOD, APPARATUS, ELECTRONIC DEVICE AND READABLE STORAGE MEDIUM FOR PAGE PROCESSING
2y 5m to grant • Granted Mar 17, 2026
Patent 12572198
USER INTERFACES FOR GAZE TRACKING ENROLLMENT
2y 5m to grant • Granted Mar 10, 2026
Patent 12566621
CUSTOMIZATION AND ENRICHMENT OF USER INTERFACES USING LARGE LANGUAGE MODELS
2y 5m to grant • Granted Mar 03, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 78%
With Interview: 82% (+4.5%)
Median Time to Grant: 3y 4m
PTA Risk: Low
Based on 960 resolved cases by this examiner. Grant probability derived from career allow rate.
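The headline projections are consistent with simple arithmetic on the examiner's career data: the 78% grant probability matches 747 granted out of 960 resolved, and the with-interview figure matches adding the +4.5% interview lift. A minimal sketch, assuming the dashboard derives the numbers exactly this way (the source does not state the formula):

```python
# Cross-check the projected grant probabilities against the career data.
granted, resolved = 747, 960
interview_lift = 4.5  # percentage points, from the interview-lift stat

grant_prob = 100 * granted / resolved          # 77.8125, displayed as 78%
with_interview = grant_prob + interview_lift   # 82.3125, displayed as 82%

print(round(grant_prob))      # 78
print(round(with_interview))  # 82
```

Both rounded values match the dashboard's displayed figures, supporting the footnote's statement that grant probability is derived directly from the career allow rate.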

Free tier: 3 strategy analyses per month