Prosecution Insights
Last updated: April 19, 2026
Application No. 18/639,823

INTERFACE PROCESSING METHOD, ELECTRONIC APPARATUS, AND STORAGE MEDIUM

Non-Final OA: §101, §103
Filed
Apr 18, 2024
Examiner
TSUI, WILSON W
Art Unit
2172
Tech Center
2100 — Computer Architecture & Software
Assignee
BEIJING XIAOMI MOBILE SOFTWARE CO., LTD.
OA Round
1 (Non-Final)
Grant Probability: 62% (Moderate)
Expected OA Rounds: 1-2
Time to Grant: 4y 0m
With Interview: 99%

Examiner Intelligence

Grants 62% of resolved cases
Career Allow Rate: 62% (365 granted / 593 resolved; +6.6% vs TC avg)
Strong interview lift: +58.1% (resolved cases with interview vs. without)
Typical timeline: 4y 0m average prosecution; 44 applications currently pending
Career history: 637 total applications across all art units

Statute-Specific Performance

§101: 15.5% (-24.5% vs TC avg)
§103: 52.5% (+12.5% vs TC avg)
§102: 12.0% (-28.0% vs TC avg)
§112: 14.2% (-25.8% vs TC avg)
Black line = Tech Center average estimate • Based on career data from 593 resolved cases
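The headline figures above can be cross-checked from the raw counts. As a quick sanity check (assuming each "vs TC avg" delta is a simple percentage-point difference, which the dashboard does not state explicitly):

```python
# Cross-check the dashboard figures from the raw counts shown above.
granted, resolved = 365, 593

career_allow_rate = granted / resolved
print(f"Career allow rate: {career_allow_rate:.1%}")  # 61.6%, displayed rounded as 62%

# Assumption: each "vs TC avg" value is a simple percentage-point delta,
# so the implied Tech Center average can be recovered per statute.
statutes = {"101": (15.5, -24.5), "103": (52.5, +12.5),
            "102": (12.0, -28.0), "112": (14.2, -25.8)}
for statute, (rate, delta) in statutes.items():
    print(f"§{statute}: implied TC average = {rate - delta:.1f}%")
```

Notably, under that assumption all four statutes imply the same Tech Center average (40.0%), which is consistent with the deltas being computed against a single TC-wide baseline.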

Office Action

§101 §103
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Information Disclosure Statement

The information disclosure statement (IDS) submitted on 01/29/2025 is being considered by the examiner.

Drawings

The drawings filed on 04/18/2024 are accepted.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows:

Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 1, 2, 7-10 and 18-20 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.

101 Analysis for Claim 1

Step 2A, Prong One: Claim 1 recites the following limitations (of which the bolded limitations constitute a 'mental process' that covers performance of the limitations in the human mind):

An interface processing method, applied to an electronic apparatus, and comprising: receiving a touch command based on a target application program; and switching a display state of the target application program from a first display state to a second display state in response to that the touch command conforms to a preset rule, wherein a window layout of the first display state is different from a window layout of the second display state.

As a note, these steps fall within the mental process grouping of abstract ideas because they cover concepts performed in the human mind, including observation, evaluation, judgment, and opinion (see MPEP 2106.04(a)(2), subsection III). With respect to the particular limitations (bolded above), these steps can be practically performed in the human mind using observation, evaluation, judgment and/or opinion.
For example, the particular limitations encompass: 1) evaluating a display state (value) and, based on the evaluation, making a judgment to update/change/switch a display state (value). It is noted that the independent claim does not require what aspects of a display state are 'switched', and thus the claim language encompasses assessing and making a judgment upon a display state value.

Step 2A, Prong Two: With regards to the additional elements '… applied to an electronic apparatus …' and '… of the target application program …', these elements are considered to encompass a generic computer (and its components/functions) used as a tool to perform generic computer functions/operations, such that they amount to no more than mere instructions/operations (i.e., application program instructions) to apply the exception using the generic computer (electronic apparatus). Applying an abstract idea on a generic computer does not integrate the abstract idea into a practical application.

With regards to the additional element 'receiving a touch command based on a target application program', this element is considered to amount to mere data gathering recited at a high level of generality, and thus is insignificant extra-solution activity. See MPEP 2106.05(g) ("whether the limitation is significant"). See MPEP 2106.05 and Mayo, 566 U.S. at 79, 101 USPQ2d at 1968; OIP Techs., Inc. v. Amazon.com, Inc., 788 F.3d 1359, 1363, 115 USPQ2d 1090, 1092-93 (Fed. Cir. 2015) (presenting offers and gathering statistics amounted to mere data gathering). Thus, the additional elements do not integrate the judicial exception into a practical application.
Step 2B: With regards to the additional elements '… applied to an electronic apparatus …' and '… of the target application program …', as discussed above in Step 2A, Prong Two, these amount to no more than elements that encompass a generic computer (and its components/functions) used as a tool to perform generic computer functions/operations. The courts have found 'apply it' (or an equivalent) with the judicial exception, or mere instructions to implement an abstract idea on a computer, insufficient to qualify as 'significantly more' than the judicial exception.

With regards to 'receiving a touch command based on a target application program', as discussed above in Step 2A, Prong Two, this element is considered to amount to mere data gathering recited at a high level of generality, and thus is insignificant extra-solution activity. Adding insignificant extra-solution activity (such as data gathering) to the judicial exception has been found by the courts to be insufficient to qualify as 'significantly more' than the judicial exception.

101 Analysis for Claims 2, 7-10 and 18

Dependent claims 2, 7-10 and 18 do not recite any further limitations that cause the claims to be patent eligible. Rather, the limitations of the dependent claims that have additional elements are directed toward additional aspects of the judicial exception that do not integrate the judicial exception into a practical application nor amount to significantly more than the judicial exception. More specifically, the additional elements ('receiving a/the drag command …' and 'displaying …') are interpreted to encompass data collection and an act of displaying information, which are both considered insignificant extra-solution activities and have been deemed insufficient by the courts to integrate the judicial exception into a practical application and insufficient to qualify as 'significantly more' than the judicial exception.
Therefore, dependent claims 2, 7-10 and 18 are not patent eligible under the same rationale as claim 1.

101 Analysis of Claim 19

Claim 19 is rejected under similar rationale as claim 1. It is noted that claim 19 additionally recites the additional elements of 'a processor' and 'a memory'; these additional elements are considered to merely include instructions to implement an abstract idea on a computer, or to merely use a computer as a tool to perform an abstract idea. The courts have identified these types of 'apply it' limitations using a generic computer as insufficient to integrate the judicial exception into a practical application and insufficient to qualify as 'significantly more' than the judicial exception.

101 Analysis of Claim 20

Claim 20 is rejected under similar rationale as claim 1. It is noted that claim 20 additionally recites 'a non-transitory computer-readable storage medium having stored … executable instruction that, when executed by a processor'; these additional elements are considered to merely include instructions to implement an abstract idea on a computer, or to merely use a computer as a tool to perform an abstract idea. The courts have identified these types of 'apply it' limitations using a generic computer as insufficient to integrate the judicial exception into a practical application and insufficient to qualify as 'significantly more' than the judicial exception.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C.
103, which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1-9, 12-17, 19 and 20 are rejected under 35 U.S.C. 103 as being unpatentable over Bott et al ("Managing and Arranging Windows", published March 2021, NPL pages 1-4) in view of Beek et al (US Application US 2008/0178126, published Jul. 24, 2008, filed Jan. 24, 2007).

With regards to claim 1: Bott et al teaches an interface processing method, applied to an electronic apparatus, and comprising: receiving a … command based on a target application program (NPL Page 2, Fig. 3-12: a user can apply a gesture drag command associated with a target window application state - "Drag the title bar to the top of the screen to maximize the window, or drag the title bar away from the top edge to restore it to its previous window size"); and switching a display state of the target application program from a first display state to a second display state in response to that the touch command conforms to a preset rule (NPL Page 2, Fig.
3-12: a rule to switch an application from a full maximized size to its non-maximized prior window size (interpreted as less than full maximized size) is applied by a gesture that selects and drags the title bar of the target application away from and below the top edge of the displayed screen (interpreted as dragged into an area below the top edge) - "Drag the title bar to the top of the screen to maximize the window, or drag the title bar away from the top edge to restore it to its previous window size"), wherein a window layout of the first display state is different from a window layout of the second display state (NPL Page 2, Fig. 3-12: the first display state is interpreted as the maximized state of the target window application and the second display state is a non-maximized state (less than maximum, the prior window size) - "Drag the title bar to the top of the screen to maximize the window, or drag the title bar away from the top edge to restore it to its previous window size").

However, although Bott et al teaches receiving a command based on a target application program, Bott et al does not explicitly teach a touch command … Yet Beek et al teaches receiving a touch command based on a target application program (Fig. 4c, paragraphs 0019, 0042: a computer embodiment (which uses at least a memory and computing processor to execute instructions) to identify and process a user gesture/command to select and drag a window to a desired area/position is implemented).

It would have been obvious to one of ordinary skill in the art before the effective filing of the invention to have modified Bott et al's ability to process a gesture command based on a target application program's rendering state, such that the gesture command could have been a touch gesture command as taught by Beek et al. The combination would have allowed support of touch technology to make it more intuitive and efficient to interact with windows.

With regards to claim 2:
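The claim-1 limitation the examiner maps onto Bott's title-bar drag can be sketched in plain code. Everything below (the state names, the rule, the command fields) is a hypothetical illustration for following the rejection's reasoning, not the applicant's implementation:

```python
# Hypothetical sketch of the claim-1 limitation: switch the display state when a
# (touch) command conforms to a preset rule. The rule mirrors Bott's example:
# dragging the title bar away from the top edge restores a maximized window.

def conforms_to_preset_rule(command: dict) -> bool:
    # Invented rule for illustration; the claim does not specify it.
    return command.get("target") == "title_bar" and command.get("end_y", 0) > 0

def switch_display_state(current_state: str, previous_state: str, command: dict) -> str:
    """Return the new display state of the target application program."""
    if conforms_to_preset_rule(command):
        return previous_state   # second display state (different window layout)
    return current_state        # rule not met: keep the first display state

drag = {"target": "title_bar", "end_y": 240}
print(switch_display_state("maximized", "restored", drag))  # -> restored
```

As the Step 2A analysis notes, nothing in this logic requires a computer: the evaluation and state switch are exactly the "observation and judgment" the examiner characterizes as a mental process.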
The interface processing method according to claim 1: Bott et al teaches wherein receiving the touch command based on the target application program comprises: receiving a drag command based on the target application program (as similarly explained in the rejection of claim 1 above, a select-and-drag gesture command is applied to the target application's title bar).

With regards to claim 3: The interface processing method according to claim 2: Bott et al and Beek et al teach wherein switching the display state of the target application program from the first display state to the second display state in response to that the touch command conforms to the preset rule comprises: switching the display state of the target application program from the first display state to the second display state, in response to that an endpoint of a motion trajectory corresponding to the drag command is in a preset area of a screen of the electronic apparatus (as similarly explained in the rejection of claim 1 above, the target application is switched from a maximized/full screen state to a smaller non-maximized prior-sized state when the user applies a drag gesture command of the title bar to a point below the top edge of the screen (also see Fig. 3-12, which shows windows can have a prior ratio of length and width size), and is rejected under similar rationale).

With regards to claim 4: The interface processing method according to claim 3: Bott et al and Beek et al teach further comprising: prompting the window layout of the second display state, in response to that the motion trajectory corresponding to the drag command enters the preset area (as explained in the rejection of claim 1 above, the drag enters an area below and away from the top edge and a second display state applying a prior-sized window layout is applied/visually prompted, and is rejected under similar rationale).

With regards to claim 5:
The interface processing method according to claim 4: Bott et al and Beek et al teach further comprising: displaying the window layout of the second display state through a mask layer area of a first preset dimension (as similarly explained in the rejection of claim 1, the gesture applied causes the target window layout to transition to a second positioned window that has a smaller (not maximized) prior-sized window, where the rendered area of the prior-sized window is interpreted as a defined (mask) area having the prior-sized window dimensions (a rendered window having the dimensions is also interpreted as having length-by-width dimensions, and an example of a window capable of having length and width dimensions is shown in Fig. 3-12)), and is rejected under similar rationale.

With regards to claim 6: The interface processing method according to claim 5: Bott et al teaches wherein a size of the first preset dimension is associated with the window layout of the second display state, as similarly explained in the rejection of claim 5 (a first dimension could be, for example, a length dimension of the prior-sized window (an example of a window capable of having length and width dimensions is shown in Fig. 3-12)), and is rejected under similar rationale.

With regards to claim 7: The interface processing method according to claim 1: Bott et al teaches further comprising: displaying the target application program in a second preset dimension in response to executing the touch command, as similarly explained in the rejection of claim 5 (a second dimension could be, for example, a width dimension of the prior-sized window), and is rejected under similar rationale.

With regards to claim 8:
The interface processing method according to claim 2: Bott et al teaches wherein receiving the drag command based on the target application program comprises: receiving the drag command of a control displayed on the target application program (as similarly explained in the rejection of claim 1, the user drags a title bar (interpreted as an interactive control), which then resizes the application associated with the title bar to a smaller (non-maximized) window having prior-size dimensions, and is rejected under similar rationale); or receiving the drag command of a control on an icon of the target application program.

With regards to claim 9: The interface processing method according to claim 1: Bott et al teaches further comprising: setting the window layout of the first display state and the window layout of the second display state according to the target application program (as similarly explained in the rejection of claim 1, the target window layout of the first display state is according to a maximized configuration/setting of the target application and the window layout of the second display state is according to a prior-sized configuration/setting of the target application).

With regards to claim 12:
The interface processing method according to claim 3: Bott et al and Beek et al teach wherein: an interface comprises a plurality of preset areas, different preset areas corresponding to different display states; the first display state is that the target application program is in a closed mode, a small window mode, a split screen mode, or a full screen mode (as explained in the rejection of claim 1, the first display state is a maximized/full-screen mode, and is rejected under similar rationale); and the second display state is that the target application program is in a small window mode, a split screen mode, or a full screen mode (as similarly explained in the rejection of claim 1, the second display state is a non-maximized/smaller prior-sized window, and is rejected under similar rationale).

With regards to claim 13: The interface processing method according to claim 12: Bott et al teaches wherein: the interface comprises a first area (as similarly explained in the rejection of claim 1 above, a first area can be an area below a top edge of a title bar area indicated by a drag-down gesture), a second area (Bott et al, Fig.
3-12: a second area can be a left split screen area where the target application is dragged to a left edge of a display screen area), and a third area (Bott et al, NPL Pages 2 and 3: a third area can be a top edge screen area where the target application is dragged to a top of a display screen area - "Maximize window"); the first area is a preset area corresponding to the small window mode (Bott et al, NPL Page 2: the first area corresponds to a non-maximized prior-sized window (smaller window) - "drag the title bar away from the top edge to restore it to its previous window size"); the second area is a preset area corresponding to the split screen mode (Bott et al, NPL Page 2: the second area is a split screen mode); and the third area is a preset area corresponding to the full screen mode (Bott et al, NPL Pages 2 and 3: a third area can be a top edge screen area where the target application is dragged to a top of a display screen area - "Maximize window").

With regards to claim 14: The interface processing method according to claim 13: Bott et al and Beek et al teach wherein switching the display state of the target application program from the first display state to the second display state, in response to that the endpoint of the motion trajectory corresponding to the drag command is in the preset area of the screen of the electronic apparatus comprises: switching the display state of the target application program into the small window mode, in response to that the endpoint of the motion trajectory corresponding to the drag command is in the first area of the screen of the electronic apparatus (as explained in the rejection of claim 1, Bott et al teaches the drag motion towards a point below the top edge of the screen is identified and the display state of the target application goes from maximized/full-screen to a smaller window having a prior size); or switching the display state of the target application program into the split screen mode, in response to
that the endpoint of the motion trajectory corresponding to the drag command is in the second area of the screen of the electronic apparatus; or switching the display state of the target application program into the full screen mode, in response to that the endpoint of the motion trajectory corresponding to the drag command is in the third area of the screen of the electronic apparatus.

With regards to claim 15: The interface processing method according to claim 13: Bott et al and Beek et al teach further comprising: switching a display state of a non-target application program from a third display state to a fourth display state in response to that the endpoint of the motion trajectory corresponding to the drag command is in the preset area of the screen of the electronic apparatus, wherein a window layout of the third display state is different from a window layout of the fourth display state (Bott et al, NPL Page 2, Fig. 3-12: teaches another app separate from the target app gets switched to a different split area than the target app's split area in response to the user dragging the target window application to a particular side area (such as a left edge area of the screen)).

With regards to claim 16: The interface processing method according to claim 15: Bott et al and Beek et al teach wherein the third display state is that the non-target application is in the small window mode, the split screen mode, or the full screen mode, and the fourth display state is that the non-target application is in the small window mode, the split screen mode, the full screen mode, or the closed mode (Bott et al, NPL Page 2, Fig. 3-12: Bott et al teaches the non-target application is displayed in a split area screen mode different from the target application's area for split screen).

With regards to claim 17:
The interface processing method according to claim 16: Bott et al and Beek et al teach wherein switching the display state of the non-target application program from the third display state to the fourth display state in response to that the endpoint of the motion trajectory corresponding to the drag command is in the preset area of the screen of the electronic apparatus comprises: switching the display state of the non-target application program into a non-small window mode, in response to that the endpoint of the motion trajectory corresponding to the drag command is in the first area of the screen of the electronic apparatus; or switching the display state of the non-target application program into the small window mode or the split screen mode, in response to that the endpoint of the motion trajectory corresponding to the drag command is in the second area of the screen of the electronic apparatus; or switching the display state of the non-target application program into the closed mode, in response to that the endpoint of the motion trajectory corresponding to the drag command is in the third area of the screen of the electronic apparatus (Bott et al, NPL Pages 2 and 3: teaches maximizing the target application will close the non-target application's display, as the target application takes up the entire area of the screen. Also, an additional gesture is available to minimize the non-target window(s) by applying a subsequent gesture to shake the target window).

With regards to claim 19:
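The claims 13-14 mapping from drag endpoint to display mode can be illustrated with a small dispatcher. The 10%-strip geometry below is an invented assumption; the claims only require that distinct preset areas correspond to distinct display states:

```python
# Illustrative sketch of claims 13-14: the screen contains preset areas, and the
# endpoint of the drag trajectory selects the target application's new mode.
# Area geometry (10% edge strips) is hypothetical, not from the claims.

def mode_for_endpoint(x: float, y: float, width: float, height: float) -> str:
    if y < 0.10 * height:       # third area: top-edge strip -> full screen
        return "full_screen"
    if x < 0.10 * width:        # second area: left-edge strip -> split screen
        return "split_screen"
    return "small_window"       # first area: remainder below the top edge

print(mode_for_endpoint(500, 20, 1920, 1080))   # -> full_screen
print(mode_for_endpoint(50, 600, 1920, 1080))   # -> split_screen
print(mode_for_endpoint(960, 540, 1920, 1080))  # -> small_window
```

Claims 15-17 extend the same dispatch to a non-target application (e.g., closing or splitting it when the target is maximized), which would amount to a second lookup keyed on the same endpoint area.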
Bott et al and Beek et al teach an electronic apparatus, comprising: a processor; and a memory for storing an instruction executable by the processor; wherein the processor is configured to: receive a touch command based on a target application program; and switch a display state of the target application program from a first display state to a second display state in response to that the touch command conforms to a preset rule, wherein a window layout of the first display state is different from a window layout of the second display state, as similarly explained in the rejection of claim 1, and is rejected under similar rationale.

With regards to claim 20: Bott et al and Beek et al teach a non-transitory computer-readable storage medium having stored therein an executable instruction that, when executed by a processor, implements: receiving a touch command based on a target application program; and switching a display state of the target application program from a first display state to a second display state in response to that the touch command conforms to a preset rule, wherein a window layout of the first display state is different from a window layout of the second display state, as similarly explained in the rejection of claim 1, and is rejected under similar rationale.

Claims 10 and 11 are rejected under 35 U.S.C. 103 as being unpatentable over Bott et al ("Managing and Arranging Windows", published March 2021, NPL pages 1-4) in view of Beek et al (US Application US 2008/0178126, published Jul. 24, 2008, filed Jan. 24, 2007), further in view of Kaufthal et al (US Application US 2015/0277682, published Oct. 1, 2015, filed Jul. 22, 2014).

With regards to claim 10:
The interface processing method according to claim 1: the combination of Bott et al and Beek et al teaches further comprising: setting the window layout of the first display state and the window layout of the second display state, as similarly explained in the rejection of claim 1, and is rejected under similar rationale. However, the combination does not teach … according to a type of the electronic apparatus. Yet Kaufthal et al teaches … according to a type of the electronic apparatus (paragraph 0041: the amount of display space/area available to an application is considered (based on display screen size/type) before a display state/configuration/region is implemented). It would have been obvious to one of ordinary skill in the art before the effective filing of the invention to have modified Bott et al and Beek et al's ability to change/set a window layout to be a first and/or second display state/configuration, such that the configuration being considered is based upon a type of the electronic apparatus, as taught by Kaufthal et al. The combination would have allowed implementing a software application that can adapt to devices of all shapes and sizes (Kaufthal et al, paragraph 0001).

With regards to claim 11: The interface processing method according to claim 3: the combination of Bott et al, Beek et al and Kaufthal et al teaches further comprising: determining a size of the preset area according to a type of the electronic apparatus, as similarly explained in the rejection of claim 10 (the size of the area is considered for a device type having a particular screen size), and is rejected under similar rationale.

Claim 18 is rejected under 35 U.S.C. 103 as being unpatentable over Bott et al ("Managing and Arranging Windows", published March 2021, NPL pages 1-4) in view of Beek et al (US Application US 2008/0178126, published Jul. 24, 2008, filed Jan. 24, 2007), further in view of StackOverflow ("Application does not support the current display size", published Dec.
2017, page 1).

With regards to claim 18: The interface processing method according to claim 1: Bott et al and Beek et al teach further comprising: … the target application program … the second display state … the electronic apparatus … switching the display state of the target application program from the first display state to the second display state, as similarly explained in the rejection of claim 1, and is rejected under similar rationale. However, the combination does not expressly teach … prompting that the target application program does not support the second display state, in case that the electronic apparatus does not support switching the display state. Yet StackOverflow teaches … prompting that the target application program does not support the … display state, in the case that the electronic apparatus does not support switching the display state (Page 1: the device/apparatus with a particular configuration does not support a switch to a display state for a target application, and a prompt is provided to indicate such). It would have been obvious to one of ordinary skill in the art before the effective filing of the invention to have modified Bott et al and Beek et al's ability to switch between display states, such that a prompt is provided when an unsupported display state is switched to for a target application, as taught by StackOverflow. The combination would have provided sufficient reasons/information to the user when an issue of support is encountered.

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.

Buening (US Application US 20130263042): This reference teaches managing display of multiple applications for a touch panel device.

Sun et al (US Application US 2013/0120294): This reference teaches allocating space on a touch screen for a plurality of applications.
Mo (US Application US 20230359343): This reference teaches processing a touch gesture and switching display modes based on the type of gesture.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to WILSON W TSUI, whose telephone number is (571) 272-7596. The examiner can normally be reached Monday - Friday, 9 am - 6 pm.

Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Adam Queler, can be reached at (571) 272-4140. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/WILSON W TSUI/
Primary Examiner, Art Unit 2172

Prosecution Timeline

Apr 18, 2024
Application Filed
Jan 03, 2026
Non-Final Rejection — §101, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602535
COMMENT DISPLAY METHOD AND APPARATUS OF A DOCUMENT, AND DEVICE AND MEDIUM
2y 5m to grant Granted Apr 14, 2026
Patent 12589766
AUTONOMOUS DRIVING SYSTEM AND METHOD OF CONTROLLING SAME
2y 5m to grant Granted Mar 31, 2026
Patent 12570284
AUTONOMOUS DRIVING METHOD AND DEVICE FOR A MOTORIZED LAND VEHICLE
2y 5m to grant Granted Mar 10, 2026
Patent 12552376
VEHICLE CONTROL APPARATUS
2y 5m to grant Granted Feb 17, 2026
Patent 12511993
SYSTEMS AND METHODS FOR CONFIGURING A HIERARCHICAL TRAFFIC MANAGEMENT SYSTEM
2y 5m to grant Granted Dec 30, 2025
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 62%
With Interview: 99% (+58.1%)
Median Time to Grant: 4y 0m
PTA Risk: Low
Based on 593 resolved cases by this examiner. Grant probability derived from career allow rate.
