Prosecution Insights
Last updated: April 19, 2026
Application No. 18/373,917

Device, Method, and Graphical User Interface for Providing Navigation and Search Functionalities

Non-Final OA: rejections under §101, §103, and nonstatutory double patenting
Filed: Sep 27, 2023
Examiner: ORR, HENRY W
Art Unit: 2172
Tech Center: 2100 (Computer Architecture & Software)
Assignee: Apple Inc.
OA Round: 1 (Non-Final)
Grant Probability: 50% (Moderate)
Expected OA Rounds: 1-2
Expected Time to Grant: 3y 10m
Grant Probability with Interview: 88%

Examiner Intelligence

Career Allow Rate: 50% (230 granted / 456 resolved; -4.6% vs TC avg)
Interview Lift: +37.2% (strong; comparing resolved cases with vs. without an interview)
Avg Prosecution: 3y 10m (typical timeline)
Currently Pending: 29 applications
Total Applications: 485 (career history, across all art units)
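The headline figures above can be reproduced from the underlying counts. A minimal sketch, assuming the 230 granted / 456 resolved counts shown on this page and a Tech Center average baseline of 55.0% (inferred here from the reported -4.6% delta, not stated directly on the page):

```python
# Sketch: derive the headline examiner stats from raw counts.
# The 230/456 counts come from the page above; the 55.0% TC-average
# baseline is an assumption inferred from the reported -4.6% delta.
granted = 230
resolved = 456

allow_rate = granted / resolved * 100           # career allow rate, in percent
tc_avg = 55.0                                   # assumed Tech Center average
delta_vs_tc = allow_rate - tc_avg               # negative => below TC average

print(f"Career allow rate: {allow_rate:.1f}%")  # -> 50.4%
print(f"Delta vs TC avg: {delta_vs_tc:+.1f}%")  # -> -4.6%
```

The rounding matches the dashboard: 230/456 is 50.4% to one decimal, and 50.4% against a 55.0% baseline reproduces the -4.6% figure.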

Statute-Specific Performance

§101
6.8%
-33.2% vs TC avg
§103
53.4%
+13.4% vs TC avg
§102
19.4%
-20.6% vs TC avg
§112
15.1%
-24.9% vs TC avg
Black line = Tech Center average estimate • Based on career data from 456 resolved cases
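The per-statute Tech Center baselines are not shown directly, but each can be backed out from the examiner's rate and its reported delta (rate - TC avg = delta). A small sketch using the rates from the table above (the dictionary layout is just one illustrative choice):

```python
# Back out the implied Tech Center average for each statute from the
# examiner's rate and the reported delta: tc_avg = rate - delta.
stats = {              # statute: (examiner rate %, delta vs TC avg %)
    "101": (6.8, -33.2),
    "103": (53.4, +13.4),
    "102": (19.4, -20.6),
    "112": (15.1, -24.9),
}

implied_tc_avg = {s: round(rate - delta, 1) for s, (rate, delta) in stats.items()}
print(implied_tc_avg)  # -> {'101': 40.0, '103': 40.0, '102': 40.0, '112': 40.0}
```

Every statute implies the same 40.0% baseline, which suggests the chart's "TC average estimate" is a single figure applied across all four statutes rather than a per-statute average.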

Office Action

Rejections under §101, §103, and nonstatutory double patenting
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

DETAILED ACTION

1. This action is responsive to the application communication filed on 9/27/2023.
2. Claims 1-16 are pending in the case.
3. Claims 1, 15, and 16 are independent claims.

Examiner Note: Dependent claims 11-13 were not given ODP rejections.

Double Patenting

The nonstatutory double patenting rejection is based on a judicially created doctrine grounded in public policy (a policy reflected in the statute) so as to prevent the unjustified or improper timewise extension of the "right to exclude" granted by a patent and to prevent possible harassment by multiple assignees. A nonstatutory double patenting rejection is appropriate where the conflicting claims are not identical, but at least one examined application claim is not patentably distinct from the reference claim(s) because the examined application claim is either anticipated by, or would have been obvious over, the reference claim(s). See, e.g., In re Berg, 140 F.3d 1428, 46 USPQ2d 1226 (Fed. Cir. 1998); In re Goodman, 11 F.3d 1046, 29 USPQ2d 2010 (Fed. Cir. 1993); In re Longi, 759 F.2d 887, 225 USPQ 645 (Fed. Cir. 1985); In re Van Ornum, 686 F.2d 937, 214 USPQ 761 (CCPA 1982); In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970); In re Thorington, 418 F.2d 528, 163 USPQ 644 (CCPA 1969).

A timely filed terminal disclaimer in compliance with 37 CFR 1.321(c) or 1.321(d) may be used to overcome an actual or provisional rejection based on nonstatutory double patenting provided the reference application or patent either is shown to be commonly owned with the examined application, or claims an invention made as a result of activities undertaken within the scope of a joint research agreement. See MPEP § 717.02 for applications subject to examination under the first inventor to file provisions of the AIA as explained in MPEP § 2159.
See MPEP § 2146 et seq. for applications not subject to examination under the first inventor to file provisions of the AIA. A terminal disclaimer must be signed in compliance with 37 CFR 1.321(b).

The filing of a terminal disclaimer by itself is not a complete reply to a nonstatutory double patenting (NSDP) rejection. A complete reply requires that the terminal disclaimer be accompanied by a reply requesting reconsideration of the prior Office action. Even where the NSDP rejection is provisional, the reply must be complete. See MPEP § 804, subsection I.B.1. For a reply to a non-final Office action, see 37 CFR 1.111(a). For a reply to a final Office action, see 37 CFR 1.113(c). A request for reconsideration, while not provided for in 37 CFR 1.113(c), may be filed after final for consideration. See MPEP §§ 706.07(e) and 714.13.

The USPTO Internet website contains terminal disclaimer forms which may be used. Please visit www.uspto.gov/patent/patents-forms. The actual filing date of the application in which the form is filed determines what form (e.g., PTO/SB/25, PTO/SB/26, PTO/AIA/25, or PTO/AIA/26) should be used. A web-based eTerminal Disclaimer may be filled out completely online using web-screens. An eTerminal Disclaimer that meets all requirements is auto-processed and approved immediately upon submission. For more information about eTerminal Disclaimers, refer to www.uspto.gov/patents/apply/applying-online/eterminal-disclaimer.

Claims 1, 4-9, 15, and 16 are rejected on the ground of nonstatutory double patenting as being unpatentable over claim 11 of U.S. Patent No. 10481769 in view of Chaudhri; Imran, U.S. Published Application No. 20110252357 A1.

Claim 1: Claim 11 of Patent No.
10481769 teaches A method, comprising: at an electronic device with a touch-sensitive display: (claim 11; A method comprising: at an electronic device that includes a touch-sensitive display: displaying, on the touch-sensitive display) displaying on the touch-sensitive display a first user interface; (claim 11; displaying, on the touch-sensitive display, a first page of a multi-page application launch interface that includes multiple pages of application icons for launching distinct applications,) while displaying the first user interface, detecting a swipe gesture; (claim 11; while displaying the first page of the multi-page application launch interface on the touch-sensitive display, detecting, on the touch-sensitive display, a first input that includes detecting a first contact and detecting movement of the first contact on the touch-sensitive display;) in response to detecting the swipe gesture: in accordance with a determination that the swipe gesture meets first criteria, wherein the first criteria include a criterion that is met when the swipe gesture starts at a first edge of the touch-sensitive display and moves away from the first edge, displaying a reduced scale representation of the first user interface and concurrently displaying a reduced scale representation of a second user interface not displayed immediately prior to detecting the swipe gesture; (claim 11; in accordance with a determination that the first input includes movement of the first contact starting from the first edge of the touch-sensitive display and moving away from the first edge, displaying a multitasking user interface that includes a plurality of concurrently displayed representations of open applications.) Claim 11 of Patent No. 
10481769 fails to expressly teach while concurrently displaying the reduced scale representation of the first user interface and the reduced scale representation of the second user interface: detecting a tap gesture at a location on the touch-sensitive display corresponding to the second user interface; and in response to detecting the tap gesture at a location on the touch-sensitive display corresponding to the second user interface, ceasing to display the first user interface and displaying a larger scale representation of the second user interface than the reduced scale representation of the second user interface. However, Chaudhri teaches while concurrently displaying the reduced scale representation of the first user interface and the reduced scale representation of the second user interface: detecting a tap gesture at a location on the touch-sensitive display corresponding to the second user interface; (e.g., while browser application view 5008-12 and mail application view 5008-10 are concurrently displayed at a reduced scale representations, detecting a tap gesture 535 on the mail application view 5008-10; par. 240; FIG. 5EE illustrates that, in response to detecting input 527, a portion of home screen 5001, web browser application view 5008-12, and a portion of mail application view 5008-10 are displayed. FIG. 5EE also illustrates that tap gesture 535 can be detected at a location that corresponds to image 5008-10 of mail application.) and in response to detecting the tap gesture at a location on the touch-sensitive display corresponding to the second user interface, ceasing to display the first user interface and displaying a larger scale representation of the second user interface than the reduced scale representation of the second user interface. (e.g., in response to tap gesture 535, displaying mail application view without any other application view (i.e., displaying a larger scale of mail application view just like map application view of Fig. 
5GG) par. 240; In response to detecting tap gesture 535, mail application view 5004-4 (as shown in FIG. 5CC) will be displayed without concurrently displaying any other application view. Par. 241; In FIG. 5FF, when gesture 531 (e.g., a tap gesture) is detected at a location that corresponds to map application view 5008-6, in response, a map application view is displayed, as shown in FIG. 5GG. ) It would have been obvious to one of ordinary skill in the art before effective filing date of the claimed invention to modify the concurrently displayed representations of open applications as taught by Claim 11 of Patent No. 10481769 to be selected via tap gestures like the reduced scale applications as taught by Chaudhri to provide the benefit of quickly scrolling and selecting to the next application (see Chaudhri; par. 240) Claim 4 depends on claim 1: Claim 11 of Patent No. 10481769 teaches wherein the second user interface is a user interface for an open application. (claim 11; displaying, on the touch-sensitive display, a first page of a multi-page application launch interface that includes multiple pages of application icons for launching distinct applications, claim 11; in accordance with a determination that the first input includes movement of the first contact starting from the first edge of the touch-sensitive display and moving away from the first edge, displaying a multitasking user interface that includes a plurality of concurrently displayed representations of open applications) Claim 5 depends on claim 1: Claim 11 of Patent No. 
10481769/Chaudhri teaches including: in response to detecting the swipe gesture: in accordance with the determination that the swipe gesture meets the first criteria, while displaying the reduced scale representation of the first user interface and the reduced scale representation of the second user interface, displaying a reduced scale representation of a third user interface not displayed immediately prior to detecting the swipe gesture. (claim 11; displaying, on the touch-sensitive display, a first page of a multi-page application launch interface that includes multiple pages of application icons for launching distinct applications, claim 11; in accordance with a determination that the first input includes movement of the first contact starting from the first edge of the touch-sensitive display and moving away from the first edge, displaying a multitasking user interface that includes a plurality of concurrently displayed representations of open applications) (e.g., displaying at least three reduced scale representations of applications as shown in Chaudhri’s Figure 5FF; see Chaudhri par. 240; FIG. 5EE also illustrates that swipe gesture 529 can be detected on touch screen at a location that corresponds to mail application view 5008-10, and in FIG. 5FF, in response to detecting swipe gesture 529, application views (e.g., 5008-10 and 5008-12) are scrolled, and a portion of map application view 5008-6 is displayed.) Claim 6 depends on claim 1: Claim 11 of Patent No. 10481769/Chaudhri teaches wherein the first user interface is a home screen having a plurality of application icons. (e.g., first user interface is a home screen having a plurality of application icons in the background as shown Chaudhri’s Figure 5 EE) Claim 7 depends on claim 1: Claim 11 of Patent No. 10481769 teaches wherein the first user interface is a multi-page application launch interface that includes multiple pages of application icons for launching distinct applications. 
(claim 11; displaying, on the touch-sensitive display, a first page of a multi-page application launch interface that includes multiple pages of application icons for launching distinct applications, claim 11; in accordance with a determination that the first input includes movement of the first contact starting from the first edge of the touch-sensitive display and moving away from the first edge, displaying a multitasking user interface that includes a plurality of concurrently displayed representations of open applications) Claim 8 depends on claim 7: Claim 11 of Patent No. 10481769 teaches including: while displaying the first user interface, detecting a second swipe gesture; in response to detecting the second swipe gesture: in accordance with a determination that the second swipe gesture meets second criteria, wherein the second criteria include a criterion that is met when the second swipe gesture starts away from the first edge of the touch-sensitive display, navigating within the first user interface, including replacing display of a first page of the multi-page application launch interface, which includes a first plurality of application icons, with display of a second page of the multi-page application launch interface that includes a second plurality of application icons. 
(claim 11; and in response to detecting the first input on the touch-sensitive display, determining a response from at least three possible responses to the first input based on evaluating the first input against a plurality of criteria, including: in accordance with a determination that the first input includes movement of the first contact in a first direction starting from a first region of the touch-sensitive display that is away from a first edge of the touch-sensitive display, replacing display of the first page of the multi-page application launch interface with display of a second page of the multi-page application launch interface that includes a second plurality of application icons that are different from the first plurality of application icons;) Claim 9 depends on claim 1: Claim 11 of Patent No. 10481769 teaches including: while displaying the first user interface, detecting a second swipe gesture; in response to detecting the second swipe gesture: in accordance with a determination that the second swipe gesture meets search-interface display criteria, wherein the search-interface display criteria include a criterion that is met when the second swipe gesture starts over a region of the touch-sensitive display that does not correspond to the first edge, replacing display of at least a portion of the first user interface with display of a search interface that includes a search input field for inputting search terms. 
(claim 11; in accordance with a determination that the first input includes movement of the first contact in a second direction that is perpendicular to the first direction starting from the first region of the touch-sensitive display that is away from the first edge of the touch-sensitive display, replacing display of at least a portion of the first page of the multi-page application launch interface with display of a search interface that includes a search input field for inputting search terms;)

Claim 15: System claim 15 is substantially encompassed in method claim 1; therefore, claim 15 is rejected using the same rationale set forth in claim 1.

Claim 16: Manufacture claim 16 is substantially encompassed in method claim 1; therefore, claim 16 is rejected using the same rationale set forth in claim 1.

Claims 1, 9, and 10 are rejected on the ground of nonstatutory double patenting as being unpatentable over claims 12 and 17 of U.S. Patent No. 10481769 in view of Chaudhri; Imran, U.S. Published Application No. 20110252357 A1.

Claim 10 depends on claim 9: System claim 12 of Patent No. 10481769/Chaudhri teaches method claims 1 and 9. (see above; using the same rationale set forth in method claims 1 and 9, respectively, with method claim 11 of Patent No. 10481769/Chaudhri) System Claim 17 of Patent No.
10481769 teaches wherein: the first user interface is a multi-page application launch interface that includes multiple pages of application icons; (claim 12; the one or more programs including instructions for: displaying, on the touch-sensitive display, a first page of a multi-page application launch interface that includes multiple pages of application icons for launching distinct applications,) the first criteria include a criterion that is met when the swipe gesture moves in a first predetermined direction on the touch-sensitive display; and (claim 12; while displaying the first page of the multi-page application launch interface on the touch-sensitive display, detecting, on the touch-sensitive display, a first input that includes detecting a first contact and detecting movement of the first contact on the touch-sensitive display; and in response to detecting the first input on the touch-sensitive display, determining a response from at least three possible responses to the first input based on evaluating the first input against a plurality of criteria) the search-interface display criteria include a criterion that is met when the second swipe gesture moves in a second predetermined direction on the touch-sensitive display, different from the first predetermined direction; and ( claim 12; in accordance with a determination that the first input includes movement of the first contact in a second direction that is perpendicular to the first direction starting from the first region of the touch-sensitive display that is away from the first edge of the touch-sensitive display, replacing display of at least a portion of the first page of the multi-page application launch interface with display of a search interface that includes a search input field for inputting search terms) displaying the search interface includes: translating a set of application icons from a first page of the multi-page application launch interface away from a respective edge of the 
touch-sensitive display in the second predetermined direction; and displaying the search input field in between the respective edge of the touch-sensitive display and the set of application icons. (claim 17; wherein displaying the search interface includes: translating a set of application icons from the first page away from a respective edge of the display in the second direction; and displaying the search input field in between the respective edge of the display and the application icons.) Claims 1 and 14 are rejected on the ground of nonstatutory double patenting as being unpatentable over claims 12-14 of U.S. Patent No. 10481769 in view of Chaudhri; Imran, U.S. Published Application No. 20110252357 A1. Claim 14 depends on claim 1: System Claim 12 of Patent No. 10481769/Chaudhri teaches method claim 1. (see above; using the same rationale set forth in method claim 1 with method claim 11 of Patent No. 10481769/Chaudhri) System Claim 14 of Patent No. 10481769 teaches in response to detecting the swipe gesture: in accordance with a determination that the swipe gesture meets settings-interface display criteria, wherein the settings- interface display criteria include a criterion that is met when the swipe gesture starts adjacent to a second edge of the touch-sensitive display that is different from the first edge of the touch-sensitive display, displaying a settings interface that includes controls for changing a plurality of device settings. (claim 14; wherein the one or more programs further include instructions for: in response to detecting the first input: in accordance with a determination that the first input includes movement of the first contact starting from a third edge of the touch-sensitive display that is different from the first edge of the touch-sensitive display and the second edge of the touch-sensitive display, displaying a settings interface that includes controls for changing a plurality of device settings.) 
Claims 2 and 3 are rejected on the ground of nonstatutory double patenting as being unpatentable over claim 11 of U.S. Patent No. 10481769 in view of Chaudhri; Imran, U.S. Published Application No. 20110252357 A1 in further view of Lazaridis et al. (hereinafter “Lazaridis”), U.S. Published 20120235930 A1. Claim 2 depends on claim 1: Claim 11 of Patent No. 10481769/Chaudhri fail to expressly teach wherein displaying the reduced scale representation of the first user interface and concurrently displaying the reduced scale representation of the second user interface includes reducing a size of the reduced scale representation of the first user interface as the swipe gesture moves away from the first edge. However, Lazaridis teaches wherein displaying the reduced scale representation of the first user interface and concurrently displaying the reduced scale representation of the second user interface includes reducing a size of the reduced scale representation of the first user interface as the swipe gesture moves away from the first edge. (e.g., in response to swipe gesture from edge of touch screen, moving first and second interface at a reduced size across the screen as shown in Lazaridis’s Figure 15 Lazaridis; par. 25; For example, FIG. 4 and FIG. 5 illustrate that the first application information is reduced in size more as the path 402 of the gesture extends further into the display area 202. par. 44; As shown in FIG. 15, the second application information 1502 shifts or scrolls onto the display 118 beginning at the right side or edge of the display 118 when the gesture is detected, which may include a slight delay. In this example, the second application information 1502 scrolls or shifts onto the display 118 from the same edge or side associated with the gesture. Par. 
56; The information associated with the second application may shift onto the touch-sensitive display from a first edge of the touch-sensitive display while the information associated with the first application shifts off a second edge of the touch-sensitive display, wherein the second edge is opposite the first edge. Icons or information other than the first application information may also scroll or shift onto or off of the display as the first application information or the second application information scrolls onto or off of the display, changes size, and so forth.) It would have been obvious to one of ordinary skill in the art before effective filing date of the claimed invention to modify the gestures for concurrently displaying reduced scale representations of open applications as taught by Claim 11 of Patent No. 10481769/Chaudhri to be based on edge gestures as taught by Lazaridis with a reasonable expectation of success, to provide the benefit of an intuitive continuous way of previewing applications (see Lazaridis; par. 29, par. 50) Claim 3 depends on claim 1: Claim 11 of Patent No. 10481769/Chaudhri fail to expressly teach wherein displaying the reduced scale representation of the first user interface and concurrently displaying the reduced scale representation of the second user interface includes, as the swipe gesture moves away from the first edge, moving the reduced scale representation of the first user interface and the reduced scale representation of the second user interface. However, Lazaridis teaches wherein displaying the reduced scale representation of the first user interface and concurrently displaying the reduced scale representation of the second user interface includes, as the swipe gesture moves away from the first edge, moving the reduced scale representation of the first user interface and the reduced scale representation of the second user interface. 
(e.g., in response to swipe gesture from edge of touch screen, moving first and second interface at a reduced size across the screen as shown in Lazaridis’s Figure 15 Lazaridis; par. 25; For example, FIG. 4 and FIG. 5 illustrate that the first application information is reduced in size more as the path 402 of the gesture extends further into the display area 202. par. 44; As shown in FIG. 15, the second application information 1502 shifts or scrolls onto the display 118 beginning at the right side or edge of the display 118 when the gesture is detected, which may include a slight delay. In this example, the second application information 1502 scrolls or shifts onto the display 118 from the same edge or side associated with the gesture. Par. 56; The information associated with the second application may shift onto the touch-sensitive display from a first edge of the touch-sensitive display while the information associated with the first application shifts off a second edge of the touch-sensitive display, wherein the second edge is opposite the first edge. Icons or information other than the first application information may also scroll or shift onto or off of the display as the first application information or the second application information scrolls onto or off of the display, changes size, and so forth.) It would have been obvious to one of ordinary skill in the art before effective filing date of the claimed invention to modify the gestures for concurrently displaying reduced scale representations of open applications as taught by Claim 11 of Patent No. 10481769/Chaudhri to be based on edge gestures as taught by Lazaridis with a reasonable expectation of success, to provide the benefit of an intuitive continuous way of previewing applications (see Lazaridis; par. 29, par. 50) Claim Rejections - 35 USC § 101 35 U.S.C. 
101 reads as follows:

Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claim 16 is rejected under 35 U.S.C. 101 because the claimed invention is directed to non-statutory subject matter. Claim 16 recites the phrase "computer readable media", which is not explicitly defined in the specification. Examiner submits that one of ordinary skill in the art would interpret the phrase to include a carrier wave, which is considered non-statutory subject matter. Therefore, in such instance, the recited phrase is merely a signal and is not a process, a machine, a manufacture, or a composition of matter. Accordingly, the claim fails to recite statutory subject matter as defined in 35 U.S.C. § 101. To overcome the 35 U.S.C. 101 rejection, Examiner suggests amending the claim to recite "non-transitory computer readable media".

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:

1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3.
Resolving the level of ordinary skill in the pertinent art. 4. Considering objective evidence present in the application indicating obviousness or nonobviousness. Claims 1-8 and 15, 16 are rejected under 35 U.S.C. 103 as being unpatentable over Chaudhri; Imran, U.S. Published Application No. 20110252357 A1 in view of Lazaridis et al. (hereinafter “Lazaridis”), U.S. Published 20120235930 A1. Claim 1: Chaudhri teaches A method, comprising: at an electronic device with a touch-sensitive display: displaying on the touch-sensitive display a first user interface; (e.g., examples of touch screen interfaces are shown in Figures 5DD or 5EE; par. 239; In FIG. 5DD, in response to detecting gesture 525 at the location that corresponds to link 5020-1, the corresponding web page is displayed in web browser application view 5004-5. FIG. 5DD also illustrates that input 527 (e.g., a single or double click on home button 204) is detected. Par. 240; FIG. 5EE illustrates that, in response to detecting input 527, a portion of home screen 5001, web browser application view 5008-12, and a portion of mail application view 5008-10 are displayed.) while displaying the first user interface, detecting a swipe gesture; (e.g., scrolling user interface based on swipe gesture par. 240; FIG. 5EE also illustrates that swipe gesture 529 can be detected on touch screen at a location that corresponds to mail application view 5008-10, and in FIG. 5FF, in response to detecting swipe gesture 529, application views (e.g., 5008-10 and 5008-12) are scrolled, and a portion of map application view 5008-6 is displayed.) displaying a reduced scale representation of the first user interface and concurrently displaying a reduced scale representation of a second user interface (e.g., browser application view 5008-12 and mail application view 5008-10 are concurrently displayed at a reduced scale representations Figure 5EE; par. 50; FIG. 
5EE illustrates that, in response to detecting input 527, a portion of home screen 5001, web browser application view 5008-12, and a portion of mail application view 5008-10 are displayed.) while concurrently displaying the reduced scale representation of the first user interface and the reduced scale representation of the second user interface: detecting a tap gesture at a location on the touch-sensitive display corresponding to the second user interface; (e.g., while browser application view 5008-12 and mail application view 5008-10 are concurrently displayed at a reduced scale representations, detecting a tap gesture 535 on the mail application view 5008-10; par. 240; FIG. 5EE illustrates that, in response to detecting input 527, a portion of home screen 5001, web browser application view 5008-12, and a portion of mail application view 5008-10 are displayed. FIG. 5EE also illustrates that tap gesture 535 can be detected at a location that corresponds to image 5008-10 of mail application.) and in response to detecting the tap gesture at a location on the touch-sensitive display corresponding to the second user interface, ceasing to display the first user interface and displaying a larger scale representation of the second user interface than the reduced scale representation of the second user interface. (e.g., in response to tap gesture 535, displaying mail application view without any other application view (i.e., displaying a larger scale of mail application view just like map application view of Fig. 5GG) par. 240; In response to detecting tap gesture 535, mail application view 5004-4 (as shown in FIG. 5CC) will be displayed without concurrently displaying any other application view. Par. 241; In FIG. 5FF, when gesture 531 (e.g., a tap gesture) is detected at a location that corresponds to map application view 5008-6, in response, a map application view is displayed, as shown in FIG. 5GG. 
) Chaudhri fails to expressly teach in response to detecting the swipe gesture: in accordance with a determination that the swipe gesture meets first criteria, wherein the first criteria include a criterion that is met when the swipe gesture starts at a first edge of the touch-sensitive display and moves away from the first edge, displaying a reduced scale representation of the first user interface and concurrently displaying a reduced scale representation of a second user interface not displayed immediately prior to detecting the swipe gesture; However, Lazaridis teaches in response to detecting the swipe gesture: in accordance with a determination that the swipe gesture meets first criteria, wherein the first criteria include a criterion that is met when the swipe gesture starts at a first edge of the touch-sensitive display and moves away from the first edge, displaying a reduced scale representation of the first user interface and concurrently displaying a reduced scale representation of a second user interface not displayed immediately prior to detecting the swipe gesture; (e.g., in response to swipe gesture from edge of touch screen, moving first and second interface at a reduced size across the screen as shown in Figure 15 par. 25; For example, FIG. 4 and FIG. 5 illustrate that the first application information is reduced in size more as the path 402 of the gesture extends further into the display area 202. par. 44; As shown in FIG. 15, the second application information 1502 shifts or scrolls onto the display 118 beginning at the right side or edge of the display 118 when the gesture is detected, which may include a slight delay. In this example, the second application information 1502 scrolls or shifts onto the display 118 from the same edge or side associated with the gesture. Par. 
56; The information associated with the second application may shift onto the touch-sensitive display from a first edge of the touch-sensitive display while the information associated with the first application shifts off a second edge of the touch-sensitive display, wherein the second edge is opposite the first edge. Icons or information other than the first application information may also scroll or shift onto or off of the display as the first application information or the second application information scrolls onto or off of the display, changes size, and so forth.) It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to replace the gesture for displaying reduced scale applications as taught by Chaudhri with the edge gestures as taught by Lazaridis with a reasonable expectation of success, to provide the benefit of an intuitive continuous way of previewing applications (see Lazaridis; par. 29, par. 50). Claim 2 depends on claim 1: Chaudhri/Lazaridis teaches wherein displaying the reduced scale representation of the first user interface and concurrently displaying the reduced scale representation of the second user interface includes reducing a size of the reduced scale representation of the first user interface as the swipe gesture moves away from the first edge. (e.g., in response to swipe gesture from edge of touch screen, moving first and second interface at a reduced size across the screen as shown in Lazaridis’s Figure 15; Lazaridis; par. 25; For example, FIG. 4 and FIG. 5 illustrate that the first application information is reduced in size more as the path 402 of the gesture extends further into the display area 202. par. 44; As shown in FIG. 15, the second application information 1502 shifts or scrolls onto the display 118 beginning at the right side or edge of the display 118 when the gesture is detected, which may include a slight delay. 
In this example, the second application information 1502 scrolls or shifts onto the display 118 from the same edge or side associated with the gesture. Par. 56; The information associated with the second application may shift onto the touch-sensitive display from a first edge of the touch-sensitive display while the information associated with the first application shifts off a second edge of the touch-sensitive display, wherein the second edge is opposite the first edge. Icons or information other than the first application information may also scroll or shift onto or off of the display as the first application information or the second application information scrolls onto or off of the display, changes size, and so forth.) Claim 3 depends on claim 1: Chaudhri/Lazaridis teaches wherein displaying the reduced scale representation of the first user interface and concurrently displaying the reduced scale representation of the second user interface includes, as the swipe gesture moves away from the first edge, moving the reduced scale representation of the first user interface and the reduced scale representation of the second user interface. (e.g., in response to swipe gesture from edge of touch screen, moving first and second interface at a reduced size across the screen as shown in Lazaridis’s Figure 15 Lazaridis; par. 25; For example, FIG. 4 and FIG. 5 illustrate that the first application information is reduced in size more as the path 402 of the gesture extends further into the display area 202. par. 44; As shown in FIG. 15, the second application information 1502 shifts or scrolls onto the display 118 beginning at the right side or edge of the display 118 when the gesture is detected, which may include a slight delay. In this example, the second application information 1502 scrolls or shifts onto the display 118 from the same edge or side associated with the gesture. Par. 
56; The information associated with the second application may shift onto the touch-sensitive display from a first edge of the touch-sensitive display while the information associated with the first application shifts off a second edge of the touch-sensitive display, wherein the second edge is opposite the first edge. Icons or information other than the first application information may also scroll or shift onto or off of the display as the first application information or the second application information scrolls onto or off of the display, changes size, and so forth.) Claim 4 depends on claim 1: Chaudhri/Lazaridis teaches wherein the second user interface is a user interface for an open application. (e.g., a second user interface such as open mail application view 5008-10 is concurrently displayed as a reduced scale representation in Chaudhri’s Figure 5EE; Chaudhri; par. 50; FIG. 5EE illustrates that, in response to detecting input 527, a portion of home screen 5001, web browser application view 5008-12, and a portion of mail application view 5008-10 are displayed.) Claim 5 depends on claim 1: Chaudhri/Lazaridis teaches including: in response to detecting the swipe gesture: in accordance with the determination that the swipe gesture meets the first criteria, while displaying the reduced scale representation of the first user interface and the reduced scale representation of the second user interface, displaying a reduced scale representation of a third user interface not displayed immediately prior to detecting the swipe gesture. (e.g., displaying at least three reduced scale representations of applications as shown in Chaudhri’s Figure 5FF; see Chaudhri par. 240; FIG. 5EE also illustrates that swipe gesture 529 can be detected on touch screen at a location that corresponds to mail application view 5008-10, and in FIG. 
5FF, in response to detecting swipe gesture 529, application views (e.g., 5008-10 and 5008-12) are scrolled, and a portion of map application view 5008-6 is displayed.) (e.g., displaying at least three reduced scale representations of applications in response to swipe gestures; Lazaridis par. 53; Optionally, different gestures or gestures associated with different edges or sides or corners may preview multiple different applications. For example, a gesture associated with the right edge previews a messaging inbox, a gesture associated with the left edge previews a calendar, a gesture associated with the bottom edge previews an address book, and a gesture associated with the top edge previews a user-selected application. The user may be provided with the option to assign the application with the desired edge or side or corner. Par. 56; The information associated with the second application may shift onto the touch-sensitive display from a first edge of the touch-sensitive display while the information associated with the first application shifts off a second edge of the touch-sensitive display, wherein the second edge is opposite the first edge. Icons or information other than the first application information may also scroll or shift onto or off of the display as the first application information or the second application information scrolls onto or off of the display, changes size, and so forth.)) Claim 6 depends on claim 1: Chaudhri/Lazaridis teaches wherein the first user interface is a home screen having a plurality of application icons. (e.g., the first user interface is a home screen having a plurality of application icons in the background, as shown in Chaudhri’s Figure 5EE) Claim 7 depends on claim 1: Chaudhri/Lazaridis teaches wherein the first user interface is a multi-page application launch interface that includes multiple pages of application icons for launching distinct applications. (See Chaudhri FIG. 
5A, 5P, 5Q and 5Z, an exemplary user interface ("home screen" 5001) displaying a plurality of application icons 5002 (e.g., 5002-21 through 5002-38) on touch screen 112 of a portable electronic device (e.g., portable multifunction device 100). [0212], application icons 5002 can be the same as application icons illustrated in FIGS. 5A-5K (e.g., map application icons 5002-6 and 5002-27 can be identical). In other embodiments, application icons 5002 displayed in a grid can be different from application icons displayed elsewhere (e.g., within an application icon area 5006 or on home screen 5001). [0228] Typically, the images of open applications are user selectable, and the images of open applications, when selected (e.g., by a gesture), initiate certain processes associated with them (e.g., displaying a corresponding application view [0278]) Claim 8 depends on claim 7: Chaudhri/Lazaridis teaches including: while displaying the first user interface, detecting a second swipe gesture; in response to detecting the second swipe gesture: in accordance with a determination that the second swipe gesture meets second criteria, wherein the second criteria include a criterion that is met when the second swipe gesture starts away from the first edge of the touch-sensitive display, navigating within the first user interface, including replacing display of a first page of the multi-page application launch interface, which includes a first plurality of application icons, with display of a second page of the multi-page application launch interface that includes a second plurality of application icons. (e.g., gesture 519 (e.g., a second swipe gesture) is detected on touch screen 112. Chaudhri’s FIG. 5Q illustrates that, in response to detecting gesture 519, open application icons 5002 displayed on touch screen 112 in FIG. 
5P are scrolled off the display, and a different set of open application icons 5002 (e.g., 5002-10 through 5002-18) are displayed on touch screen 112 [0229] scrolling open application icons arranged in a grid. In FIG. 5P, open application icons 5002 (e.g., 5002-1 through 5002-9) are displayed in a three-by-three grid. In some embodiments, application icons 5002 can be the same as application icons illustrated in FIGS. 5A-5K (e.g., map application icons 5002-6 and 5002-27 can be identical). [0228]) Claim 15: Claim 15 is substantially encompassed in claim 1; therefore, claim 15 is rejected using the same rationale set forth in claim 1. Claim 16: Claim 16 is substantially encompassed in claim 1; therefore, claim 16 is rejected using the same rationale set forth in claim 1. Claims 9 and 14 are rejected under 35 U.S.C. 103 as being unpatentable over Chaudhri/Lazaridis as cited above, in further view of Shuttleworth et al. (hereinafter “Shuttleworth”), U.S. Published Application No. 20140189578 A1, which claims priority to provisional 61/788,842 dated 3/15/2013. Claim 9 depends on claim 1: Chaudhri/Lazaridis fails to expressly teach including: while displaying the first user interface, detecting a second swipe gesture; in response to detecting the second swipe gesture: in accordance with a determination that the second swipe gesture meets search-interface display criteria, wherein the search-interface display criteria include a criterion that is met when the second swipe gesture starts over a region of the touch-sensitive display that does not correspond to the first edge, replacing display of at least a portion of the first user interface with display of a search interface that includes a search input field for inputting search terms. 
However, Shuttleworth teaches further including: while displaying the first user interface, detecting a second swipe gesture; in response to detecting the second swipe gesture: in accordance with a determination that the second swipe gesture meets search-interface display criteria, wherein the search-interface display criteria include a criterion that is met when the second swipe gesture starts over a region of the touch-sensitive display that does not correspond to the first edge, (e.g., The top edge provides access to system services, settings and searches, through a ranged gesture [0041] {Provisional pg. 4, Section B.3}, the left part of the top edge is dedicated to search. So initiating the gesture on the top left of the screen provides access to a range of system search options [0048] {Provisional pg. 5, fifth paragraph}, e.g. See FIG.s 9A and 10 {Provisional drawings 15 and 16}; an example, in which in a home screen, swiping down from the top left corner brings down the Search. [0070] {Provisional drawing 15}, a swipe down through the left part of the top edge always invokes the search experience. It is a ranged gesture that enables the selection of a particular search scope (by default, the Home scope is used, which searches everything). [0238] {Provisional pg. 21, third paragraph}, the search bar that enables a general search across multiple data sources is reached by a short swipe from the top left edge [0254] {Provisional pg. 23, Other optional features}, in a home screen, swiping down from the top left corner brings down the Search [0360] {Provisional drawing 15}) replacing display of at least a portion of the first user interface with display of a search interface that includes a search input field for inputting search terms. (See FIG.s 9 and 10 {Provisional drawings 15 and 16}, portion of Home Screen with Frequent Apps in FIG. 10 is replaced as shown in FIG. 
9 with search bar for inputting search terms {Provisional drawings 15 and 16}; The user types in what to search for and Ubuntu takes care of the rest. [0239] {Provisional pg. 21, fourth paragraph}) It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the interactions with an interactive interface as taught by Chaudhri/Lazaridis to include interactions for searching as taught by Shuttleworth with a reasonable expectation of success, to provide the benefit of intuitive faster interactions when performing search tasks with the GUI. (see Shuttleworth; Provisional page 1) Claim 14 depends on claim 1: Chaudhri teaches: displaying a settings interface that includes controls for changing a plurality of device settings [adjacent to a second edge of the touch-sensitive display that is different from the first edge of the touch-sensitive display]. (see FIG. 5M illustrates an exemplary user interface including settings icons in the predefined area. In some embodiments, the settings icons are displayed in response to detecting left-to-right swipe gesture 537. In FIG. 5M, settings icons (e.g., rotate lock icon 5102-1, Wi-Fi icon 5102-2, and Bluetooth icon 5102-3) are displayed in application icon area 5006-1. Each settings icon, when activated (e.g., by a finger gesture), changes a corresponding setting (e.g., Wi-Fi icon 5102-2, when activated, turns on or off a Wi-Fi connection). In some embodiments, one or more application icons can be concurrently displayed with settings icons in application icon area 5006 (not shown). 
[0224]) Chaudhri/Lazaridis fails to expressly teach in response to detecting the swipe gesture: in accordance with a determination that the swipe gesture meets settings-interface display criteria, wherein the settings-interface display criteria include a criterion that is met when the swipe gesture starts adjacent to a second edge of the touch-sensitive display that is different from the first edge of the touch-sensitive display, displaying a settings interface that includes controls for changing a plurality of device settings. However, Shuttleworth teaches: while displaying the first user interface, detecting a second swipe gesture; in response to detecting the second swipe gesture: in accordance with a determination that the swipe gesture meets settings-interface display criteria, wherein the settings-interface display criteria include a criterion that is met when the swipe gesture starts adjacent to a second edge of the touch-sensitive display that is different from the first edge of the touch-sensitive display, displaying a settings interface that includes controls for changing a plurality of device settings. (e.g., The top edge hosts a set of status indicators, and provides a means to access system-wide settings and features (typically network, battery, clock and calendar, sound etc.) as well as system or system-wide capabilities (such as messaging and search). [0023] {Provisional pg. 2, section A.2}, The top edge provides access to system services, settings and searches, through a ranged gesture [0041] {Provisional pg. 4, Section B.3}, it is very easy to reach settings through the top edge. They are on the right of the top edge, so an initial swipe through the right part of the top edge reveals system indicators if they were off-screen (for full-screen applications) and selects the closest [0044] {Provisional pg. 
5, third paragraph}, at the top right of the screen are the system status and function icons, such as time and date, volume, network, messaging, battery [0104] {Provisional pg. 8, fifth paragraph}, Swipe from top edge to bring up system status parameters for the system icons displayed on the right side of the top edge [0113] {Provisional pg. 9, Section A.2}, the user can select a system status icon placed close to the top edge and swipe down from it (e.g. touch it directly and then pull-down) to reveal a screen that lists the setting(s) relevant to that icon, allowing the user to rapidly update the setting(s) [0135] {Provisional pg. 11, fourth paragraph} A short swipe from the top right edge causes a pane relating to one system information function to be displayed, and continuing that swipe down expands the pane to include user-selectable parameters for that function. [0159] {Provisional pg. 13, section B.2} See FIG. 34 {Provisional drawing 49}, example of available views from swiping down (and then sideways, as mentioned above) from the top right edge of the screen where the indicators are displayed [0388] {Provisional Drawings 48-49}) It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the interactions with an interactive interface as taught by Chaudhri/Lazaridis to include interactions for settings as taught by Shuttleworth with a reasonable expectation of success, to provide the benefit of intuitive faster interactions when performing settings related tasks with the GUI. (see Shuttleworth; Provisional page 1) Claims 10-13 are rejected under 35 U.S.C. 103 as being unpatentable over Chaudhri/Lazaridis/Shuttleworth as cited above and applied to claim 9, in further view of Chaudhri et al. (hereinafter “Chaudhri ‘533”), U.S. Published Application No. 20100231533 A1. 
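For readers mapping the competing gesture criteria, the dispatch logic recited across claims 1, 9, and 14 (a first-edge swipe invoking the reduced-scale app switcher, a swipe starting over a top region away from the first edge invoking search, and a second-edge swipe invoking settings) can be summarized in a brief sketch. This is an illustrative editorial sketch only, not part of the record; the edge assignments, the left/right region split, the 20-point edge zone, and all function names below are hypothetical.

```python
# Illustrative sketch only (not part of the record): dispatching a swipe
# gesture per the criteria recited in claims 1, 9, and 14. The edge
# assignments and the 20-point edge zone are hypothetical choices.

def classify_swipe(start_x, start_y, end_x, end_y, width, height, edge=20):
    """Return which interface a swipe would invoke under the claimed criteria."""
    # First criteria (claim 1): the swipe starts at the first edge (here,
    # the bottom edge) and moves away from it -> display reduced-scale
    # representations of the first and second user interfaces.
    if start_y >= height - edge and end_y < start_y:
        return "app-switcher"
    # Settings-interface display criteria (claim 14): the swipe starts
    # adjacent to a second, different edge (here, the top edge, right half).
    if start_y <= edge and start_x > width / 2:
        return "settings"
    # Search-interface display criteria (claim 9): the swipe starts over a
    # region not corresponding to the first edge (here, top edge, left half).
    if start_y <= edge and start_x <= width / 2:
        return "search"
    # Otherwise, navigate within the first user interface (cf. claim 8).
    return "navigate"
```

On a hypothetical 400 x 600 display, a swipe from (200, 590) up to (200, 300) would satisfy the first criteria, while a downward swipe starting at (100, 10) would satisfy the search-interface criteria.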
Claim 10 depends on claim 9: Chaudhri teaches: the multi-page application launch interface includes multiple pages of application icons; and (See FIG. 5A, 5P, 5Q, 5W and 5Z, an exemplary user interface ("home screen" 5001) displaying a plurality of application icons 5002 (e.g., 5002-21 through 5002-38) on touch screen 112 of a portable electronic device (e.g., portable multifunction device 100). [0212], application icons 5002 can be the same as application icons illustrated in FIGS. 5A-5K (e.g., map application icons 5002-6 and 5002-27 can be identical). In other embodiments, application icons 5002 displayed in a grid can be different from application icons displayed elsewhere (e.g., within an application icon area 5006 or on home screen 5001). [0228] Typically, the images of open applications are user selectable, and the images of open applications, when selected (e.g., by a gesture), initiate certain processes associated with them (e.g., displaying a corresponding application view [0278]) displaying the search interface includes:... [displaying] a set of application icons from a first page of the multi-page application launch interface away from a respective edge of the touch-sensitive display...; and displaying the search input field in between the respective edge of the touch-sensitive display and the set of application icons. (See FIG.s 5Q, 5W and 5X with search icon 5104 between top edge and icons 5002-10 - 5002-18; search icon 5104 and keyboard 5014 are concurrently displayed on touch screen 112 with at least a subset of open application icons 5002 (e.g., application icons 5002 in FIG. 5X) [0234], a search input user interface (e.g., the user interface in FIG. 
5W, including keyboard 5014), receives one or more search terms in the search input user interface, performs a search using the one or more search terms, and displays results of the search [0280]) Lazaridis teaches: the first criteria include a criterion that is met when the swipe gesture moves in a first predetermined direction on the touch-sensitive display; (e.g., in response to swipe gesture from edge of touch screen, moving first and second interface at a reduced size across the screen as shown in Figure 15 par. 25; For example, FIG. 4 and FIG. 5 illustrate that the first application information is reduced in size more as the path 402 of the gesture extends further into the display area 202. par. 44; As shown in FIG. 15, the second application information 1502 shifts or scrolls onto the display 118 beginning at the right side or edge of the display 118 when the gesture is detected, which may include a slight delay. In this example, the second application information 1502 scrolls or shifts onto the display 118 from the same edge or side associated with the gesture. Par. 56; The information associated with the second application may shift onto the touch-sensitive display from a first edge of the touch-sensitive display while the information associated with the first application shifts off a second edge of the touch-sensitive display, wherein the second edge is opposite the first edge. Icons or information other than the first application information may also scroll or shift onto or off of the display as the first application information or the second application information scrolls onto or off of the display, changes size, and so forth.) 
the search-interface display criteria include a criterion that is met when the second swipe gesture moves in a second predetermined direction on the touch-sensitive display, different from the first predetermined direction; (e.g., The top edge provides access to system services, settings and searches, through a ranged gesture [0041] {Provisional pg. 4, Section B.3}, the left part of the top edge is dedicated to search. So initiating the gesture on the top left of the screen provides access to a range of system search options [0048] {Provisional pg. 5, fifth paragraph}, e.g. See FIG.s 9A and 10 {Provisional drawings 15 and 16}; an example, in which in a home screen, swiping down from the top left corner brings down the Search. [0070] {Provisional drawing 15}, a swipe down through the left part of the top edge always invokes the search experience. It is a ranged gesture that enables the selection of a particular search scope (by default, the Home scope is used, which searches everything). [0238] {Provisional pg. 21, third paragraph}, the search bar that enables a general search across multiple data sources is reached by a short swipe from the top left edge [0254] {Provisional pg. 23, Other optional features}, in a home screen, swiping down from the top left corner brings down the Search [0360] {Provisional drawing 15}) Shuttleworth teaches: the multi-page application launch interface includes multiple pages of application icons; (e.g., Home Screen is divided into a number of sections the user reaches by flicking up or down on the screen; the sections are: Frequent Apps; Favourite People; People Recently in Touch; Recent Music; Videos Popular Online [0273] {Provisional pg. 24, second paragraph}, swiping right from the Home Screen takes you to the Apps page; a further swipe right takes you to Videos. A swipe left takes you to People (contacts) and a further swipe right takes you to Music [0277] {Provisional pg. 
24, sixth paragraph}, On the apps page, you find icons for all installed apps, plus apps available for download-so there's one consistent way to find apps on any Ubuntu device; specifically, there are sections for Running Apps; Frequently Used Apps; Installed Apps; Apps available for download [0278] {Provisional pg. 24-25, paragraph bridging the pages}, Home screen includes apps sections listing all installed apps and also apps available for download [0291] {Provisional 25 Section F.2} swiping right from the Home Screen takes you to the Apps page; a further swipe right takes you to Videos [0292] {Provisional pg. 25 Section F.2}) The combination of Chaudhri/Lazaridis/Shuttleworth fails to expressly teach: translating a set of application icons from a first page of the multi-page application launch interface away from a respective edge of the touch-sensitive display in the second predetermined direction; (emphasis added) However, Chaudhri ‘533 teaches: [a] multi-page application launch interface includes multiple pages of application icons; and (See FIG.s 5A and 5B, The device concurrently displays (602) a first plurality of application launch icons (e.g., 5002, 5004, 5006 and 5008 in FIG. 5A) in a first area (e.g., 5010 in FIG. 5A) of the touch screen display. In some embodiments, the device also displays a second plurality of application launch icons (e.g., 5009-1, 5009-2, 5009-3, and 5009-4 in FIG. 5A) in a second area (e.g., 5012 in FIG. 5A) on the touch screen display, wherein the second area is different from the first area. [0147], the set-sequence-indicia icons provide (606) information about the number of sets of application launch icons in the plurality of sets of application launch icons and a position of a displayed set of application launch icons in the sequence of sets of application launch icons. For example, in some embodiments, if there are two set sequence indicia icons, there are two sets of icons. 
In some embodiments, the device displays (607) a search indicia icon (e.g., 5016 in FIG. 5A) adjacent to a set-sequence-indicia icon that corresponds to the first set of application launch icons [0149], a finger gesture on the touch screen display in the first area (e.g., a finger swipe gesture such as a contact 5018 in FIG. 5A and a right-to-left finger swipe 5020 in FIG. 5A). In some embodiments, in response (609) to detecting the finger gesture on the touch screen display in the first area, the device replaces (610) display of the first set of application launch icons with display of a second set of application launch icons (e.g., 5022, 5024, 5026 and 5028 in FIG. 5B) in the first area on the touch screen display [0150]) displaying [a] search interface includes: translating a set of application icons from a first page of the multi-page application launch interface away from a respective edge of [a] touch-sensitive display in [a] second predetermined direction; and displaying the search input field in between the respective edge of the touch-sensitive display and the set of application icons. (the animation includes sliding the search input user interface (e.g., 5037 in FIG. 5G) onto the screen from the top of the screen while sliding the plurality of application launch icons towards a bottom of the screen after detecting a user gesture (e.g., contact 5041 and downward swipe 5043 in FIG. 5G). [0153] also In some embodiments, the animation comprises (618) sliding the search input user interface (e.g., 5037 in FIG. 5D) onto the touch screen display from a first side of the display while concurrently sliding the first plurality of application launch icons towards a second side of the display opposite the first side of the display (e.g., as shown in FIGS. 5D, 5E and 5F). [0153], e.g. 
FIG.s 5C and 5G, icons for "iTunes" and "App Store" cease to display while other icons are maintained, and FIG.s 5D and 5E, icons for "Messages", "YouTube", "Clock" and "iTunes" are maintained while other icons cease to be displayed; also FIG. 5F, the device ceases (614) to display the second set of application launch icons (e.g., the device displays a search input user interface, as illustrated in FIG. 5F [0152]) Given that Shuttleworth teaches that many other data visualization variants are possible (Shuttleworth [0171] {Provisional pg. 14, final paragraph}) and Chaudhri ‘357 also teaches embodiments were chosen and described in order to best explain the principles of the invention and its practical applications, to thereby enable others skilled in the art to best utilize the invention and various embodiments with various modifications as are suited to the particular use contemplated (Chaudhri ‘357 [0318]), it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the displaying the search interface of Chaudhri ‘357/Lazaridis/Shuttleworth, to include translating a set of application icons from a first page of the multi-page application launch interface away from a respective edge of the touch-sensitive display in the second predetermined direction; and displaying the search input field in between the respective edge of the touch-sensitive display and the set of application icons, as taught by Chaudhri ‘533. One would have been motivated to make such a modification to provide faster, more efficient methods and interfaces for search. Such methods and interfaces reduce the cognitive burden on a user and produce a more efficient human-machine interface (Chaudhri ‘533 [0005]). 
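The claim 10 layout step for which Chaudhri ‘533 is relied upon (translating the icons away from the respective edge in the second predetermined direction and interposing the search input field between that edge and the icons) can be pictured with a minimal sketch. Again, this is illustrative editorial shorthand only, not part of the record; the row coordinates, the fixed field height, and the function name are hypothetical.

```python
# Illustrative sketch only (not part of the record): translating a page of
# application icons away from the top edge and displaying the search input
# field between that edge and the icons, per the claim 10 limitation.
# The field height and row coordinates are hypothetical.

SEARCH_FIELD_HEIGHT = 60  # hypothetical height of the search input field

def show_search_interface(icon_rows):
    """Shift each (label, y) icon row downward by the field height (the
    'second predetermined direction') and place the search field at the
    top edge (y = 0), between the edge and the translated icons."""
    translated = [(label, y + SEARCH_FIELD_HEIGHT) for label, y in icon_rows]
    return [("search-field", 0)] + translated
```

For example, rows at y = 0 and y = 80 would be translated to y = 60 and y = 140, leaving the search field alone in the band adjacent to the top edge.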
Claim 11 depends on claim 10: In the interpretation in which “ceasing to display at least a subset of the application icons from the first page of the multi-page application launch interface” includes an overlay that covers the display of a subset of the application icons, Shuttleworth teaches: wherein displaying the search interface includes ceasing to display at least a subset of the application icons from the first page of the multi-page application launch interface. (See FIG.s 9 and 10 {Provisional drawings 15 and 16}, portion of Home Screen with Frequent Apps in FIG. 10 is replaced as shown in FIG. 9 with search bar for inputting search terms that covers icons on the Home page and other icons from the Home page remain displayed {Provisional drawings 15 and 16}; an example, in which in a home screen, swiping down from the top left corner brings down the Search. [0070] {Provisional drawing 15}, a swipe down through the left part of the top edge always invokes the search experience. It is a ranged gesture that enables the selection of a particular search scope (by default, the Home scope is used, which searches everything). [0238] {Provisional pg. 21, third paragraph}, the search bar that enables a general search across multiple data sources is reached by a short swipe from the top left edge [0254] {Provisional pg. 23, Other optional features}, in a home screen, swiping down from the top left corner brings down the Search [0360] {Provisional drawing 15}) In the alternative interpretation in which “ceasing to display at least a subset of the application icons from the first page of the multi-page application launch interface” includes removing icons from the display, Shuttleworth may not explicitly disclose: wherein displaying the search interface includes ceasing to display at least a subset of the application icons from the first page of the multi-page application launch interface. 
Chaudhri ‘533 teaches: wherein displaying the search interface includes ceasing to display at least a subset of the application icons from the first page of the multi-page application launch interface. (the animation includes sliding the search input user interface (e.g., 5037 in FIG. 5G) onto the screen from the top of the screen while sliding the plurality of application launch icons towards a bottom of the screen after detecting a user gesture (e.g., contact 5041 and downward swipe 5043 in FIG. 5G). [0153] also In some embodiments, the animation comprises (618) sliding the search input user interface (e.g., 5037 in FIG. 5D) onto the touch screen display from a first side of the display while concurrently sliding the first plurality of application launch icons towards a second side of the display opposite the first side of the display (e.g., as shown in FIGS. 5D, 5E and 5F). [0153], e.g. FIG.s 5C and 5G, icons for "iTunes" and "App Store" cease to display while other icons are maintained, and FIG.s 5D and 5E, icons for "Messages", "YouTube", "Clock" and "iTunes" are maintained while other icons cease to be displayed; also FIG. 5F, the device ceases (614) to display the second set of application launch icons (e.g., the device displays a search input user interface, as illustrated in FIG. 5F [0152]) Therefore, combining Chaudhri ‘357, Lazaridis, Shuttleworth, and Chaudhri ‘533 would meet the claim limitations for the same reasons as set forth in Claim 10. Claim 12 depends on claim 11: Chaudhri ‘357 teaches: maintaining display of one or more of the application icons from the first page of the multi-page application launch interface while displaying the search interface. (See FIG.s 5Q, 5W and 5X with search icon 5104 between top edge and icons 5002-10 - 5002-18; search icon 5104 and keyboard 5014 are concurrently displayed on touch screen 112 with at least a subset of open application icons 5002 (e.g., application icons 5002 in FIG. 
5X) [0234], a search input user interface (e.g., the user interface in FIG. 5W, including keyboard 5014), receives one or more search terms in the search input user interface, performs a search using the one or more search terms, and displays results of the search [0280]) Shuttleworth also teaches: in response to detecting the swipe gesture, maintaining display of one or more of the application icons from the first page from the multi-page application launch interface while displaying the search interface. (See FIG.s 9 and 10 {Provisional drawings 15 and 16}, portion of Home Screen with Frequent Apps in FIG. 10 is replaced as shown in FIG. 9 with search bar for inputting search terms and icons from the Home page remain displayed {Provisional drawings 15 and 16}; an example, in which in a home screen, swiping down from the top left corner brings down the Search. [0070] {Provisional drawing 15}, a swipe down through the left part of the top edge always invokes the search experience. It is a ranged gesture that enables the selection of a particular search scope (by default, the Home scope is used, which searches everything). [0238] {Provisional pg. 21, third paragraph}, the search bar that enables a general search across multiple data sources is reached by a short swipe from the top left edge [0254] {Provisional pg. 23, Other optional features}, in a home screen, swiping down from the top left corner brings down the Search [0360] {Provisional drawing 15}) Also, Chaudhri ‘533 teaches: in response to detecting the second swipe gesture, maintaining display of one or more of the application icons from the first page of the multi-page application launch interface while displaying the search interface. (the animation includes sliding the search input user interface (e.g., 5037 in FIG. 
5G) onto the screen from the top of the screen while sliding the plurality of application launch icons towards a bottom of the screen after detecting a user gesture (e.g., contact 5041 and downward swipe 5043 in FIG. 5G). [0153] also In some embodiments, the animation comprises (618) sliding the search input user interface (e.g., 5037 in FIG. 5D) onto the touch screen display from a first side of the display while concurrently sliding the first plurality of application launch icons towards a second side of the display opposite the first side of the display (e.g., as shown in FIGS. 5D, 5E and 5F). [0153], e.g. FIG.s 5C and 5G, icons for "iTunes" and "App Store" cease to display while other icons are maintained, and FIG.s 5D and 5E, icons for "Messages", "You Tube", "Clock" and "iTunes" are maintained while other icons cease to be displayed; also FIG. 5F, the device ceases (614) to display the second set of application launch icons (e.g., the device displays a search input user interface, as illustrated in FIG. 5F [0152]) Therefore, combining Chaudhri ‘357, Lazaridis, Shuttleworth and Chaudhri ‘533 would meet the claim limitations for the same reasons as set forth in Claim 10. Claim 13 depends on claim 10: Chaudhri ‘357, Lazaridis, Shuttleworth fail to expressly teach: the multi-page application launch interface includes a fixed icon region that includes a plurality of application icons, and the fixed icon region is concurrently displayed with each page of the multiple pages of application icons; and displaying the search interface in response to detecting the swipe gesture includes ceasing to display the fixed icon region. 
However, Chaudhri ‘533 teaches: the multi-page application launch interface includes a fixed icon region that includes a plurality of application icons, and the fixed icon region is concurrently displayed with each page of the multiple pages of application icons (See FIGS. 5A-5B, icons 5009-1 - 5009-4 in the fixed icon region displayed with each set of application icons; a second plurality of application launch icons (e.g., 5009-1, 5009-2, 5009-3, and 5009-4 in FIG. 5A) in a second area (e.g., 5012 in FIG. 5A) on the touch screen display, wherein the second area is different from the first area [0147]); and displaying the search interface in response to detecting the swipe gesture includes ceasing to display the fixed icon region (See FIG. 5F, the device ceases (614) to display the second set of application launch icons (e.g., the device displays a search input user interface, as illustrated in FIG. 5F) [0152]).

Given that Shuttleworth teaches that many other data visualization variants are possible (Shuttleworth [0171] {Provisional pg. 14, final paragraph}), and Chaudhri ‘357 also teaches that embodiments were chosen and described in order to best explain the principles of the invention and its practical applications, to thereby enable others skilled in the art to best utilize the invention and various embodiments with various modifications as are suited to the particular use contemplated (Chaudhri ‘357 [0318]), it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the display of the search interface and the multi-page application launch interface of Chaudhri ‘357, Lazaridis, and Shuttleworth so that the multi-page application launch interface includes a fixed icon region that includes a plurality of application icons, the fixed icon region is concurrently displayed with each page of the multiple pages of application icons, and displaying the search interface in response to detecting the swipe gesture includes ceasing to display the fixed icon region, as taught by Chaudhri ‘533. One would have been motivated to make such a modification to provide faster, more efficient methods and interfaces for search. Such methods and interfaces reduce the cognitive burden on a user and produce a more efficient human-machine interface (Chaudhri ‘533 [0005]).

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure: Bakker, Jan Hendrik, U.S. Published Application No. 20130038541.
Bakker teaches: in response to detecting [a] swipe gesture: in accordance with a determination that the swipe gesture meets settings-interface display criteria, wherein the settings-interface display criteria include a criterion that is met when the swipe gesture starts adjacent to a third edge of [a] touch-sensitive display that is different from [a] first edge of the touch-sensitive display and [a] second edge of the touch-sensitive display, displaying a settings interface that includes controls for changing a plurality of device settings. (See FIGS. 3-5, a home screen associated with a root-navigation application 510 to replace the information associated with the current application 502 [0047]; the touch 302 begins at the origin point outside the boundary 210 and outside the buffer region 212 — the path of the touch 302 crosses the buffer region 212 and the boundary 210 and is therefore identified as a meta-navigation gesture [0043]; the meta-navigation gesture 314, which in the present example originates near a bottom, left corner of the touch-sensitive display 118, causes information associated with a status application 514 to be tiled over the information associated with the current application 502, and similarly, in the present example the meta-navigation gesture 316, which originates near a bottom, right corner of the touch-sensitive display 118, causes information associated with the status application 514 to be tiled over the information associated with the current application 502 [0052])

Any inquiry concerning this communication or earlier communications from the examiner should be directed to HENRY ORR, whose telephone number is (571) 270-1308. The examiner can normally be reached 9AM-5PM EST, M-F.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Adam Queler, can be reached at (571) 272-4140. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/HENRY ORR/
Primary Examiner, Art Unit 2172

Prosecution Timeline

Sep 27, 2023
Application Filed
Nov 29, 2025
Non-Final Rejection — §101, §103, §DP
Mar 18, 2026
Applicant Interview (Telephonic)
Mar 18, 2026
Examiner Interview Summary

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12578851
SYSTEMS, METHODS, AND GRAPHICAL USER INTERFACES FOR GENERATING SHORT RUN CONTROL CHARTS
2y 5m to grant Granted Mar 17, 2026
Patent 12572268
ACCELERATED SCROLLING AND SELECTION
2y 5m to grant Granted Mar 10, 2026
Patent 12561589
SYSTEM AND METHOD FOR INDUSTRIAL AUTOMATION RULES ENGINE
2y 5m to grant Granted Feb 24, 2026
Patent 12547304
INFORMATION PROCESSING SYSTEM, METHOD, AND NON-TRANSITORY COMPUTER READABLE MEDIUM FOR DISPLAYING ENLARGED IMAGE CORRESPONDING TO A FILE IMAGE
2y 5m to grant Granted Feb 10, 2026
Patent 12530968
MAP-BASED EMERGENCY CALL MANAGEMENT AND DISPATCH
2y 5m to grant Granted Jan 20, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

1-2
Expected OA Rounds
50%
Grant Probability
88%
With Interview (+37.2%)
3y 10m
Median Time to Grant
Low
PTA Risk
Based on 456 resolved cases by this examiner. Grant probability derived from career allow rate.
