Prosecution Insights
Last updated: April 19, 2026
Application No. 18/734,854

USER INTERFACE FOR SEARCHING

Status: Final Rejection (§103)
Filed: Jun 05, 2024
Examiner: MAY, ROBERT F
Art Unit: 2154
Tech Center: 2100 — Computer Architecture & Software
Assignee: Apple Inc.
OA Round: 4 (Final)

Grant Probability: 76% (Favorable)
Predicted OA Rounds: 5-6
Predicted Time to Grant: 3y 3m
Grant Probability with Interview: 99%

Examiner Intelligence

Career Allow Rate: 76% — above average (216 granted / 286 resolved; +20.5% vs TC avg)
Interview Lift: +29.7% for resolved cases with interview (strong)
Avg Prosecution: 3y 3m (typical timeline)
Currently Pending: 41
Total Applications: 327 (career history, across all art units)

Statute-Specific Performance

§101: 19.3% (-20.7% vs TC avg)
§103: 45.6% (+5.6% vs TC avg)
§102: 18.0% (-22.0% vs TC avg)
§112: 12.9% (-27.1% vs TC avg)

Tech Center averages are estimates. Based on career data from 286 resolved cases.
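The headline allowance figures above are simple ratios of the raw counts. A minimal sketch of how such a dashboard might derive them — the 55.0% Tech Center average is an assumed value implied by the +20.5% delta, not a figure reported above:

```python
# Reproducing the headline allowance metrics from the raw counts shown above.
# Only `granted` (216) and `resolved` (286) come from the dashboard; the Tech
# Center average is a hypothetical input used here for illustration.

def allow_rate(granted: int, resolved: int) -> float:
    """Career allowance rate as a percentage of resolved cases."""
    return 100.0 * granted / resolved

rate = allow_rate(216, 286)        # examiner's career numbers: 75.5%
print(round(rate))                 # displayed as 76

tc_avg = 55.0                      # assumed Tech Center average
print(round(rate - tc_avg, 1))     # the "+20.5% vs TC avg" delta
```

The same ratio logic applies per statute (e.g., §103 outcomes over cases with a §103 rejection), though those per-statute counts are not reported on this page.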

Office Action

§103
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

DETAILED ACTION

The Action is responsive to the Amendments and Remarks filed on 11/14/2025. Claims 1-39 are pending. Claims 1, 14, and 15 are written in independent form.

Priority

Applicant’s claim for benefit of prior-filed provisional application 62/005,912 under 35 U.S.C. 119(e) or under 35 U.S.C. 120, 121, or 365(c) is acknowledged.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1-4, 9-11, 14-18, 23-25, 28-30, and 35-37 are rejected under 35 U.S.C. 103 as being unpatentable over Ray et al. (U.S. Pre-Grant Publication No. 2007/0204232, hereinafter referred to as Ray) and further in view of Jiang et al. (U.S. Pre-Grant Publication No. 2012/0284247, hereinafter referred to as Jiang).

Regarding Claim 1: Ray teaches an electronic device comprising:

A display;
Ray teaches “a display device interface for displaying information on a display screen” (Para. [0068]).

One or more input devices;
Ray teaches “man-machine device interfaces include those that communicate by wire or wirelessly to man-machine interface devices 912 (e.g., a keyboard or keypad, a mouse or other graphical pointing device, a remote control, etc.) to manipulate and interact with a UI” (Para. [0068]).

One or more processors;
Ray teaches “device 902 includes one or more input/output (I/O) interfaces 904, at least one processor 906, and one or more media 908…media 908 includes processor-executable instructions 910” (Para. [0067]).

A memory; and
Ray teaches “device 902 includes one or more input/output (I/O) interfaces 904, at least one processor 906, and one or more media 908…media 908 includes processor-executable instructions 910” (Para. [0067]).

One or more programs, wherein the one or more programs are stored in the memory and configured to be executed by the one or more processors, the one or more programs including instructions for:
Ray teaches “device 902 includes one or more input/output (I/O) interfaces 904, at least one processor 906, and one or more media 908…media 908 includes processor-executable instructions 910” (Para. [0067]).

Receiving, via the one or more input devices, a search input;
Ray teaches a search input area 210 for receiving search input (Para. [0026] & Fig. 2, Element 210).

Displaying, via the display, a user interface of a search application, wherein the user interface of the search application includes a first search result and a second search result based on the received search input;
Ray teaches “UI 200 includes a program window 202” displaying search results 222 in search result area 214 based on the received search input in search input area 210 (Paras. [0025]-[0026] & Fig. 2, Elements 210, 214, and 222). Ray further teaches “a display device interface for displaying information on a display screen” (Para. [0068]), thereby teaching displaying the search results of a search application program on a display device interface via the display screen.

Detecting an input corresponding to the first search result; and
Ray teaches “pane 302 enables selected search result(s) 222 to be previewed” (Para. [0040]), thereby teaching a request to preview a first search result that has been selected to be previewed.
In response to detecting the input corresponding to the first search result: In accordance with a determination that the input corresponding to the first search result is a request to preview the first search result:
Ray teaches “pane 302 enables selected search result(s) 222 to be previewed” (Para. [0040]), thereby teaching determining that an input is a request to preview a respective search result.

Ceasing display of the first search result and the second search result in the user interface of the search application; and
Ray teaches “pane 302 enables selected search result(s) 222 to be previewed” (Para. [0040]), thereby teaching a request to preview a respective search result that has been selected to be previewed. Ray further teaches “Pane 302 also includes an available actions area 308A…308A includes actions that are available to be initiated with respect to the previewed item 306 that corresponds to the selected search result” (Para. [0042]). Ray also teaches, upon selection of a search result 222, providing the preview of the selected search result “within a pop-up overlay 402 that presents available actions 308B” (Para. [0045] & Fig. 4), where “in UI 400, pop-up overlay 402 is displayed as touching or at least proximate to the selected search result 222(3), but another location within program window 202 may alternatively be chosen” (Para. [0047]), thereby teaching a display that ceases to display the first search result and second search result when a respective search result is selected and a preview for the selected search result is displayed in a determined location within program window 202.
Ray further teaches, in relation to the display of a preview, that “although less than all of the material of item 306 may be displayed, UI 300 displays as much of a full version of item 306 as is capable of being displayed in pane 302 as governed by space limitations,” where the preview pane 302 “may be capable of being sized and/or scrolled” and “sizing and/or scrolling the overall program window 202 may also change the amount of item 306 that is displayed within pane 302” (Para. [0041]), thereby providing a motivation to display a pane of a desired size containing the preview that can be located anywhere within the program window, with the desired goal of displaying as much information as is capable of being displayed as governed by space limitations.

Displaying, via the display, a preview of content of the first search result in the user interface of the search application, wherein the preview of content of the first search result includes one or more actionable user interface objects that, when activated, cause the electronic device to perform an operation associated with the first search result;
Ray teaches “pane 302 enables selected search result(s) 222 to be previewed” (Para. [0040]), thereby teaching a request to preview a respective search result that has been selected to be previewed and displaying a preview of the content. Ray further teaches “Pane 302 also includes an available actions area 308A…308A includes actions that are available to be initiated with respect to the previewed item 306 that corresponds to the selected search result” (Para. [0042]).

While displaying, via the display, the preview of content of the first search result in the user interface of the search application, detecting, via the one or more input devices, an input corresponding to a first actionable user interface object of the one or more actionable user interface objects;
Ray teaches “pane 302 enables selected search result(s) 222 to be previewed” (Para. [0040]), thereby teaching a request to preview a respective search result that has been selected to be previewed and displaying a preview of the content. Ray further teaches “Although less than all of the material of item 306 may be displayed, UI 300 displays as much of a full version of item 306 as is capable of being displayed in pane 302 as governed by space limitations. Pane 302 may be capable of being sized and/or scrolled. Sizing and/or scrolling the overall program window 202 may also change the amount of item 306 that is displayed within pane 302.” (Para. [0041]). Therefore, Ray teaches detecting an input corresponding to a first actionable user interface object for sizing and/or scrolling that changes the amount of item 306 that is displayed.

In response to detecting the input corresponding to the first actionable user interface object, displaying, via the display, content associated with the first search result in the user interface of the search application;
Ray teaches “Although less than all of the material of item 306 may be displayed, UI 300 displays as much of a full version of item 306 as is capable of being displayed in pane 302 as governed by space limitations. Pane 302 may be capable of being sized and/or scrolled. Sizing and/or scrolling the overall program window 202 may also change the amount of item 306 that is displayed within pane 302.” (Para. [0041]). Therefore, Ray teaches detecting an input corresponding to a first actionable user interface object for sizing and/or scrolling that changes the amount of item 306 (the content associated with the first search result) that is displayed.
Ray teaches all of the elements as recited above except:

Detecting, via the one or more input devices, an input corresponding to the content associated with the first search result displayed within the user interface of the search application;

In response to detecting the input corresponding to the content associated with the first search result displayed within the user interface of the search application: Ceasing display of the user interface of the search application; and Displaying, via the display, a result of a task performed using the content associated with the first search result in a user interface of a first software application corresponding to the first search result;

In accordance with a determination that the input corresponding to the first search result is a request to view the first search result: Ceasing display of the first search result and the second search result in the user interface of the search application; and Displaying, via the display, a user interface of a second software application corresponding to the first search result.

However, in the related field of endeavor of previewing search results, Jiang teaches:

Detecting, via the one or more input devices, an input corresponding to the content associated with the first search result displayed within the user interface of the search application;
Ray teaches displaying content associated with search results displayed in the user interface of the search application by teaching “The displayed item 306 may include text, images, video, some combination thereof, and so forth, depending on the selected category 216 as well as the content of the particular item 306.” (Para. [0040] & Fig. 3). Jiang teaches “receiving a user-initiated selection of a term or phrase within the content 705 (see block 1004).” (Para. [0102]).
Therefore, Ray in combination with Jiang teaches detecting an input corresponding to the content associated with the first search result while it is displayed within the user interface of the search application, the content including a term or phrase that can be selected by a user.

In response to detecting the input corresponding to the content associated with the first search result displayed within the user interface of the search application: Ceasing display of the user interface of the search application; and
Ray teaches displaying text content associated with search results displayed in the user interface of the search application by teaching “The displayed item 306 may include text, images, video, some combination thereof, and so forth, depending on the selected category 216 as well as the content of the particular item 306.” (Para. [0040] & Fig. 3). Jiang teaches “receiving a user-initiated selection of a term or phrase within the content 705 (see block 1004). As illustrated, the selection of the term or phrase (e.g., 'inception') may occur via one of various ways, such as a word-finder tool 740 or manually highlighting. In response to the selection, the application(s) 720 that are relevant to the term or phase are determined (see block 1006),” and “These relevant application(s) 720 may then be presented in a pop-up display window 710 that overlays at least a portion of the display area 700 (see block 1008).” (Para. [0102]), and “if the app is recognized as being listed in the device's inventory (i.e., previously installed on the device), the search engine may automatically launch the app. In this way, the user is saved the steps of manually locating and starting the app” (Para. [0010]). By automatically launching the app, Jiang teaches ceasing to display the user interface of the search application and instead displaying the automatically launched app in response to the user providing input corresponding to the content associated with the search result.

Therefore, Ray in combination with Jiang teaches detecting an input corresponding to the content associated with the first search result while it is displayed within the user interface of the search application, the content including a term or phrase that can be selected by a user, and, in response to the input/selection, automatically launching the application, thus ceasing display of the search application.

Displaying, via the display, a result of a task performed using the content associated with the first search result in a user interface of a first software application corresponding to the first search result;
Ray teaches displaying text content associated with search results displayed in the user interface of the search application by teaching “The displayed item 306 may include text, images, video, some combination thereof, and so forth, depending on the selected category 216 as well as the content of the particular item 306.” (Para. [0040] & Fig. 3). Jiang teaches “receiving a user-initiated selection of a term or phrase within the content 705 (see block 1004). As illustrated, the selection of the term or phrase (e.g., 'inception') may occur via one of various ways, such as a word-finder tool 740 or manually highlighting. In response to the selection, the application(s) 720 that are relevant to the term or phase are determined (see block 1006),” and “These relevant application(s) 720 may then be presented in a pop-up display window 710 that overlays at least a portion of the display area 700 (see block 1008).” (Para. [0102]), and “if the app is recognized as being listed in the device's inventory (i.e., previously installed on the device), the search engine may automatically launch the app. In this way, the user is saved the steps of manually locating and starting the app” (Para. [0010]).
By automatically launching the app, Jiang teaches ceasing to display the user interface of the search application and instead displaying the automatically launched app in response to the user providing input corresponding to the content associated with the search result. Jiang further teaches “the spatial pairing informs the user about the context of the entry point at which the application 610 will be launched upon selection. For example, if the Cooking Mama.RTM. application 610 were launched from the search-results page 605, the proximity of the application 610 to the search result 615 (describing a recipe for chicken pizza) may alert the user that, upon selecting the application 610, the context of the entry point of Cooking Mama.RTM. might pertain to chicken pizza.” (Para. [0088]). Therefore, Jiang teaches displaying a result of a task (creating the entry point of the Cooking Mama application) performed using the content associated with the first search result (text/context pertaining to “chicken pizza”).

In accordance with a determination that the input corresponding to the first search result is a request to view the first search result:
Jiang teaches “if a website (e.g., website 811 that includes the web address 815) that exhibits parity with the candidate app (e.g., application 810 Yelp.RTM.) is listed towards the top of a search-results page 805 presented at the UI display 800, then there is a strong indication that the candidate app is useful to complete the user's tasks” and “select[ing] the candidate application for incorporation within the search results” (Para. [0059]). Jiang further teaches “the spatial pairing informs the user about the context of the entry point at which the application 610 will be launched upon selection. For example, if the Cooking Mama.RTM. application 610 were launched from the search-results page 605, the proximity of the application 610 to the search result 615 (describing a recipe for chicken pizza) may alert the user that, upon selecting the application 610, the context of the entry point of Cooking Mama.RTM. might pertain to chicken pizza.” (Para. [0088]).

Ceasing display of the first search result and the second search result in the user interface of the search application; and
Jiang teaches “upon receiving the selection of the subject application at the web browser 222, the web browser 222 may attempt to open the subject application on the client device 210. As depicted at operation 445, when the subject application is installed on the client device 210, the subject application is launched directly from the search-results page.” (Para. [0090]). Therefore, Jiang teaches ceasing display of the search results on the search-results page in favor of the subject application being opened/launched, and thus displayed.

Displaying, via the display, a user interface of a second software application corresponding to the first search result.
Jiang teaches “the spatial pairing informs the user about the context of the entry point at which the application 610 will be launched upon selection. For example, if the Cooking Mama.RTM. application 610 were launched from the search-results page 605, the proximity of the application 610 to the search result 615 (describing a recipe for chicken pizza) may alert the user that, upon selecting the application 610, the context of the entry point of Cooking Mama.RTM. might pertain to chicken pizza.” (Para. [0088]). Therefore, Jiang teaches displaying a user interface of a second software application (launched application) corresponding to the first search result.

Jiang also explicitly teaches “open the subject application on the client device 210.” (Para. [0090]). Jiang further teaches multiple applications corresponding to a single item by teaching “the application(s) 720 that are relevant to the term or phase are determined (see block 1006)…These relevant application(s) 720 may then be presented in a pop-up display window 710 that overlays at least a portion of the display area 700 (see block 1008)” (Para. [0102]).

Thus it would have been obvious to one of ordinary skill in the art, having the teachings of Jiang and Ray at the time that the claimed invention was effectively filed, to have combined the system and method for associating search results with actionable applications after the search results are received, as taught by Jiang, with the actionable search results, as taught by Ray. One would have been motivated to make such a combination because it would have been obvious to a person having ordinary skill in the art that associating actionable applications with search results after the search has been performed (Jiang - Para. [0009]) allows for a more dynamic and robust system that makes the associations in real time, rather than relying on a pre-compiled list of applications for each search result that can become outdated as applications are created, deleted, or replaced.

Regarding Claim 2: Ray and Jiang further teach:

Wherein the preview of content of the first search result includes a plurality of different actionable user interface objects that, when activated, cause the electronic device to perform different operations associated with the first search result.
Ray teaches “pane 302 enables selected search result(s) 222 to be previewed” (Para. [0040]), thereby teaching a request to preview a respective search result that has been selected to be previewed.
Ray further teaches “Pane 302 also includes an available actions area 308A…308A includes actions that are available to be initiated with respect to the previewed item 306 that corresponds to the selected search result” (Para. [0042]).

Regarding Claim 3: Ray and Jiang further teach:

Wherein detecting the input corresponding to the first search result includes detecting a selection of a preview affordance associated with the first search result.
Ray teaches “pane 302 enables selected search result(s) 222 to be previewed” (Para. [0040]), thereby teaching a request to preview a respective search result that has been selected to be previewed.

Regarding Claim 4: Ray and Jiang further teach:

Wherein the search input is received at the search application.
Ray teaches “processor-executable instructions 110 include…applications” (Para. [0070] & Fig. 9, Element 110), thereby teaching performing the search using a search application.

Regarding Claim 9: Ray and Jiang further teach:

Determining whether the search input meets a computation criteria; and
Ray teaches “the selected category 216, as visually represented by selection highlighter indicator 218, determines the context for a requested search,” where “if the images category 216 is selected, the input search term 'Terms XYZ' is applied to, for example, a database or index of data, that pertains to images” and “if the mail category 216 is selected, then the input search term 'Terms XYZ' is applied to information pertaining to mail” (Para. [0036]). Therefore, Ray teaches determining whether the search input contains a selected category and, in response, displaying a computation result of applying the input search term against a database or index of data pertaining to the selected category.

In accordance with a determination that the search input meets the computation criteria, displaying a computation result of the one or more search results, the computation result based on a computation performed based on at least a portion of the search input.
Ray teaches “the selected category 216, as visually represented by selection highlighter indicator 218, determines the context for a requested search,” where “if the images category 216 is selected, the input search term 'Terms XYZ' is applied to, for example, a database or index of data, that pertains to images” and “if the mail category 216 is selected, then the input search term 'Terms XYZ' is applied to information pertaining to mail” (Para. [0036]). Therefore, Ray teaches determining whether the search input contains a selected category and, in response, displaying a computation result of applying the input search term against a database or index of data pertaining to the selected category.

Regarding Claim 10: Ray and Jiang further teach:

Wherein the one or more actionable user interface objects include a second actionable user interface object, and the second actionable interface object initiates a communication with an entity associated with the respective search result when the second actionable user interface object is activated.
Jiang teaches an actionable user interface object as being an application associated with a search result where “upon detecting a user-initiated selection of the app within the displayed search results, one or more actions may occur…if the app is recognized as being listed in the device's inventory (i.e., previously installed on the device), the search engine may automatically launch the app” (Paras. [0009]-[0010]).
Regarding Claim 11: Ray and Jiang further teach:

Wherein the one or more actionable user interface objects include a third actionable user interface object, and the third actionable interface object initiates playback of a media content associated with the respective search result when the third actionable user interface object is activated.
Ray teaches “the displayed item 306 may include text, images, video, some combination thereof, and so forth, depending on the selected category 216 as well as the content of the particular item 306” (Para. [0040]), thereby teaching an actionable user interface object for initiating playback of media content associated with the respective search result when “the displayed item…include[s]…video” and playback of the video is initiated (Para. [0040]).

Regarding Claim 14: Some of the limitations herein are similar to some or all of the limitations of Claim 1. Ray and Jiang further teach: A non-transitory computer-readable storage medium storing one or more programs configured to be executed by one or more processors of an electronic device with a display (Ray - Para. [0071]).

Regarding Claim 15: All of the limitations herein are similar to some or all of the limitations of Claim 1.
Regarding Claim 16: All of the limitations herein are similar to some or all of the limitations of Claim 2.
Regarding Claim 17: All of the limitations herein are similar to some or all of the limitations of Claim 3.
Regarding Claim 18: All of the limitations herein are similar to some or all of the limitations of Claim 4.
Regarding Claim 23: All of the limitations herein are similar to some or all of the limitations of Claim 9.
Regarding Claim 24: All of the limitations herein are similar to some or all of the limitations of Claim 10.
Regarding Claim 25: All of the limitations herein are similar to some or all of the limitations of Claim 11.
Regarding Claim 28: All of the limitations herein are similar to some or all of the limitations of Claim 2.
Regarding Claim 29: All of the limitations herein are similar to some or all of the limitations of Claim 3.
Regarding Claim 30: All of the limitations herein are similar to some or all of the limitations of Claim 4.
Regarding Claim 35: All of the limitations herein are similar to some or all of the limitations of Claim 9.
Regarding Claim 36: All of the limitations herein are similar to some or all of the limitations of Claim 10.
Regarding Claim 37: All of the limitations herein are similar to some or all of the limitations of Claim 11.

Claims 5-8, 12-13, 19-22, 26-27, 31-34, and 38-39 are rejected under 35 U.S.C. 103 as being unpatentable over Ray and Jiang, and further in view of Yang et al. (U.S. Pre-Grant Publication No. 2015/0161212, hereinafter referred to as Yang).

Regarding Claim 5: Ray and Jiang explicitly teach all of the elements of the claimed invention as recited above except: wherein the preview of content of the first search result replaces the first search result in the user interface of the search application.

However, in the related field of endeavor of previewing search results, Yang teaches:

Wherein the preview of content of the first search result replaces the first search result in the user interface of the search application.
Yang teaches “presentation of the preview graphic 220, or another extended result card, can include initially presenting only a portion (e.g., less than 100%) of the preview graphic 220, or another extended result card, at one end of the content portion 210, and animating the preview graphic 220 to move in the direction of the user swipe (e.g., from right to left in Fig. 2B)” (Para. [0049]), thereby teaching preview content replacing a respective search result.
Thus it would have been obvious to one of ordinary skill in the art, having the teachings of Yang, Jiang, and Ray at the time that the claimed invention was effectively filed, to have combined the system and method of displaying extended result content using the swiping gesture, as taught by Yang, with the system and method for associating search results with actionable applications after the search results are received, as taught by Jiang, and the actionable search results, as taught by Ray. One would have been motivated to make such a combination because Yang teaches an “increase in usability of tablet devices in search environments” by configuring the tablet device “to accept additional user interactions with a search result that allow the user to access extended result content” (Para. [0019]).

Regarding Claim 6: Ray, Jiang, and Yang further teach:

Wherein the preview of content of the first search result includes first information when the search input includes a first term, and the preview of content of the first search result includes second information when the search input includes a second term, the first term different from the second term.
Yang teaches “the extended result content that the user can access through user interaction with a search result can also include other information that has been identified as relevant to the search query for which the search result was provided” (Para. [0022]), thereby teaching providing access to extended result content based on an identified relevance of the extended result content to the search query. When the search query changes, while still returning the same search result, the extended result content accessible through user interaction can change as well, which changes the information and order of the extended result content as some extended result content is displayed and other content is not.

Regarding Claim 7: Ray, Jiang, and Yang further teach:

Wherein the preview of content of the first search result includes information in a first order when the search input includes a first term, and the preview of content of the first search result includes information in a second order when the search input includes a second term, the first term different from the second term.
Yang teaches “the extended result content that the user can access through user interaction with a search result can also include other information that has been identified as relevant to the search query for which the search result was provided” (Para. [0022]), thereby teaching providing access to extended result content based on an identified relevance of the extended result content to the search query. When the search query changes, while still returning the same search result, the extended result content accessible through user interaction can change as well, which changes the information and order of the extended result content as some extended result content is displayed and other content is not.

Regarding Claim 8: Ray, Jiang, and Yang further teach:

Detecting a swipe gesture on a touch-sensitive surface at a location that corresponds to a location of a first portion of the first search result on the display; and
Yang teaches “a user swipe of the search result may be indicative of a request for the extended result content, such that the extended result content may be presented in response to detection of a user swipe” (Para. [0023]).

In response to detecting the swipe gesture on the touch-sensitive surface at the location that corresponds to the location of the first portion of the first search result on the display, replacing the display of the first portion of the first search result with display of a second portion of the first search result.
Yang teaches “a user swipe of the search result may be indicative of a request for the extended result content, such that the extended result content may be presented in response to detection of a user swipe” (Para. [0023]). Yang further teaches "presentation of the preview graphic 220, or another extended result card, can include initially presenting only a portion (e.g., less than 100%) of the preview graphic 220, or another extended result card, at one end of the content portion 210, and animating the preview graphic 220 to move in the direction of the user swipe (e.g., from right to left in Fig. 2B)” (Para. [0049]), thereby teaching preview content replacing a respective search result. Regarding Claim 12: Ray, Jiang, and Yang further teach: Wherein the one or more actionable user interface objects include a fourth actionable user interface object, and the fourth actionable user interface object initiates display of webpage content associated with the respective search result when the fourth actionable user interface object is activated. Yang teaches “the user device 106 can be configured to insert at least a portion of a preview graphic into a search result with which the specified user interaction occurred…for example, in response to detecting the specified user interaction with a particular search result, the user device 106 can remove a portion of text from the search result and insert a preview graphic that visually represents three web pages of a web site to which the search result links” (Para. [0038]), thereby teaching a user activation that displays webpage content associated with a respective search result. Regarding Claim 13: Ray, Jiang, and Yang further teach: Wherein the one or more user interface objects includes a fifth actionable user interface object, the fifth actionable user interface object initiates display of map content associated with the first search result when the fifth actionable user interface object is activated. 
Yang teaches “a user can request presentation of extended result content for a particular search result by performing a user interaction that is indicative of a request for presentation of the extended result content…the extended result content can include…a map indicating a business location of a business that has been identified as relevant to the search query” (Para. [0022]). Yang further teaches “A determination is made whether user interaction with the search result has occurred (306). In some implementations, the user interaction is a user interaction that is indicative of a user request for presentation of extended result content for the search result. For example, the user interaction can be user interaction with an extended result interface element (e.g., a virtual button) that initiates presentation of the extended result content” (Para. [0076]). Regarding Claim 19: All of the limitations herein are similar to some or all of the limitations of Claim 5. Regarding Claim 20: All of the limitations herein are similar to some or all of the limitations of Claim 6. Regarding Claim 21: All of the limitations herein are similar to some or all of the limitations of Claim 7. Regarding Claim 22: All of the limitations herein are similar to some or all of the limitations of Claim 8. Regarding Claim 26: All of the limitations herein are similar to some or all of the limitations of Claim 12. Regarding Claim 27: All of the limitations herein are similar to some or all of the limitations of Claim 13. Regarding Claim 31: All of the limitations herein are similar to some or all of the limitations of Claim 5. Regarding Claim 32: All of the limitations herein are similar to some or all of the limitations of Claim 6. Regarding Claim 33: All of the limitations herein are similar to some or all of the limitations of Claim 7. Regarding Claim 34: All of the limitations herein are similar to some or all of the limitations of Claim 8. 
Regarding Claim 38: All of the limitations herein are similar to some or all of the limitations of Claim 12. Regarding Claim 39: All of the limitations herein are similar to some or all of the limitations of Claim 13. Response to Amendment Applicant’s Amendments, filed on 11/14/2025, are acknowledged and accepted. In light of the Amendments filed on 11/14/2025, the claim objections to claims 13, 27, and 39 have been withdrawn. Response to Arguments On page 15 of the Remarks filed on 11/14/2025, Applicant argues that “the content being displayed in Jiang is not associated with a search at all and is displayed in a generic webpage. Thus, Jiang fails to disclose or suggest ‘content associated with the first search result displayed within the user interface of the search application,’ much less a selection of such content as required by the amended independent claims”. Applicant’s argument is not convincing because Jiang was relied upon as teaching detecting the input corresponding to content/text displayed (Para. [0102]), where Ray teaches a display of search results comprising content, including text, associated with the search results (Para. [0040] & Fig. 3). On pages 15-16 of the Remarks filed on 11/14/2025, Applicant argues that “Ray and Jiang also fail to disclose or suggest, alone or in combination, ‘in response to detecting the input corresponding to the content associated with the first search result displayed within the user interface of the search application: ceasing display of the user interface of the search application,’ as required by amended claims 1, 14, and 15” because “Jiang merely discloses that a pop-up including applications relevant to the terms selected can be displayed over the webpage including the content. However, Jiang is silent with regard to what is displayed when one of the applications is selected and if the webpage ceases to be displayed when an application is selected. 
Rather, Jiang merely discloses in a separate embodiment that "if the app is recognized as being listed in the device's inventory..., the search engine may automatically launch the app." (Jiang at [0010].)”. Applicant’s argument is not convincing because launching an application of interest is understood as opening the application in the display on top of any other applications. This is further reinforced by Jiang teaching “Upon launching the app at an entry point (e.g., in accordance with context of a user's search), the user may interact with the app for a period of time prior to returning to conduct further searches on the search engine” (Para. [0014]). On page 16 of the Remarks filed on 11/14/2025, Applicant argues that “as discussed above, Jiang fails to disclose that the alleged ‘content associated with the first search result’ is ‘displayed within the user interface of the search application,’ because Jiang does not disclose that a ‘user interface of the search application’ is displayed while the content is being displayed. Rather, Jiang discloses that the content is displayed within a webpage. Thus, even if something ceases to be displayed when an application is opened in the system of Jiang, which Applicant does not concede, the ‘user interface of the search application’ cannot cease to be displayed as it was not displayed prior to detecting the input.” Applicant’s argument is not convincing because Jiang was relied upon as teaching detecting the input corresponding to content/text displayed (Para. [0102]), where Ray teaches a display of search results comprising content, including text, associated with the search results (Para. [0040] & Fig. 3). 
On page 16 of the Remarks filed on 11/14/2025, Applicant argues that “Ray and Jiang fail to disclose or suggest, alone or in combination, ‘displaying, via the display, a result of a task performed using the content associated with the first search result in a user interface of a first software application corresponding to the first search result,’ much less doing so ‘in response to detecting the input corresponding to the content associated with the first search result displayed within the user interface of the search application,’ as required by amended claims 1, 14, and 15” because “Jiang is silent with regard to what is displayed when one of the applications is selected and thus fails to disclose or suggest that the ‘user interface of a first software application’ is displayed, much less ‘a result of a task performed using the content associated with the first search result,’ as recited by the amended claims.” Applicant’s argument related to the amended limitation has been considered but was not found to be convincing in overcoming the previously cited prior art, and has been addressed in the rejection above. Conclusion The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Daher et al. (U.S. Pre-Grant Publication No. 2012/0323897) teaches providing query-dependent audio and video clip previews. Using the systems and methods described herein, an identification of an audio or video clip relevant to a user search query is received. The user search query has one or more keywords. Occurrences of the keywords and the locations of the occurrences are identified in a transcription of the identified audio or video clip. Clip segments are extracted from the audio or video clip. Each extracted clip segment includes an identified keyword occurrence. A query-dependent clip preview is created that includes at least one extracted clip segment. 
The query-dependent clip preview can be provided in search results for the user search query to provide an informative preview that is specific to the query to which the clip is relevant. Silber et al. (U.S. Pre-Grant Publication No. 2013/0275422) teaches providing search result page previews. In one aspect, a method includes receiving data that specify a set of search results responsive to a search query. Query-relevant content is selected to be included in a page preview for at least one of the search results. In turn, data that cause presentation of the page preview are provided. The data provided can cause presentation of the query-relevant content at an initial zoom level and at a higher zoom level, where the initial zoom level is a zoom level at which both the query-relevant content and other content from the resource are presented. The page preview can include a page tear that defines multiple portions of the page preview for a resource. Choi et al. (U.S. Pre-Grant Publication No. 2011/0066610) teaches providing preview information for search results in real time. The search method includes receiving a list of search result items and preview information from a server in real time and outputting the list and the preview information. Therefore, a search time can be efficiently reduced, which enables a user to obtain a desired search result more rapidly. Jamil et al. (U.S. Pre-Grant Publication No. 2009/0234811) teaches context information for a user of a device is identified and is used to identify a set of keywords based at least in part on a current Web page being displayed and one or more previous Web pages displayed for the user. The set of keywords and/or information regarding previous searches are used to identify a set of query terms. The set of query terms are displayed as part of a user interface. Additionally, a user selection of a search preview option can be received while displaying a Web page. 
In response to receiving the user selection of the search preview option, a user-entered query term is sent to a search engine. Search results based on the query term are received from the search engine, and both the search results and the Web page are displayed concurrently in a same window. Palermiti, II (U.S. Pre-Grant Publication No. 2012/0109986) teaches one or more systems and/or techniques for presenting visual previews of search results are disclosed herein. In particular, a user may reference an identifier (e.g., "Bill") that may be used as search criteria to retrieve corresponding objects (e.g., photos of Bill). A visual preview of the retrieved objects may be presented to the user. The user may quickly view visual previews of search results by referencing various identifiers without committing to a particular search result set. Arrouye et al. (U.S. Pre-Grant Publication No. 2008/0033919) teaches “The following figures show examples of previews or other representations which are resizeable or zoomable or scrollable or pageable through. FIG. 30A shows an example of a preview 3001 displayed on a display device, either within a search result window or as an overlay on the window. The preview 3001 is scrollable and resizeable; it may be scrolled using any one of the scroll controls 3002, 3003 and/or 3004. It may be resized using the resize control 3005. FIG. 30B shows a preview 3010 which can display multiple documents or items in a scrollable format. The view shown in FIG. 30B of the preview 3010 shows only one document and another document can be selected for viewing using interface controls 3015, 3013, and 3017. The view of preview 3010 is scrollable using scroll controls 3011, 3012A and/or 3012B. The view of preview 3010 is also resizeable using resize control 3005. The user can also switch to display multiple documents or items at once in the view of preview 3010 by selecting the user interface control 3019 which will cause the preview shown in FIG. 
30B to appear similar to the preview shown in FIG. 30C which shows multiple documents concurrently. The preview shown in FIG. 30C may also be scrollable.” (Para. [0108]). Chi (U.S. Pre-Grant Publication No. 2009/0089416) teaches “a page preview may be displayed upon the user selection of a hyperlink. For example, when a user selects a hyperlink, a small preview (such as a thumbnail image) may be generated providing the user an opportunity to preview the page prior to navigating to the page. Alternatively, a larger preview may be displayed, such as by dimming the search result page and overlaying a preview of the requested page.” (Para. [0048]). Jhaveri et al. (U.S. Pre-Grant Publication No. 2009/0282363) teaches in response to receiving a search query, at least one individual search result is presented in an overlay window, the overlay window being configured to overlay at least a portion of a document viewing window. Navigation between the overlay window and the document viewing window may be performed in response to receiving input of at least one pre-determined command. Content associated with a selected individual search result (or other document identifier) may be presented in the document viewing window and, substantially simultaneously, the overlay window may be hidden from view. Subsequently, in response to receiving at least one pre-determined command, the overlay window may be re-presented such that it again overlays at least a portion of the document viewing window. THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a). A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. 
In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action. Any inquiry concerning this communication or earlier communications from the examiner should be directed to ROBERT F MAY whose telephone number is (571)272-3195. The examiner can normally be reached Monday-Friday 9:30am to 6pm. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Boris Gorney can be reached on 571-270-5626. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). 
If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/ROBERT F MAY/
Examiner, Art Unit 2154
1/5/2026

/SYED H HASAN/
Primary Examiner, Art Unit 2154

Prosecution Timeline

Jun 05, 2024
Application Filed
Dec 11, 2024
Response after Non-Final Action
Dec 28, 2024
Non-Final Rejection — §103
Mar 25, 2025
Examiner Interview Summary
Mar 25, 2025
Applicant Interview (Telephonic)
Mar 31, 2025
Response Filed
Jun 26, 2025
Final Rejection — §103
Aug 18, 2025
Examiner Interview Summary
Aug 18, 2025
Applicant Interview (Telephonic)
Aug 26, 2025
Request for Continued Examination
Sep 05, 2025
Response after Non-Final Action
Sep 27, 2025
Non-Final Rejection — §103
Oct 24, 2025
Applicant Interview (Telephonic)
Oct 24, 2025
Examiner Interview Summary
Nov 14, 2025
Response Filed
Jan 05, 2026
Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12586145
METHOD AND APPARATUS FOR EDITING VIDEO IN ELECTRONIC DEVICE
2y 5m to grant Granted Mar 24, 2026
Patent 12468740
CATEGORY RECOMMENDATION WITH IMPLICIT ITEM FEEDBACK
2y 5m to grant Granted Nov 11, 2025
Patent 12367197
Pipelining a binary search algorithm of a sorted table
2y 5m to grant Granted Jul 22, 2025
Patent 12360955
Data Compression and Decompression Facilitated By Machine Learning
2y 5m to grant Granted Jul 15, 2025
Patent 12347550
IMAGING DISCOVERY UTILITY FOR AUGMENTING CLINICAL IMAGE MANAGEMENT
2y 5m to grant Granted Jul 01, 2025
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

5-6
Expected OA Rounds
76%
Grant Probability
99%
With Interview (+29.7%)
3y 3m
Median Time to Grant
High
PTA Risk
Based on 286 resolved cases by this examiner. Grant probability derived from career allow rate.
