Prosecution Insights
Last updated: April 19, 2026
Application No. 18/329,813

FUNCTIONAL TESTING OF USER INTERFACE BASED APPLICATIONS USING HEAT MAPS

Status: Non-Final Office Action (§103), OA Round 3
Filed: Jun 06, 2023
Examiner: GALERA, PATRICK PAUL CONTRER
Art Unit: 2617
Tech Center: 2600 (Communications)
Assignee: DELL PRODUCTS, L.P.

Grant Probability: 86% (Favorable); 99% with interview
Expected OA Rounds: 3-4
Estimated Time to Grant: 2y 5m

Examiner Intelligence

Career allow rate: 86%, above average (6 granted / 7 resolved; +23.7% vs TC avg)
Interview lift: +16.7% among resolved cases with interview
Typical timeline: 2y 5m average prosecution; 21 applications currently pending
Career history: 28 total applications across all art units

Statute-Specific Performance

Statute   Share of Rejections   vs TC Avg
§101            2.1%             -37.9%
§103           72.9%             +32.9%
§102           18.8%             -21.2%
§112            5.2%             -34.8%

Deltas are measured against the Tech Center average estimate. Based on career data from 7 resolved cases.
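As a quick sanity check on the table, assuming each "vs TC avg" delta is a plain difference between the examiner's share and the Tech Center average share (an assumption; the tool does not define the metric), the implied baseline can be recovered by subtraction:

```python
# Recover the implied Tech Center baseline from the table above, assuming
# delta = examiner share - TC average share (an assumption, not a documented
# formula). Values are the percentages shown in the table.
examiner_share = {"101": 2.1, "103": 72.9, "102": 18.8, "112": 5.2}
delta_vs_tc    = {"101": -37.9, "103": +32.9, "102": -21.2, "112": -34.8}

for statute, share in examiner_share.items():
    implied_tc_avg = share - delta_vs_tc[statute]
    print(f"§{statute}: examiner {share}% -> implied TC avg ~{implied_tc_avg:.1f}%")
```

Run as-is, every statute's implied baseline comes out near 40%, suggesting the deltas are computed against a single common reference rather than per-statute averages.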

Office Action (§103)

DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Amendment/Arguments

Applicant's amendments filed on 12/03/2025 have been considered. Applicant's remarks state at page 7: "Claims 1-4, 7-14 and 16-23 will be currently pending in the subject application and under consideration after entry of the subject Reply. Claims 1, 8-11 and 17-19 have been amended, claim 5 has been canceled (and incorporated into claim 1) and claim 23 is new, as indicated at pages 2-6 of the Reply. No new matter has been introduced by the herein amendments."

Applicant argues on page 13 of the remarks that paragraph 66 of Nagamalla "does not state or suggest that the behavior tracking system is executed during execution of the declarative testing tool". The examiner respectfully disagrees. Applicant's arguments filed on 12/03/2025 have been fully considered but they are not persuasive, because the heatmap generation tool of Nagamalla is executed during execution of the declarative testing tool.

Nagamalla: ¶66, the automated declarative testing tool 140 re-creates the user experience. ¶36, the user experience refers to the set of actions that a user may perform on a site, such as clicking and typing. ¶110, ". . . while testing the system under test . . . The declarative testing tool 140 is simulating a user experience on the system side." (NOTE: The simulation of the user experience is computer generated, and not done by an actual person during testing. Therefore, Nagamalla's declarative testing tool re-creates and simulates the user experience, i.e., re-creates and simulates mouse clicks and keyboard typing during testing.) ¶66, "By using the declarative testing tool 140 to test the system under test 210 (e.g., a software application residing on application server(s) 118), the declarative testing tool 140 may be able to re-create the user experience . . . the user experience is tracked by the user behavior tracking system 143." (NOTE: Executing automated testing by the declarative testing tool means the declarative testing tool re-creates and simulates mouse clicks and keyboard strokes during testing; the user experience, which is the simulated mouse clicks and keyboard strokes during the execution of the declarative testing tool, "is tracked" by the user behavior tracking system. The only component in Nagamalla's user behavior tracking system that "tracks" the user experience is Google Analytics, which is implemented as its heatmap generation tool. When it does its tracking, it is executed; it cannot track if it is not executed. Therefore, the user behavior tracking system inherently executes its heatmap generation tool, Google Analytics, during the execution of the declarative testing tool that simulates the user experience, in order to track the user experience.)
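For readers less familiar with the testing architecture at issue, the following minimal Python sketch models the relationship described above: the tracking tool is executing while the automated UI test runs, so each simulated click is captured as it occurs. All class and element names are hypothetical stand-ins for Nagamalla's declarative testing tool 140 and the Google Analytics-backed user behavior tracking system 143; this is an illustration, not either reference's actual code.

```python
# Illustrative model only: the tracker runs *during* execution of the
# automated UI test, so every simulated click is recorded as it happens.

class HeatmapTracker:
    """Stands in for the heatmap generation tool (e.g., Google Analytics)."""
    def __init__(self):
        self.events = []

    def record(self, element_id):
        # Tracking occurs while the test is running; recording an event
        # requires the tracker itself to be executing at that moment.
        self.events.append(element_id)

class AutomatedUITest:
    """Stands in for the declarative testing tool re-creating a user experience."""
    def __init__(self, tracker):
        self.tracker = tracker

    def click(self, element_id):
        # A simulated user action; the tracker captures it at the moment
        # it occurs, i.e., during execution of the test program.
        self.tracker.record(element_id)

tracker = HeatmapTracker()
test = AutomatedUITest(tracker)
for element in ["login-button", "search-box", "submit"]:
    test.click(element)
print(tracker.events)  # ['login-button', 'search-box', 'submit']
```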
Claim Rejections - 35 U.S.C. § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1-3, 7-12, 14, 16-19, and 22-23 are rejected under 35 U.S.C. 103 as being unpatentable over Nagamalla et al. (US 20150363304 A1, hereinafter "Nagamalla") in view of Patnaik (US 11436130 B1, hereinafter "Patnaik").

In light of the specification, the examiner interprets a "heatmap generation tool", per applicant's specification ¶19, to be "one or more existing heatmap generation tools (e.g., Google Analytics™, Microsoft Clarity™, and similar heatmap generation tools)".

Regarding claim 1, Nagamalla teaches: a system comprising: at least one memory that stores computer-executable components (Nagamalla: ¶197, memory) (Nagamalla: ¶197, "The memory 1230 may include a main memory 1232, a static memory 1234, and a storage unit 1236 accessible to the processors 1210 via the bus 1202. The storage unit 1236 may include a machine-readable medium 1238 on which is stored the instructions 1216 embodying any one or more of the methodologies or functions described herein. . ."); and

at least one processor that executes the computer-executable components stored in the at least one memory (Nagamalla: ¶198, one or more processors) (Nagamalla: ¶198, ". . . such that the instructions, when executed by one or more processors of the machine 1200 (e.g., processors 1210), cause the machine 1200 to perform any one or more of the methodologies described herein"), wherein the computer-executable components comprise:

a heatmap component (Nagamalla: ¶53, user behavior tracking system) that executes a heatmap generation tool (Nagamalla: ¶53, Google Analytics) during execution of an automated user interface test program on the application (Nagamalla: ¶53, "The user behavior tracking system 143 may track user activity using a variety of site tracking mechanisms. . . the user's clicks are recorded . . . the data warehouse 142 and the user behavior tracking system 143 may be partially or fully implemented using Google Analytics, a service offered by Google, for providing a website statistics service . . ."; Nagamalla: ¶55, ". . . During testing of the system under test, the declarative testing tool 140 may test one or more user experiences . . ."; Nagamalla: ¶66, ". . . using the declarative testing tool 140 to test the system under test 210 (e.g., a software application residing on application server(s) 118), the declarative testing tool 140 may be able to re-create the user experience . . . the user experience is tracked by the user behavior tracking system 143. . ."; NOTE: Paragraph 36 describes that the user experience refers to a set of actions that a user may perform on a site, such as clicking and typing. Paragraph 86 describes that the user experience is a simulation that is being tested during testing of the system under test. The application is the system under test. The automated user interface test program is the declarative testing tool 140. The declarative testing tool re-creates the user experience as a simulation during testing of the system under test, and the user experience is tracked by the user behavior tracking system during testing, as described in paragraph 66. The heatmap component is the user behavior tracking system. The heatmap generation tool used by the user behavior tracking system 143 is Google Analytics. Since the user experience is tracked during testing by the user behavior tracking system 143, which is implemented using a heatmap generation tool, the heatmap generation tool is executed during testing, i.e., during execution of an automated user interface test program.)

to track executed interface elements (Nagamalla: ¶52, ". . . The user behavior tracking system 143 tracks user activity on a site. . . user activity. . . user clicks"; Nagamalla: ¶66, ". . . the user experience is tracked by the user behavior tracking system 143. . ."; NOTE: when a click is registered, it executes an interface element. As discussed above with reference to paragraph 66, the user behavior tracking system tracks the user experience during testing, the user experience being the simulated clicks and actions on a web application under test.)

that were executed by the automated user interface test program (Nagamalla: ¶66-71, ". . . By using the declarative testing tool 140 to test the system under test 210 (e.g., a software application residing on application server(s) 118), the declarative testing tool 140 may be able to re-create the user experience. . . Selenium-WebDriver"; Selenium is a tool for automating web application testing, and is also a tool disclosed in applicant's specification ¶2) (Nagamalla: ¶68, "A WebDriver refers to a tool for automating web application testing, and in particular to verify that they work as expected"; Nagamalla: ¶70, "a WebDriver framework 270 may be one framework used to test a UI. . ."; Nagamalla: ¶71, ". . . the WebDriver framework 270 may be a Selenium-WebDriver. . ."; NOTE: Referencing Nagamalla Fig. 2, the WebDriver framework is part of the declarative testing tool 140. Paragraph 66 describes that the declarative testing tool re-creates the user experience, and paragraph 86 describes that the user experience re-created during the automated testing is a simulation. Therefore the clicks that execute interface elements were executed by the automated user interface test program.)

and a reporting component (Nagamalla: ¶74, GEI 260) that generates test coverage report data for the automated user interface test program (Nagamalla: ¶74, generate steps HTML report) (Nagamalla: ¶74, "For various embodiments, the GEI 260 may include one or more of the following features or functions. The framework may categorize the error when a user experience fails and propagate appropriate exception to denote infrastructure test or a software under test issue; enable various users to build automation of user experiences to easily test; provide maven archetype to create test projects including all the dependencies like WebDriver, test ng, and the like; provide HTML reporter to generate steps HTML report; and provide Jenkins job configuration to easily create and schedule jobs"; Nagamalla: ¶93, ". . . to generate test results. . .") that identifies the executed interface elements (Nagamalla: ¶134, "A step may be an action (like a click, typing, mouse over etc.) or an assert (assert element presence, attribute presence or text contains, etc.)") (NOTE: As referenced above, Nagamalla's GEI 260 provides an HTML reporter to generate a "steps" HTML report. A step may be an action like a click, as described in Nagamalla ¶134. When a user interface element is clicked, or interacted with, that user interface element is executed. Since Nagamalla's GEI 260 provides an HTML reporter to generate a "steps" HTML report, the report provides test coverage data regarding the executed interface elements. Also, a person having ordinary skill in the art would have recognized to include the data from the user behavior tracking system 143, which uses Google Analytics, in the test coverage report data for further analysis of the conducted test. A PHOSITA also would have recognized that it is illogical to employ such a data collection tool and not use the data it generates.)

However, Nagamalla fails to teach tracking executed interface elements among the testable interface elements that were executed by the automated user interface test program (NOTE: Nagamalla only tracks executed interface elements, not the set of testable interface elements), and a reporting component that determines one or more non-executed elements of the testable interface elements that were not executed by the automated user interface test program based on the executed interface elements excluding the one or more non-executed elements, and generates test coverage report data for the automated user interface test program that identifies the testable interface elements and the one or more non-executed elements.

The analogous art Patnaik teaches these limitations. Patnaik teaches: an extraction component that determines testable interface elements (Patnaik: col 4 lines 57-64, ". . . a web scraper 204. . . that scrapes an application to be tested to identify elements of the application. . . then stores, in. . . an application element repository 206. . . an indication of the elements of the application identified from the scraping. . ."; Patnaik: col 2 lines 14-15, ". . . The scraping may enable all elements of the application to be identified"; Patnaik: col 2 lines 21-29, ". . . the elements of the application may be visual elements of the application, such as input fields of the application (e.g. included in the web pages) that are utilized for receiving user input during execution of the application.") (NOTE: The extraction component is the web scraper. Patnaik's web scraper 204 is part of the testing platform 202 and can determine and distinguish visual elements of the application that are utilized for receiving user input. The visual elements that are utilized for receiving user input are testable interface elements. Therefore, Patnaik's testing tool, which uses a web scraper, determines testable interface elements.)

It would have been obvious to a PHOSITA to combine Nagamalla and Patnaik, modifying Nagamalla's heatmap component to incorporate Patnaik's method of using a web scraper to determine testable interface elements. A predictable result of the combination is: a heatmap component that tracks executed interface elements (tracked by Nagamalla's user behavior tracking system) among the testable interface elements (identified by Patnaik's web scraper) that were executed by the automated user interface test program. The reason for doing so is to scrape the application after each change made to the application, in order to ensure that the application element repository 206 remains up-to-date with current elements included in the application (Patnaik: col 5 lines 1-7).
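As an illustration of the Patnaik-style extraction step, here is a minimal sketch that scrapes markup for the interactive elements a user could exercise. It uses BeautifulSoup for brevity, the HTML is a stand-in for the application under test, and the trailing count corresponds to the "total amount" point discussed under claim 7 below. This is illustrative only, not Patnaik's implementation.

```python
# Minimal sketch: scrape the application's markup and collect the interface
# elements a user could interact with ("testable interface elements").
from bs4 import BeautifulSoup

html = """
<form>
  <input id="username" type="text">
  <input id="password" type="password">
  <button id="submit">Sign in</button>
  <a id="forgot" href="/reset">Forgot password?</a>
</form>
"""

soup = BeautifulSoup(html, "html.parser")
# Interactive tags stand in for "visual elements utilized for receiving user input".
testable = {tag.get("id") for tag in soup.find_all(["a", "button", "input"])}

print(sorted(testable))  # ['forgot', 'password', 'submit', 'username']
print(len(testable))     # total count of testable elements (cf. claim 7)
```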
It would also have been an obvious design choice among a finite number of solutions to a PHOSITA, before the effective filing date of the claimed invention, to include or not include the data of the testable interface elements identified using Patnaik's method in Nagamalla's generated test coverage report. Including the data of the testable interface elements would have predictably resulted in Nagamalla's reporting component generating test coverage report data for the automated user interface test program that identifies the testable interface elements and the executed interface elements. The reason for including the data of the testable interface elements is to easily visualize the number of executed elements in contrast to the number of testable interface elements, providing meaningful data to developers for debugging and testing verification purposes.

Although the combination of Nagamalla and Patnaik provides data for the executed interface elements and data for the testable interface elements, the combination still fails to teach a reporting component that determines one or more non-executed elements of the testable interface elements that were not executed by the automated user interface test program based on the executed interface elements excluding the one or more non-executed elements, and that generates test coverage report data identifying the one or more non-executed elements.

It would have been an obvious design choice for a PHOSITA to try one or more of a finite set of known mathematical operations (addition, subtraction, division, multiplication) between the two data sets provided by the combination of Nagamalla and Patnaik: the executed interface elements and the testable interface elements. Subtracting the executed interface elements from the testable interface elements would have yielded a predictable result: a reporting component that determines one or more non-executed elements of the testable interface elements that were not executed by the automated user interface test program, based on the executed interface elements excluding the one or more non-executed elements, using subtraction. (NOTE: testable interface elements minus executed interface elements equals non-executed elements. Because the executed interface elements are used for the determination of the non-executed elements, a PHOSITA would have found it obvious that the non-executed elements are excluded from the determination, since they are the set being determined. Claim 1 requires the determination of the one or more non-executed elements; it is inherent that this set is excluded from any operations used to determine itself. Therefore, the determination of the one or more non-executed elements is based on the executed interface elements excluding the one or more non-executed elements.) The reason for doing so is to provide meaningful data to developers for debugging and testing verification purposes. Determining the one or more non-executed elements provides insight to a developer and helps identify potential issues in the application, such as links or buttons that do not work when clicked during testing.
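The subtraction the examiner proposes reduces to a set difference over the two data sets; a minimal sketch, with hypothetical element names, including the resulting report data:

```python
# Minimal sketch: non-executed elements = testable set minus executed set,
# and the coverage report identifies all three groups. Names are hypothetical.
testable = {"username", "password", "submit", "forgot"}
executed = {"username", "password", "submit"}   # tracked by the heatmap tool

non_executed = testable - executed              # the claimed subtraction

report = {
    "testable": sorted(testable),
    "executed": sorted(executed),
    "non_executed": sorted(non_executed),       # ['forgot'] -> flag for review
    "coverage": f"{len(executed) / len(testable):.0%}",  # '75%'
}
print(report)
```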
It would also have been an obvious design choice among a finite number of solutions to a PHOSITA, before the effective filing date of the claimed invention, to include or not include the determined non-executed elements in Nagamalla's generated test coverage report. Including the determined one or more non-executed elements would have predictably resulted in Nagamalla's reporting component generating test coverage report data for the automated user interface test program that identifies the testable interface elements, the executed interface elements, and the one or more non-executed elements. The reason for including the determined one or more non-executed elements in the generated test coverage report is to easily visualize the number of non-executed elements in contrast to the number of testable interface elements and the number of executed interface elements, providing meaningful data to developers for debugging and testing verification purposes.

Regarding claim 2, depending on claim 1, the combination of Nagamalla and Patnaik teaches: The system of claim 1, wherein the heatmap component employs the heatmap generation tool to generate heatmap data for the executed interface elements, and wherein the test coverage report data comprises the heatmap data. (NOTE: the heatmap data are user behavior data such as clicks, or "steps" as described in Nagamalla ¶134, tracked by Nagamalla's user behavior tracking system 143 during testing.) (Nagamalla: ¶74, ". . . provide HTML reporter to generate steps HTML report. . ."; Nagamalla: ¶134, "A step may be an action (like a click, typing, mouse over etc.) or an assert (assert element presence, attribute presence or text contains, etc.)")

Regarding claim 3, depending on claim 2, the combination of Nagamalla and Patnaik teaches: The system of claim 2, wherein the computer-executable components further comprise: a rendering component (Nagamalla: ¶80, user interface module 320) that renders the test coverage report data via a display device (Nagamalla: ¶80, ". . . the user interface module 320 may be used to display the results of the tests performed. . .").

Regarding claim 7, depending on claim 1, the combination of Nagamalla and Patnaik teaches: The system of claim 1, wherein the extraction component determines the total amount of testable user interface elements using a web-scraping tool (Patnaik: col 4 lines 57-64, ". . . a web scraper 204. . . that scrapes an application to be tested to identify elements of the application. . . then stores, in. . . an application element repository 206. . . an indication of the elements of the application identified from the scraping. . ."; Patnaik: col 2 lines 14-15, ". . . The scraping may enable all elements of the application to be identified. . ."; NOTE: As discussed in the rejection of claim 1, Patnaik's web scraper 204 is capable of determining testable user interface elements (visual elements that are utilized for receiving user input), which are elements of the application. Patnaik also discloses identifying all elements of the application (Patnaik: col 2 lines 21-29, "For example, the elements of the application may be visual elements. . . that are utilized for receiving user input. . ."). Therefore, all testable interface elements are determined.) It would have been obvious to a PHOSITA to implement a counter function in the extraction component (web scraper 204, as taught by Patnaik) to determine the total amount of testable user interface elements. A PHOSITA would immediately understand that a counter implementation is a common and routine practice in programming to track events.
Regarding claim 8, depending on claim 1, the combination of Nagamalla and Patnaik teaches: The system of claim 1, wherein the computer-executable components further comprise: a testing component (Nagamalla: ¶64, declarative testing tool 140) that controls the execution of the automated user interface test program (Nagamalla: ¶64, the declarative testing tool 140 is used to test a system under test. . . may include an automated test software user interface 250. . . and a WebDriver framework 270) on the user interface-based application (Nagamalla: ¶64, site. . . system under test. . . publication system. . . payment system. . .) (Nagamalla: ¶64, "FIG. 2 illustrates a block diagram of a system for declarative testing according to an example embodiment. In one embodiment, the system 200 and, in particular, the declarative testing tool 140 may represent the automated testing system 123 of FIG. 1 (or a portion of the automated testing system 123). In further embodiments, the declarative testing tool 140 may be located completely or partially outside the networked system 102 (shown in FIG. 1A) or the networked system 102a (shown in FIG. 1B). The declarative testing tool 140 is used to test a system under test 210. The system under test 210 may be one or more components of the publication system(s) 120 or the payment system(s) 122 shown in FIG. 1A or FIG. 1C. In an example embodiment, the declarative testing tool 140 may include an automated test software user interface 250, a generic experience infrastructure (GEI) 260, and a WebDriver framework 270").

Regarding claim 9, depending on claim 8, the combination of Nagamalla and Patnaik teaches: The system of claim 8, wherein the extraction component extracts information identifying testable interface elements (see rejection of claim 1) during the execution (Patnaik: col 2 line 26, ". . . during execution of the application. . .") of the automated user interface test program (Patnaik: col 2 line 46, ". . . testing platform. . .") on the application (Patnaik: col 2 lines 3-30, ". . . web-based application. . .") (Patnaik: col 2 lines 3-30, "FIG. 1 illustrates a method 100 for automating manually written test cases, in accordance with one embodiment. The method 100 may be carried out by any system that includes at least one processor. For example, the method 100 may be carried using the computer system described below with reference to FIG. 5. As shown in operation 102, a web scraper is used to scrape an application to be tested, wherein the scraping identifies elements of the application. The application is any computer code that performs one or more functions. Thus, the code of the application may be scraped to identify the elements of the application. The scraping may enable all elements of the application to be identified. In one embodiment, the application may be a web-based application. To this end, the application may include one or more web pages. In this embodiment, the web pages of the application may be scraped for the elements of the application. The elements of the application may be any components, building blocks, etc. of the application. For example, the elements of the application may be visual elements of the application, such as input fields of the application (e.g. included in the web pages) that are utilized for receiving user input during execution of the application, tables output by the application, etc. As another example, the elements of the application may include data structures created and used during execution of the application").

It would have been obvious to a person having ordinary skill in the art (PHOSITA), before the effective filing date of the claimed invention, to have the extraction component extract information identifying testable interface elements during the execution of the automated user interface test program on the application. It would have been obvious to a PHOSITA that web applications are highly dynamic. For example, a simulated click on a link during the test may result in a change of a webpage and may also cause user interface elements to change (e.g., position, size, visibility, etc.); a PHOSITA would recognize that extraction of interface elements during the automated testing must be done every time the UI elements change due to the execution of interface elements during testing. The reason for doing so is to scrape the application periodically, after each change made to the application, in order to ensure that the application element repository remains up-to-date with current elements included in the application (Patnaik: col 5 lines 1-7).

Regarding claim 10, depending on claim 9, the combination of Nagamalla and Patnaik teaches: The system of claim 9, wherein the computer-executable components further comprise: a monitoring component (Nagamalla: Fig. 3, GEI 300; note: the GEI 300 has a monitoring sub-module 350) that monitors the execution of the automated user interface program on the application (Nagamalla: ¶79, "In various embodiments, a monitoring module 350 may be configured to monitor the user experiences and flows of a software application (e.g., the system under test 210). An example flow diagram of monitoring a software application is shown in FIG. 6") and detects loading of dynamic interface elements (Nagamalla: ¶68, ". . . The Selenium-WebDriver may integrate the WebDriver API with Selenium 2.0 and may be used to better support dynamic web pages where elements of a page may change without the page itself being reloaded. . ."; Nagamalla: ¶69, "The GEI 260 (with the monitoring module as a sub-module) may leverage the WebDriver framework 270 (Selenium-WebDriver) or some other framework depending on the capabilities needed by the declarative testing tool 140"). (NOTE: As disclosed by Nagamalla, the GEI leverages the WebDriver framework, meaning the two work directly together; a PHOSITA would have found it an obvious design choice to integrate both the GEI and the WebDriver framework as a single monitoring component.)
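The timing point in claims 9-10 (re-extraction while the test executes, because the UI is dynamic) can be illustrated with a small sketch; the page states and actions below are hypothetical, not drawn from either reference:

```python
# Minimal sketch: because web UIs change as the test drives them, the element
# set is re-extracted after every state the test reaches, not once up front.
page_states = {
    "home":  {"search-box", "login-link"},
    "login": {"username", "password", "submit"},
}

def extract(page):
    """Stands in for re-running the scraper against the current DOM."""
    return set(page_states[page])

testable, executed = set(), set()
for page, action in [("home", "login-link"), ("login", "submit")]:
    testable |= extract(page)   # re-extract on every state the test reaches
    executed.add(action)        # the element the test clicks is tracked

print(sorted(testable - executed))  # ['password', 'search-box', 'username']
```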
Regarding claim 11: Method claim 11 is drawn to the method corresponding to the computer-executable extraction component, heatmap component, and generating function of the reporting component claimed in apparatus claim 1. (NOTE: The apparatus of claim 1 covers all the limitations of the broader method claim 11.) The determining step corresponds to the extraction component of claim 1; the executing step corresponds to the heatmap component of claim 1; the generating step corresponds to the generating function of the reporting component of claim 1. Therefore, claim 11 is rejected for the same reasons of obviousness as used for claim 1.

Regarding claims 12 and 16-18: Method claims 12 and 16-18 are drawn to the methods corresponding to the computer-executable components claimed in apparatus claims 2 and 7-9, respectively. Therefore, method claims 12 and 16-18 correspond to the computer-executable components in the apparatus of claims 2 and 7-9, respectively, and are rejected for the same reasons of obviousness as used above.

Regarding claim 14: The limitations of claim 14 correspond to the limitations of the reporting component of claim 1, and claim 14 is rejected for the same reasons of obviousness as used above.

Regarding claim 19: CRM claim 19 is drawn to the CRM corresponding to the method claimed in method claim 11. Therefore, CRM claim 19 corresponds to the method of claim 11 and is rejected for the same reasons of obviousness as used above.

Regarding claim 22: CRM claim 22 is drawn to the CRM corresponding to the method claimed in method claim 14. Therefore, CRM claim 22 corresponds to the method of claim 14 and is rejected for the same reasons of obviousness as used above.

Regarding claim 23: CRM claim 23 is drawn to the CRM corresponding to the configuration of the extraction component claimed in the apparatus of claim 7. Therefore, CRM claim 23 corresponds to the configuration of the extraction component claimed in the apparatus of claim 7 and is rejected for the same reasons of obviousness as used above.

Claims 4, 13, and 20-21 are rejected under 35 U.S.C. 103 as being unpatentable over Nagamalla in view of Patnaik, further in view of Cordasco (US 20130091417 A1, hereinafter "Cordasco").

Regarding claim 4, depending on claim 3: The combination of Nagamalla and Patnaik teaches the system of claim 3, wherein the heatmap data comprises graphical heatmap elements for the executed interface elements. However, the combination of Nagamalla and Patnaik fails to teach the remaining limitation, which the analogous art Cordasco teaches. Cordasco teaches: wherein the rendering component (Cordasco: ¶28, proxy object generator 190) displays, via a display device, the user interface and the graphical heatmap elements (Cordasco: ¶23, usage clusters 150) at positions relative to the executed interface elements as included in the user interface (Cordasco: Fig. 2, ¶23, "FIG. 2 depicts an example of an overlay 124 displayed over webpage 100. In one example, a heatmap is displayed within overlay 124. Usage clusters 150 represent places on webpage 100 where users performed mouse clicks, selections, entered data, etc. In one example, usage clusters 150 become darker as more events are detected in the same general area"; ¶28, "Analytics obtained from previously monitored web sessions may be generated and displayed as usage clusters 150 within a heatmap displayed over webpage 100. A proxy object generator 190 may generate an overlay 124 that contains the heatmap or may display overlay 124 over the heatmap. To enable operator interaction with webpage objects 128, proxy object generator 190 generates proxy objects 204 on Z-index layers above webpage 100 and overlay 124. Proxy objects 204 may be configured to act as proxies for detecting and reacting to events associated with underlying webpage objects 128. For example, proxy objects 204 can be configured to detect mouseovers and initiate actions, such as displaying pop-up windows associated with underlying webpage objects 128. Thus, proxy objects 204 convert previously static heatmaps into interactive overlays that can now detect and respond to operator initiated events"; ¶6, "The usage clusters may be aligned with associated webpage objects within the underlying webpage").
It would have been obvious to a person having ordinary skill in the art (PHOSITA), before the effective filing date of the claimed invention, to combine Nagamalla, Patnaik, and Cordasco and implement Cordasco's teachings, wherein the rendering component displays, via a display device, the user interface and the graphical heatmap elements at positions relative to the executed interface elements as included in the user interface, to "allow a user to visually associate the information provided by the usage clusters with associated underlying webpage objects" (Cordasco: ¶6).

Regarding claim 13, depending on claim 12, the combination of Nagamalla, Patnaik, and Cordasco teaches: The method of claim 12, wherein the heatmap data comprises graphical heatmap elements for the executed interface elements, and wherein the method further comprises: embedding, by the system, the graphical heatmap elements within the user interface-based application at positions relative to the executed interface elements (NOTE: this limitation is drawn to the method corresponding to the computer-executable components claimed in apparatus claim 4; therefore, it corresponds to the computer-executable components in the apparatus of claim 4 and is rejected for the same reasons of obviousness as used above), resulting in a marked-up version of the user interface; and rendering, by the system, the marked-up version via a display (Cordasco: Fig. 2, heatmap displayed within the overlay) (Cordasco: ¶23, "FIG. 2 depicts an example of an overlay 124 displayed over webpage 100. In one example, a heatmap is displayed within overlay 124. Usage clusters 150 represent places on webpage 100 where users performed mouse clicks, selections, entered data, etc. In one example, usage clusters 150 become darker as more events are detected in the same general area. Displaying overlay 124 over webpage 100 and displaying a heatmap within overlay 124 is described in co-pending U.S. patent application Ser. Nos. 13/401,725 and 12/750,607 which have both been incorporated by reference in their entirety"). It would have been obvious to a PHOSITA, before the effective filing date of the claimed invention, to combine Nagamalla, Patnaik, and Cordasco and implement Cordasco's teachings of embedding, by the system, graphical heatmap elements within the user interface-based application at positions relative to the executed interface elements, resulting in a marked-up version of the user interface, and rendering, by the system, the marked-up version via a display, to "allow a user to visually associate the information provided by the usage clusters with associated underlying webpage objects" (Cordasco: ¶6).
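As an illustration of the Cordasco-style rendering step, the sketch below attaches a graphical mark at the position of each executed element, producing a marked-up copy of the UI. Element names and click counts are hypothetical, and a real overlay per Cordasco would render usage clusters on a layer above the page rather than as inline styles:

```python
# Minimal sketch: embed heatmap marks at the positions of executed elements,
# yielding a marked-up version of the user interface.
from bs4 import BeautifulSoup

html = '<button id="submit">Sign in</button><a id="forgot">Forgot?</a>'
heatmap_data = {"submit": 42}   # click counts from the tracking tool (hypothetical)

soup = BeautifulSoup(html, "html.parser")
for tag in soup.find_all(attrs={"id": True}):
    hits = heatmap_data.get(tag["id"])
    if hits:  # mark is embedded at the element's own position in the markup
        tag["style"] = "outline: 3px solid rgba(255,0,0,.6)"
        tag["title"] = f"{hits} simulated clicks"

print(soup)  # the marked-up version of the user interface
```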
Regarding claim 20, depending on claim 19, the combination of Nagamalla, Patnaik, and Cordasco teaches: The non-transitory machine-readable storage medium of claim 19, wherein the operations further comprise: generating heatmap data for the executed interface elements using the heatmap generation tool (NOTE: this limitation is drawn to the CRM corresponding to the computer-executable components claimed in apparatus claim 2; therefore, it corresponds to the computer-executable components in the apparatus of claim 2 and is rejected for the same reasons of obviousness as used above); wherein the heatmap data comprises graphical heatmap elements for the executed interface elements; and embedding the graphical heatmap elements within the user interface at positions relative to the interface elements, resulting in a marked-up version of the user interface (NOTE: this limitation is drawn to the CRM corresponding to the method claimed in claim 13; therefore, it corresponds to method claim 13 and is rejected for the same reasons of obviousness as used above).

Regarding claim 21, depending on claim 20, the combination of Nagamalla, Patnaik, and Cordasco teaches: The non-transitory machine-readable storage medium of claim 20, wherein the operations further comprise: rendering the marked-up version via a display (NOTE: this limitation is drawn to the CRM corresponding to the method step "rendering, by the system, the marked-up version via a display" claimed in method claim 13; therefore, it corresponds to that step of claim 13 and is rejected for the same reasons of obviousness as used above).

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to PATRICK GALERA, whose telephone number is (571) 272-5070. The examiner can normally be reached Mon-Fri 0800-1700 ET.

Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, King Poon, can be reached at 571-270-0728. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/PATRICK P GALERA/
Examiner, Art Unit 2617

/KING Y POON/
Supervisory Patent Examiner, Art Unit 2617

Prosecution Timeline

Jun 06, 2023: Application Filed
Apr 21, 2025: Non-Final Rejection (§103)
Jul 01, 2025: Examiner Interview Summary
Jul 01, 2025: Applicant Interview (Telephonic)
Jul 14, 2025: Response Filed
Sep 30, 2025: Final Rejection (§103)
Nov 20, 2025: Examiner Interview Summary
Nov 20, 2025: Applicant Interview (Telephonic)
Dec 03, 2025: Response after Non-Final Action
Dec 15, 2025: Request for Continued Examination
Jan 13, 2026: Response after Non-Final Action
Feb 06, 2026: Non-Final Rejection (§103) (current)

Precedent Cases

Applications granted by the same examiner involving similar technology

Patent 12602567: SYSTEM AND METHOD FOR RENDERING A VIRTUAL MODEL-BASED INTERACTION (granted Apr 14, 2026; 2y 5m to grant)
Patent 12597184: IMAGE PROCESSING METHOD AND APPARATUS, DEVICE AND READABLE STORAGE MEDIUM (granted Apr 07, 2026; 2y 5m to grant)
Patent 12586549: Image conversion apparatus and method having timing reconstruction mechanism (granted Mar 24, 2026; 2y 5m to grant)
Patent 12579921: ELECTRONIC DEVICE HAVING FLEXIBLE DISPLAY AND METHOD FOR CONTROLLING THE SAME (granted Mar 17, 2026; 2y 5m to grant)
Patent 12491085: SYSTEMS AND METHODS FOR ORTHOPEDIC IMPLANT FIXATION (granted Dec 09, 2025; 2y 5m to grant)
Based on this examiner's 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 86% (99% with interview, a +16.7% lift)
Median Time to Grant: 2y 5m
PTA Risk: High

Based on 7 resolved cases by this examiner. Grant probability derived from career allow rate.
