Prosecution Insights
Last updated: April 19, 2026
Application No. 17/030,149

SYSTEM AND METHOD FOR ENABLING EXTRACT TRANSFORM AND LOAD PROCESSES IN A BUSINESS INTELLIGENCE SERVER

Final Rejection (§101, §103)
Filed: Sep 23, 2020
Examiner: WILLIS, AMANDA LYNN
Art Unit: 2156
Tech Center: 2100 — Computer Architecture & Software
Assignee: Oracle International Corporation
OA Round: 6 (Final)
Grant Probability: 36% (At Risk)
OA Rounds: 7-8
To Grant: 4y 8m
With Interview: 62%

Examiner Intelligence

Career Allow Rate: 36% (123 granted / 345 resolved; -19.3% vs TC avg)
Interview Lift: +26.6% (strong lift on resolved cases with an interview)
Avg Prosecution: 4y 8m (typical timeline; 25 currently pending)
Total Applications: 370 (career history, across all art units)

Statute-Specific Performance

§101: 14.0% (-26.0% vs TC avg)
§103: 44.8% (+4.8% vs TC avg)
§102: 13.1% (-26.9% vs TC avg)
§112: 21.5% (-18.5% vs TC avg)
Black line = Tech Center average estimate • Based on career data from 345 resolved cases
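The dashboard's figures can be cross-checked against each other: each statute-level rate minus its signed delta should recover the same Tech Center average, and the headline allowance rate should follow from the granted/resolved counts. A quick sketch (variable names are illustrative; all numbers are taken from the report above):

```python
# Sanity-check the dashboard figures. All numbers come directly
# from the report; names are illustrative.

career_granted, career_resolved = 123, 345
career_allow_rate = round(100 * career_granted / career_resolved)  # 36

# (examiner allowance rate %, delta vs Tech Center average %)
statute_stats = {
    "101": (14.0, -26.0),
    "103": (44.8, +4.8),
    "102": (13.1, -26.9),
    "112": (21.5, -18.5),
}

# Every statute pair implies the same Tech Center average: 40.0%.
implied_tc_avg = {s: round(rate - delta, 1)
                  for s, (rate, delta) in statute_stats.items()}
```

All four implied averages come out to 40.0%, which is consistent with the deltas being measured against a single Tech Center baseline.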

Office Action

§101 §103
DETAILED ACTION

Receipt of Applicant’s Amendment, filed January 19, 2026, is acknowledged. Claims 1-13, 15-26, 28-33, 36, 37, 40, 41, 44, 45, 49, 51 were cancelled. Claims 52-59 were added. Claims 14, 27, 34, 35, 38, 39, 42, 43, 46-48, 50, 52-59 are pending in this office action.

Notice of Pre-AIA or AIA Status

The present application is being examined under the pre-AIA first to invent provisions.

Claim Interpretation

To facilitate identifying specific limitations for discussion, the following claim labels will be referenced herein:

14. A method for performing an extract, transform and load (ETL) process, the method comprising: (a) providing a business intelligence (BI) server executing on one or more microprocessors, wherein the BI server connects source tables on a source system to target tables on a target system; (b) providing a plurality of objects on the BI server, that are used by the extract, transform and load (ETL) process to extract, transform, and load data from the source system to the target system, wherein the plurality of objects comprises: (b.1) a plurality of transparent view objects, wherein each transparent view object of the plurality of transparent view objects, comprises metadata that represents a data shape of a joined set of the source tables using a transformation, and (b.2) an ETL mapping association object that maps the data shape of the joined set of source tables provided by one or more of the transparent view objects to one or more of the target tables; (c) providing a user interface to said BI server that displays labels associated with the plurality of objects, including a set of source and target links associated with the transparent view objects, wherein the user interface supports a number of ETL mapping types, each ETL mapping type exposing a subset of data manipulation language options, the user interface constructed dynamically, and providing object selector edit boxes and browse buttons to associate the transparent view objects with a staging table, and to specify column mappings; (d) receiving a user input via the user interface, the user input specifying that the one or more of the transparent view objects are to be used to project columns of data onto the target tables as part of the ETL process; (e) reading, with a code generator, the transparent view objects specified by the user input, and the ETL mapping association object, and generating, by the code generator, one or more ETL scripts that perform the ETL process; and (f) performing the ETL process, including executing the one or more ETL scripts to move a set of data from the source system to the target system, based on the transparent view objects and the ETL mapping association object, by: (f.1) extracting the data from the source tables of the source system; (f.2) transforming the extracted data; and (f.3) loading the transformed data into the target tables of the target system.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows: Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 14, 27, 34, 35, 38, 39, 42, 43, 46-48, 50, 52-59 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more. The following sections follow the 2019 PEG guidelines for analyzing subject matter eligibility.

Step 1 Statutory Category: Claims 14, 35, 38, 47-50, 58 are directed to a process. Claims 27, 39, 42, and 52-59 are directed to a machine. Claims 34, 43, 46, and 55-57 are directed to a manufacture or a composition of matter.

Step 2A Prong 1 Judicial exception: The following claim limitations (sans the limitations struck through) have been identified as reciting abstract ideas, as mental processes. 
The struck through limitations have been identified as additional elements and will be discussed in later sections. Section identifiers have been added to the claims to facilitate discussion. The same identifier is added for parallel limitations in each claim recited. Claims 27 and 34 appear to recite substantially similar limitations as claim 14; the analysis and rationale provided below for claim 14 also apply to claims 27 and 34; duplicate rejections have been omitted for brevity.

14. A method for performing an extract, transform and load (ETL) process, the method comprising: (b) providing a plurality of objects on the BI server, that are used by the extract, transform and load (ETL) process to extract, transform, and load data from the source system to the target system, wherein the plurality of objects comprises: (b.1) a plurality of transparent view objects, wherein each transparent view object of the plurality of transparent view objects, comprises metadata that represents a data shape of a joined set of the source tables using a transformation, and (b.2) an ETL mapping association object that maps the data shape of the joined set of source tables provided by one or more of the transparent view objects to one or more of the target tables; (c) (d) (e) (f) (f.1) (f.2) (f.3)

With regard to limitation (b), this claim limitation involves the user inputting data (e.g. the plurality of metadata objects). What these objects are used for (e.g. the ETL process) is irrelevant to the computing device at this point, as this is merely data being input into the system by the user. This means that the data itself is generated by the human being using the device, and as such the generation of the data is a mental process being performed by the human operating the system. The metadata objects are ‘provided’ by a human being thinking about something and mentally determining an answer. 
The claim limitation appears to be directed to the process by which the human being contemplates what the ‘metadata objects’ are. This interpretation of this claim limitation is supported by recitations in the specification which describe the metadata being provided by user interaction with the computing device: Paragraph [0025] "An administration tool on the BI server allows a user to interact with the BI server, and manage the extension process of the underlying data warehouse through metadata." Paragraph [0031] "The project metadata allows the user to group together and execute a collection of data flows." Paragraph [0032] "In accordance with an embodiment, the project metadata includes set definition that is driven by facts selected by the users for extract. Using the set definition, the system can analyze dependencies and pull in related artifacts that need to participate in the ETL processes, such as base facts, dimensions. The system can exclude and/or include additional target artifacts, and allows a user to persist and maintain a customized set."

With regard to limitation (b.1), this claim limitation details the information that the user is inputting. This involves mental determinations being made by the user to select and formulate the specific view objects, and data that the user wishes to retrieve. As demonstrated above, the computer system is not generating any of this data. This data is provided to the computer system by the human user entering it into the system, meaning that the human user is the one manually generating the view objects which will represent the data shapes. This is supported by Paragraph [0040] of the instant specification which recites: "In accordance with an embodiment, the BI server allows users to span a TV object across multiple databases and tables, so that users can progressively build the data shape by nesting objects within each other. A nested TV object can be joined with other physical layer objects, such as source tables".

With regard to limitation (b.2), this limitation is again detailing information that is input by the user. This is detailed in Paragraph [0042] of the original specification which recites: "In other embodiments, the user can define multiple ETL mappings in a single EMA object." The human user is the one performing the mental evaluations to generate the mappings that are to be used. The human user is manually providing the intellect.

With regard to limitation (d), this limitation is again detailing actions of the human user of the system. The human user is the one, as explicitly recited in the claim language itself, who is providing input (e.g. the operations, the selecting, manipulations, and transformations). The specific data being received (e.g. join operations which represent a data shape) does not alleviate the fact that the intelligence is being provided by the human user merely entering it into the computing system. This interpretation is supported by the following paragraphs from the original specification: Paragraph [0039], which recites "Users are able to specify operations such as: joins, expression based derived columns, and filters. In one example, the TV object can be implemented in a similar manner to Logical Table Sources (LTS), which allows an administrator to create a logical table by transforming one or more physical tables from one or more sources"; and Paragraph [0040], which recites "In accordance with an embodiment, the BI server allows users to span a TV object across multiple databases and tables, so that users can progressively build the data shape by nesting objects within each other. A nested TV object can be joined with other physical layer objects, such as source tables".

Please note that within all of the above limitations, the system is merely receiving input from the user. The user is providing the intelligence. 
The user enters specifically detailed instructions for the system to operate. The above claim limitations are literally claiming a human user creating computer coding. Specifically: XML and DML (Paragraph [0054] of the original specification: "In an embodiment, each ETL mapping type can be defined via XML declarations. Additionally, the BI server can support a set of data manipulation language (DML) options, with each ETL mapping type exposing a subset of the DML options."); and DML transformation logic (Paragraph [0060] of the original specification recites "Additionally, the EMA object can include one or more data manipulation language (DML) options (Lines 22-26) that allow the user to configure the data transformation logic.").

The act of a human generating code is a mental process. As the Federal Circuit explained, "methods which can be performed mentally, or which are the equivalent of human mental work, are unpatentable abstract ideas, the ‘basic tools of scientific and technological work’ that are open to all." CyberSource, 654 F.3d at 1371, 99 USPQ2d at 1694 (citing Gottschalk v. Benson, 409 U.S. 63, 175 USPQ 673 (1972)). Nor do the courts distinguish between claims that recite mental processes performed by humans and claims that recite mental processes performed on a computer. As the Federal Circuit has explained, "[c]ourts have examined claims that required the use of a computer and still found that the underlying, patent-ineligible invention could be performed via pen and paper or in a person’s mind." Versata Dev. Group v. SAP Am., Inc., 793 F.3d 1306, 1335, 115 USPQ2d 1681, 1702 (Fed. Cir. 2015). Within the instant claims, the computing system is merely serving as a tool to receive the human intelligence (e.g. provided code).

Dependent Limitations (Step 2A Prong 1): With regard to the dependent claims, these limitations have been analyzed both individually and as part of the ordered combination. 
Unless explicitly stated otherwise, the limitations within the dependent claims did not appear to substantially change the functionality of the device as a whole beyond what has already been discussed. For sake of brevity, the discussion of the ordered combination in view of the dependent limitations has not been restated.

With regard to claims 35, 39, and 43, which recite "wherein the BI server can associate an entity in a target data warehouse with data objects from multiple data sources, by extracting data from the data sources into a staging area, …".

With regard to claims 38, 42, and 46, which recite "wherein a first transparent view object, of the plurality of transparent view objects, spans multiple databases and tables, wherein users progressively build the data shape by nesting objects within each other."

With regard to claims 47, 52 and 55, which recite "wherein the transparent view objects are defined in the context of one or more physical tables."

With regard to claims 48, 53, and 56, which recite "wherein the ETL mapping association object provides one or more ETL mappings between one or more of the transparent view objects and one or more physical tables."

With regard to claims 50, 54, and 57, which recite "wherein the ETL mapping association object provides a many-to-many ETL mapping relationship between the plurality of transparent view objects and a plurality of physical tables".

All of these claim limitations appear to detail what the user is expected to provide to the claimed system. These limitations describe the results generated by the human intelligence (e.g. the ETL mapping association object) which the claimed system merely receives. This limitation does not place any restriction on the claimed device, but instead places a restriction on the specific data that the user is inputting into the device. Savage details the steps that the software developer is expected to manually perform when coding an ETL process: "This step typically involves "data cleansing" or "data staging" processes which are labor intensive and tedious in a data warehousing project. This transformation involves the major steps of reverse engineering the source data, its form, and its database structures to determine its metadata, adapting the source data to fit the data structure of the EDM application, and loading the transformed source data to the EDM target database. In varying degrees, this requires the EDM application developer to: (i) analyze the source databases to identify the type and form of the source data and the structure and operational system of the source databases, to determine their relevance to the EDM target database; (ii) load the source data from the legacy databases to an intermediate database using extract, transform, and load (ETL) software tools or hand-coded programs; (iii) massage, cleanse, and manipulate ("transform") the source data in the intermediate database into the form needed for the EDM application; and (iv) load the transformed source data into the EDM application, and format it in those schemas necessary for OLAP or other EDM applications. In the prior art, these steps are performed manually, by a team of developers, including those of ordinary skill in the art of software programming." (Savage, Column 1, line 66 – Column 2, line 22).

As such, these claim limitations have been identified as further defining the abstract idea, detailing some of the contemplations that the human user must perform to enter the necessary metadata objects and ETL mappings that are generated as a result of the mental process into the claimed computer system.

Step 2A Prong 2 Integration into a practical application

The following claim limitations have been identified as additional elements. Section identifiers are consistent with previously presented identifiers.

14. 
A method for performing an extract, transform and load (ETL) process, the method comprising: (a) providing a business intelligence (BI) server executing on one or more microprocessors, wherein the BI server connects source tables on a source system to target tables on a target system; (b) providing a plurality of metadata objects on the BI server,…, wherein said plurality of metadata objects comprises: (b.1) …, and (b.2) …; (c) providing a user interface to said BI server that displays labels associated with the plurality of objects, including a set of source and target links associated with the transparent view objects, wherein the user interface supports… the user interface constructed dynamically, and providing object selector edit boxes and browse buttons; (d) receiving a user input via the user interface, the user input specifying operations on the transparent view objects; (e) reading, with a code generator, the transparent view objects specified by the user input, and the ETL mapping association object, and generating, by the code generator, one or more ETL scripts that perform the ETL process; and (f) performing the ETL process, including executing the one or more ETL scripts to move a set of data from the source system to the target system, based on the transparent view objects and the ETL mapping association object, by: (f.1) extracting the data from the source tables of the source system; (f.2) transforming the extracted data; and (f.3) loading the transformed data into the target tables of the target system.

With regard to limitation (a), the "BI server" of limitation (b), the preamble of claims 27 and 34, the additional description of the BI server in claims 27 and 34, as well as the recitation of the computer and microprocessors in claim 27: these claim limitations appear to recite the use of a generic computing device used to execute the claimed functionality (MPEP 2106.05(f)). 
Paragraph [0066] of the original specification recites "The present invention may be conveniently implemented using a conventional general purpose or a specialized digital computer or microprocessor programmed according to the teachings of the present disclosure". Paragraph [0020] of the original specification recites "Business intelligence (BI) applications running on top of the data warehouse can provide powerful tools to the users for managing and operating their business." One of ordinary skill in the art would recognize the BI server as a generic computing device storing business related data. Within these limitations, the recited generic computing devices appear to merely receive input from the user, and as such are merely functioning as a tool for the human user. Using a computer as a tool does not integrate the abstract idea into a practical application.

With regard to limitations (c) and the "receiving a user input via the user interface" of limitation (d), these limitations appear to recite generic computing devices which are being used to perform insignificant extra-solution activity of receiving data. The user interface itself is a generic computing device (MPEP 2106.05(f)) being used to present and retrieve data (MPEP 2106.05(g)). The specific data being presented is merely an intended use of the user interface to convey a message to the human user. The human user then uses the interface as a tool to enter data. The details of the browser appear to be basic browser functionality. This is similar to the activities courts have found to be insignificant extra-solution activity. Specifically: iii. Presenting offers to potential customers and gathering statistics generated based on the testing about how potential customers responded to the offers; the statistics are then used to calculate an optimized price, OIP Technologies, 788 F.3d at 1363, 115 USPQ2d at 1092-93. 
With regard to limitations (e), (f) and (f.1) – (f.3), these limitations recite the use of a code generator to generate ETL scripts. One of ordinary skill in the art would recognize such code generators as generic computing devices which are being used to automate a manual process. Limitations (f.1) through (f.3) have been identified as the standard process for ETL, and as such are identified as the use of generic computing devices described at a high level of generality. It is noted that the acronym for the generic device stands for Extract, Transform and Load, which are what limitations f.1 through f.3 are claiming at a high level of generality. The use of a code generator to create the ETL script is also a generic computing device. This generic computing device is merely taking the user provided input and using that to generate the script. It is noted that such ETL scripts are merely retrieving data, using code such as the user entered "metadata objects", transforming the data using code such as the "ETL mappings", and then loading the resulting transformation into a user specified destination. All the intelligence for the script is provided by the user, and the code generator is merely using the user-provided intelligence to provide scripts to perform the user specified actions. One of ordinary skill in the art would identify this as using a generic computing device to perform an action after said action has been determined by the user. This is similar to the court recognized insignificant extra-solution activity: "i. Cutting hair after first determining the hair style, In re Brown, 645 Fed. App'x 1014, 1016-1017 (Fed. Cir. 2016) (non-precedential);" As such, the devices themselves (e.g. the code generator and the ETL script) have been identified as generic computing devices, and the execution of the ETL script has been identified as insignificant extra-solution activity. 
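As an illustration of the kind of automation being characterized here as generic, the following is a minimal sketch of a code generator that turns user-supplied mapping metadata into an ETL script. The table and column names are hypothetical; this is not the applicant's or Savage's actual tooling, only an example of the pattern in which the user supplies the mapping and the generator mechanically emits the script:

```python
# Hypothetical sketch: a "code generator" that turns user-supplied
# mapping metadata into a runnable ETL statement. All names are
# illustrative; the generator contributes no mapping logic of its own.

def generate_etl_sql(source_table, target_table, column_map):
    """Emit an INSERT ... SELECT statement from a column mapping.

    column_map: {target_column: source_expression}, supplied by the user.
    """
    targets = ", ".join(column_map.keys())
    sources = ", ".join(column_map.values())
    return (f"INSERT INTO {target_table} ({targets}) "
            f"SELECT {sources} FROM {source_table};")

# The user specifies the mapping; the generator only assembles the script.
script = generate_etl_sql(
    "src.orders", "dw.fact_orders",
    {"order_id": "id", "total_usd": "amount * fx_rate"})
```

The division of labor in this sketch mirrors the examiner's characterization: the mapping dictionary carries all of the "intelligence", while the generator performs purely mechanical string assembly.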
Ordered combination (Step 2A Prong 2): When taken as an ordered combination the claim appears to recite using a generic computing device to receive human generated code, which is then executed to achieve the expected results of what the human code was intended to do. Within the claimed system, all of the intelligence of what the system is actually doing (e.g. the identification of what data to retrieve via the transparent view object, and the mappings) are provided by the user. The computing system appears to merely be used as a tool to facilitate the human user managing and transmitting data within standard applications. It is noted that one of ordinary skill in the art would identify such software tools as being available (Savage [6604110], Column 2, lines 20-28 "In the prior art, these steps are performed manually, by a team of developers, including those of ordinary skill in the art of software programming. While there are now ETL software tools that automate the process of extracting and transforming data from the source database to the EDM application, these ETL Tools recognize and rely on the source metadata (the source data structure) available from the source database management system (DBMS) to generate the EDM application code.") When viewed as an ordered combination the claims as a whole do not appear to integrate the mental process into a practical application. 
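The (f.1)–(f.3) steps the analysis repeatedly treats as the standard ETL process reduce to a three-step pattern. A minimal in-memory sketch, using plain Python lists as stand-in "tables" (all table and column names are illustrative, not taken from the claims or references):

```python
# Minimal illustration of the claimed (f.1)-(f.3) steps using plain
# Python lists as stand-in "tables". Names are illustrative.

source_table = [{"id": 1, "amount": 10.0}, {"id": 2, "amount": 2.5}]
target_table = []

extracted = list(source_table)                       # (f.1) extract
transformed = [{"order_id": row["id"],               # (f.2) transform
                "total_usd": round(row["amount"] * 1.1, 2)}
               for row in extracted]
target_table.extend(transformed)                     # (f.3) load
```

Each step here is generic in exactly the sense the analysis describes: the structure of the transformation (which columns, which expression) is supplied from outside, and the code merely applies it.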
Dependent limitations (Step 2A Prong 2): With regard to claims 35, 39, and 43, which recite "wherein the BI server can associate an entity in a target data warehouse with data objects from multiple data sources, by extracting data from the data sources into a staging area, …" With regard to claims 58 and 59, claim 58 recites "wherein the BI server operates to allow BI applications to use high-level analytical queries to scan and analyze large volumes of data in a data warehouse, wherein the data warehouse is sourced from multiple data source systems associated with the BI applications, wherein the BI server operates to associate an entity in a target data warehouse with data objects from multiple data sources, by extracting data from the multiple data sources into a single staging area, where data conformance is performed before conformed data is loaded into the target data warehouse." These claim limitations appear to recite the underlying generic computer devices which the human is using to automate the manual process. As stated above with regard to limitation (a), these claim limitations do not appear to integrate the abstract idea into a practical application. Please note that one of ordinary skill in the art would recognize analytical queries and ETL processes as generic computer components, which as recited appear to be recited in a generic manner and operating in their ordinary capacity.

Step 2B Significantly more

Within the Step 2A Prong 2 analysis specific claim limitations were identified as additional elements. This section will analyze these previously identified additional elements with regard to the Significantly more analysis. With regard to limitation (a) and the "BI server" of limitation (b), these claim limitations appear to recite the use of a generic computing device used to execute the claimed functionality (MPEP 2106.05(f)). Using a computer as a tool does not amount to significantly more than the abstract idea itself. 
With regard to limitations (c) and the "receiving a user input via the user interface… the user interface constructed dynamically, and providing object selector edit boxes and browse buttons" of limitation (d), these limitations appear to recite generic computing devices which are being used to perform insignificant extra-solution activity of receiving data. The user interface itself is a generic computing device (MPEP 2106.05(f)) being used to present and retrieve data (MPEP 2106.05(g)). These claim limitations appear substantially similar to the court recognized well-understood, routine, and conventional activities (MPEP 2106.05(d)(II)): Receiving or transmitting data over a network, e.g., using the Internet to gather data, Symantec, 838 F.3d at 1321, 120 USPQ2d at 1362 (utilizing an intermediary computer to forward information); TLI Communications LLC v. AV Auto. LLC, 823 F.3d 607, 610, 118 USPQ2d 1744, 1745 (Fed. Cir. 2016) (using a telephone for image transmission); OIP Techs., Inc. v. Amazon.com, Inc., 788 F.3d 1359, 1363, 115 USPQ2d 1090, 1093 (Fed. Cir. 2015) (sending messages over a network); buySAFE, Inc. v. Google, Inc., 765 F.3d 1350, 1355, 112 USPQ2d 1093, 1096 (Fed. Cir. 2014) (computer receives and sends information over a network); but see DDR Holdings, LLC v. Hotels.com, L.P., 773 F.3d 1245, 1258, 113 USPQ2d 1097, 1106 (Fed. Cir. 2014) ("Unlike the claims in Ultramercial, the claims at issue here specify how interactions with the Internet are manipulated to yield a desired result‐‐a result that overrides the routine and conventional sequence of events ordinarily triggered by the click of a hyperlink." (emphasis added)); vi. A Web browser’s back and forward button functionality, Internet Patent Corp. v. Active Network, Inc., 790 F.3d 1343, 1348, 115 USPQ2d 1414, 1418 (Fed. Cir. 2015). As such, these claim limitations do not appear to amount to significantly more than the abstract idea itself. 
With regard to limitations (e), (f) and (f.1) – (f.3), these limitations recite the use of a code generator to generate ETL scripts. One of ordinary skill in the art would recognize such code generators as generic computing devices which are being used to automate a manual process. Limitations (f.1) through (f.3) have been identified as the standard process for ETL, and as such are identified as the use of generic computing devices described at a high level of generality. The use of a code generator to create the ETL script is also a generic computing device, and is well known in the art as evidenced by Savage. Savage, Column 2, lines 23-28: "While there are now ETL software tools that automate the process of extracting and transforming data from the source database to the EDM application, these ETL Tools recognize and rely on the source metadata (the source data structure) available from the source database management system (DBMS) to generate the EDM application code." Similar to the ETL software tool Savage discusses, the instant device is provided the source metadata (e.g. the user provided metadata objects). Within the instant device, all the intelligence for the script is provided by the user, and the code generator is merely using the user-provided intelligence to provide scripts to perform the user specified actions. One of ordinary skill in the art would identify this as using a generic computing device to perform an action after said action has been determined by the user. As such, these limitations do not appear to amount to significantly more than the abstract idea itself.

Ordered combination (Step 2B): When taken as an ordered combination the claim appears to recite using a generic computing device to receive human generated code, which is then executed to achieve the expected results of what the human code was intended to do. As stated above, within the claimed system, all of the intelligence of what the system is actually doing (e.g. 
the identification of what data to retrieve via the transparent view object, and the mappings) are provided by the user. The computing system appears to merely be used as a tool to facilitate the human user managing and transmitting data within standard applications. One of ordinary skill in the art would identify such software tools as being available (Savage [6604110], Column 2, lines 20-28 "In the prior art, these steps are performed manually, by a team of developers, including those of ordinary skill in the art of software programming. While there are now ETL software tools that automate the process of extracting and transforming data from the source database to the EDM application, these ETL Tools recognize and rely on the source metadata (the source data structure) available from the source database management system (DBMS) to generate the EDM application code.") When viewed as an ordered combination the claims as a whole appear to amount to using user provided intelligence to automate a manual process, and as such do not appear to amount to significantly more than the abstract idea itself.

Dependent limitations (Step 2B): With regard to claims 35, 39, and 43, these claim limitations appear to recite the underlying generic computer devices which the human is using to automate the manual process. As stated above with regard to limitations (e), (f), and (f.1) through (f.3), the automation of the manual ETL process is now able to be done using ETL software tools that automate the process, as evidenced by Savage (Column 2, lines 22-28). As such, the recitation of generic computing devices to perform the ETL processes claimed above does not appear to amount to significantly more than the abstract idea itself. Please note that Savage further details the use of these ETL software automation tools specifically to automate the manual staging processes (Savage, Column 1, line 66 - Column 2, line 28). 
As such, these limitations do not appear to amount to significantly more than the abstract idea itself.

101 Abstract Idea Analysis Conclusion

Based on the above rationale, the claims have been deemed to be directed to ineligible subject matter under 35 USC 101.

Claim Rejections - 35 USC § 103

The following is a quotation of pre-AIA 35 U.S.C. 103(a) which forms the basis for all obviousness rejections set forth in this Office action: (a) A patent may not be obtained though the invention is not identically disclosed or described as set forth in section 102, if the differences between the subject matter sought to be patented and the prior art are such that the subject matter as a whole would have been obvious at the time the invention was made to a person having ordinary skill in the art to which said subject matter pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 14, 27, 34, 35, 38, 39, 42, 43, 46-48, 52, 53, 55, 56, 58, 59 are rejected under pre-AIA 35 U.S.C. 103(a) as being unpatentable over Savage [6604110] in view of vonKaenel [2004/0117358]. 
With regard to claim 14 Savage teaches A method (Savage, Column 2, line 49 “One object of the present invention is to provide method and apparatus for acquiring the metadata of a source database in a manner which allows the acquired metadata to be used to generate diverse type EDM applications”) for performing an extract, transform and load (ETL) process (Savage, Column 3, lines 28-31 “The present invention allows users to migrate multiple disparate systems by providing a complete understanding of the metadata and generating the ETL programs to merge the data”), the method comprising: (a) providing a business intelligence (BI) (Savage, Column 1, lines 30-33 “EDM applications include: … business intelligence (BI)”) server (Savage, Column 6, lines 9-10 “The servers 34, 36 each include a known type signal processor 38, 40 and optional viewable display 42, 44”) executing on one or more microprocessors as signal processors 38 and 40 (Id), wherein the BI server connects (Savage, Column 23, lines 13-16 “Once the target schema and associated migration specification have been generated, the target database can be created and the source data migrated to it”; Column 26, lines 9-10 “(v) a specification describing how to migrate (transform) data from one or more sources to a target”) source tables as multiple source tables (Savage, Column 22, lines 12-15 “Migration specifications are often used to produce target data using a process that must combine data from multiple source tables”) on a source system as the source database (Savage, Column 4 lines 59-63 “It analyzes the source database in an elemental sequence that is defined by the logical and structural model of a data repository having defined entity relationships, so as to provide a comprehensive metadata model of the source data.”) to target tables as target tables (Savage, Column 22, lines 43-44 “Following the relevant processing by the ETLTOOL, the job can then be run to create the target table as specified”) on a
target system as the target database (Savage, Column 23 lines 15-16 “the target database can be created and the source data migrated to it”); (b) providing a plurality of objects as the migration specifications (Savage, Column 21, lines 17-19 “Fig. 16, which is a summary entity relationship diagram, illustrates all of the entities and relationships used with the migration specifications”; lines 25-27 “A migration specification is focused upon the concept of defining the procedures required to produce the data values for a single column”) on the BI (Savage, Column 1, lines 30-33 “EDM applications include: … business intelligence (BI)”) server (Savage, Column 6, lines 9-10 “The servers 34, 36 each include a known type signal processor 38, 40 and optional viewable display 42, 44”), that are used by the extract, transform and load (ETL) process (Savage, Column 3, lines 28-31 “The present invention allows users to migrate multiple disparate systems by providing a complete understanding of the metadata and generating the ETL programs to merge the data”; Column 21 line 10-12 “This provides the repository with the ability to produce target tables for many types of extract, transform, and load (ETL) jobs”) to extract as extract (Id), transform as transform (Id), and to load data as load (Id) from the source system as the source database (Savage, Column 4 lines 59-63 “It analyzes the source database in an elemental sequence that is defined by the logical and structural model of a data repository having defined entity relationships, so as to provide a comprehensive metadata model of the source data.”) to the target system as the target database (Savage, Column 23 lines 15-16 “the target database can be created and the source data migrated to it”), wherein the plurality of objects as the migration specifications (Savage, Column 21, lines 17-19 and lines 25-27 “A migration specification is focused upon the concept of defining the procedures required to produce the data values for a
single column”) comprises: (b.1) a plurality of transparent view objects as the set tables in the viewpoint table (Savage, Column 22, line 13-15 “Migration specifications are often used to produce target data using a process that must combine data from multiple source tables”; line 36 “The first table in the join path is referred to as the “viewpoint table””), wherein each transparent view object, of the plurality of transparent view objects, comprises metadata as the possible sequence of joins (Savage, Column 22, lines 21-36 “Each possible sequence of joining source tables is noted as an instance of the JoinPath entity 230 (with primary key 231)… The first table in the join path is referred to as the “viewpoint table””; Please note this claim limitation has been construed in light of Paragraph [0027] which recites “The data transformation logic metadata can specify the data transformations, such as joins between participating entities, expressions etc.,”) that represents a data shape of a joined set (Savage, Column 21, lines 51-54 “the transformation expression defines the manner in which corresponding values of the referenced source columns are processed to produce a single value of the target column”; Savage, Column 22, lines 21-36 “Each possible sequence of joining source tables is noted as an instance of the JoinPath entity 230 (with primary key 231)… The first table in the join path is referred to as the “viewpoint table””) of source tables as the source tables (Id) using a transformation as the transformation expression (Id), and (b.2) an ETL mapping association object as the mapping expressions (Savage, Column 21, lines 5-8 “The migration specification is a complete and detailed collection of mapping expressions (i.e., statements of the functional relationship between a source column and a target column”) that maps as mapping (Id) the data shape of the joined set of source tables as the column of values produced by the transformation expression (Savage, 
Column 21, lines 51-54 “the transformation expression defines the manner in which corresponding values of the referenced source columns are processed to produce a single value of the target column”) provided by one or more of the transparent view objects as the set tables in the viewpoint table (Savage, Column 22, line 13-15 “Migration specifications are often used to produce target data using a process that must combine data from multiple source tables”; line 36 “The first table in the join path is referred to as the “viewpoint table””) to one or more of the target tables as the target column (Savage, Column 21, lines 51-54 “the transformation expression defines the manner in which corresponding values of the referenced source columns are processed to produce a single value of the target column”; For examination purposes this claim limitation has been construed to refer to --the target table--, as per the 112 rejection above.); (c) providing a user interface (Savage, Column 5, lines 53-55 “In the best mode embodiment, the human operator is given access to the method through a graphical user interface (GUI) 24, comprising a known type signal processor 26, a GUI display 28, and an operator pointer/command entry device 30 (e.g., keyboard, mouse)”) to said BI (Savage, Column 1, lines 30-33 “EDM applications include: … business intelligence (BI)”) server (Savage, Column 6, lines 9-10 “The servers 34, 36 each include a known type signal processor 38, 40 and optional viewable display 42, 44”) that displays labels associated with as presenting the entities 216, 218, and 220 and allowing the user to modify the relations (Savage, Column 21, lines 20-25 “The MigrationSpec entity 216 (with primary key 217) represents migration specifications.
Specifications can include additional customer properties as defined in the MigrationSpecPropertyList entity 218 (with primary key 219) and MigrationSpecProperty entity 220 (with primary keys 221).”; Savage, Column 15, lines 19-32 “the process pauses at step 150 to provide a review phase in which the operator may review the inferences made by the present method in its column analysis phase… the operator may either accept or modify the inferences… The table analysis process 152 is one of examining the relationships between all of the columns of a source database table, and identifying all of the functional dependencies between them”; Column 18, lines 27-30 “following completion of the relational analysis procedure 174, the method provides for another pause 186 to permit the operator to review the inferences made in the relational analysis and to accept or modify them on an individual basis”) the plurality of objects as the migration specifications (Savage, Column 21, lines 17-19 and lines 25-27 “A migration specification is focused upon the concept of defining the procedures required to produce the data values for a single column”) including a set of source as accessing the source tables (Savage, Column 22, line 13-15 “Migration specifications are often used to produce target data using a process that must combine data from multiple source tables”; line 36 “The first table in the join path is referred to as the “viewpoint table””) and target links as accessing the target data (Id) associated with the transparent view objects as the viewpoint tables (Id), wherein the user interface supports (Savage, Column 5, lines 53-55 “In the best mode embodiment, the human operator is given access to the method through a graphical user interface (GUI) 24, comprising a known type signal processor 26, a GUI display 28, and an operator pointer/command entry device 30 (e.g., keyboard, mouse)”) a number of ETL mapping types, each ETL mapping type as the mapping expression (Savage,
Column 21 lines 5-10 “The migration specification is a complete and detailed collection of mapping expressions (i.e., statements of the functional relationship between a source column and a target column). Within the transformation mapping environment (TME), the mapping expression conforms to T=f(S), where T is a target column, S is a source column(s) and f is some function.” ) exposing a subset of data manipulation language options as transformation expression that conforms to the mapping expression T=f(S) (Column 21, lines 28-37 “The process by which every target table column is produced is defined by the ColumnMapping entity 222 (with primary key 223). The transformation expression (TransformExpr- see Appendix I) is an attribute of the ColumnMapping entity 222. The transformation expression can include any number of source column references, along with additional specifications that transform data, including (but not limited to): invocations of functions; simple or complex arithmetic operators; and string manipulation.”; Please note this claim limitation has been interpreted in light of Paragraph [0060] of the original specification as transformation logic), the user interface (Savage, Column 5, lines 53-55 “In the best mode embodiment, the human operator is given access to the method through a graphical user interface (GUI) 24, comprising a known type signal processor 26, a GUI display 28, and an operator pointer/command entry device 30 (e.g., keyboard, mouse)”) constructed dynamically (Column 5, lines 48-52 “In the best mode embodiment, the invention provides the human operator, such as the EDM application developer, the opportunity to review the inferences made and either accept or modify them before proceeding to the next phase.”), and [[as the user electing to manually enter the specification (Savage, Column 22, lines 45-51 “The data repository architecture supports the manual entry of migration specifications, since this may continue to be the primary form
of creation in the near term. However, as shown in Fig. 17, upon a request 240 by the user, the migration specification 242 may be automatically generated from the data analysis results for the source database, which is stored in the data repository 200”) the transparent view objects as the viewpoint table (Savage, Column 22, line 13-15 “Migration specifications are often used to produce target data using a process that must combine data from multiple source tables”; line 36 “The first table in the join path is referred to as the “viewpoint table””) with a staging table as storing the transformed retrieved data, commonly known as data staging (Savage, Column 24, lines 35-40 “Generally the ETL job retrieves existing data, transforms it in some manner, and uses the result to create a new collection of data. For example, a data migration process is an example of an ETL job”; Column 1, lines 60-67 “the major step in data migration is the transformation of the source data… This step typically involves “data cleansing” or “data staging” processes”; Savage, Column 22, lines 43-44 “Following the relevant processing by the ETLTOOL, the job can then be run to create the target table as specified”; Column 5, lines 13-18 “At the conclusion of each phase the invention makes inferences regarding the source data construct. The inferences are dependent on the analysis results, and are stored with the analysis results within the data repository 37. Succeeding phases are also, to a degree, dependent on the inferences made in the preceding phase”), and to specify column mappings (Column 10, line 27 “DictKeyColumnMap”; line 33 “DictRelation”; lines 57-61 “The columns that include a key of any type are represented using the DictKeyColumnMap entity 116, which relates the key to a set of one or more DictColumn occurrences, each of which is defined for the containing DictTable instance.”; Column 15, lines 19-23 “Referring again to FIG.
4, following completion of the column analysis procedure 136 the process pauses at step 150 to provide a review phase in which the operator may review the inferences made by the present method in its column analysis phase.”; Figure 4, 151, 160, 164, 172, 186); (d) receiving a user input as user specifying the configuration settings, which grants control over the algorithms, such as the user electing the source tables (e.g. the location of the data repositories) or specifying the SQL used to retrieve the data (Savage, Column 6, line 66 – Column 7, line 4 “The server 34 communicates with the source DBMS 22 using a structured query language (SQL) format and the ODBC library of function calls. The analysis server uses the "catalog" and "retrieval" function calls to have the source DBMS 22 download the available source metadata.”; Column 6, lines 32-33 “In other words, the user may elect the location of the data repository as deemed suitable for a given application.”; Column 11, lines 36-40 “Many of the attributes of these entities are utilized as configuration settings that govern one or more phases of data analysis. All such attributes are configurable by the user, which grants considerable control over the algorithms to the user.”; Figure 4, 151, 160, 164, 172, 186) via the user interface as the human operator being given access to the GUI to specify the SQL (Savage, Column 5, lines 53-55 “In the best mode embodiment, the human operator is given access to the method through a graphical user interface (GUI) 24, comprising a known type signal processor 26, a GUI display 28, and an operator pointer/command entry device 30 (e.g., keyboard, mouse)”), the user input as user specifying the configuration settings, which grants control over the algorithms, such as the user electing the source tables (e.g.
the location of the data repositories) or specifying the SQL used to retrieve the data (Savage, Column 6, line 66 – Column 7, line 4; Column 6, lines 32-33; Column 11, lines 36-40) specifying that the one or more transparent view objects as the set tables in the viewpoint table, e.g. the source tables (Savage, Column 22, line 13-15 “Migration specifications are often used to produce target data using a process that must combine data from multiple source tables”; line 36 “The first table in the join path is referred to as the “viewpoint table””) are to be used to project columns of data onto the target tables as part of the ETL process (Column 23, lines 8-13 “The specification can be used to move the original source data into the new target database schema, by generating an ETL job based on the specification. This completes the preparation for projects in which the source database is being transformed into an analyzed/revised, perhaps normalized, target database.”; Please note this claim limitation has been identified as an intended use of the transparent view objects.); (e) reading, with a code generator (Savage, Column 39-31 “When a migration specification is complete for a target table, the specification can be used as the basis for generating an ETL job”), the transparent view objects as the view point tables directly referenced in the migration specifications (Savage, Column 22, lines 23-24 “Note that source tables may be directly referenced in a migration specification (in the transformation expression)”; Column 22, lines 35-38 “The first table in the join path is referred to as the "viewpoint table".
The selection of a different viewpoint table often yields a very different result within the target table.”; Please see the 112 rejection above regarding claim interpretation for this claim language as a whole), specified by the user input as user specifying the configuration settings, which grants control over the algorithms, such as the user electing the source tables (e.g. the location of the data repositories) or specifying the SQL used to retrieve the data (Savage, Column 6, line 66 – Column 7, line 4; Column 6, lines 32-33; Column 11, lines 36-40), and the ETL mapping association object as the mapping expressions (Savage, Column 21, lines 5-8 “The migration specification is a complete and detailed collection of mapping expressions (i.e., statements of the functional relationship between a source column and a target column”), and generating, by the code generator, one or more ETL scripts as generating an ETL job (Id) that perform the ETL process as the job can create the target tables as specified (Column 22, lines 42-44 “Following the relevant processing by the ETLTool, the job can then be run to create the target table as specified.”); and (f) performing the ETL process, including executing the one or more ETL scripts as running the ETL job (Column 22, lines 42-44 “Following the relevant processing by the ETLTool, the job can then be run to create the target table as specified.”) to move a set of data from the source system to the target system as data migration process (Savage, Column 24, lines 35-40 “Generally the ETL job retrieves existing data, transforms it in some manner, and uses the result to create a new collection of data. 
For example, a data migration process is an example of an ETL job”), based on the transparent view objects as the set tables in the viewpoint table (Savage, Column 22, line 13-15 “Migration specifications are often used to produce target data using a process that must combine data from multiple source tables”; line 36 “The first table in the join path is referred to as the “viewpoint table””) and the ETL mapping association object as the mapping expressions (Savage, Column 21, lines 5-8 “The migration specification is a complete and detailed collection of mapping expressions (i.e., statements of the functional relationship between a source column and a target column”), by: (f.1) extracting the data as retrieving existing data (Savage, Column 24, lines 35-40 “Generally the ETL job retrieves existing data, transforms it in some manner, and uses the result to create a new collection of data. For example, a data migration process is an example of an ETL job”) from the source tables (Savage, Column 22, lines 12-15 “Migration specifications are often used to produce target data using a process that must combine data from multiple source tables”) of the source system as the source database (Savage, Column 4 lines 59-63 “It analyzes the source database in an elemental sequence that is defined by the logical and structural model of a data repository having defined entity relationships, so as to provide a comprehensive metadata model of the source data.”); (f.2) transforming the extracted data as transforming the retrieved data (Savage, Column 24, lines 35-40 “Generally the ETL job retrieves existing data, transforms it in some manner, and uses the result to create a new collection of data. 
For example, a data migration process is an example of an ETL job”); and (f.3) loading the transformed data into as using the result to create the new collection of data (Savage, Column 24, lines 35-40 “Generally the ETL job retrieves existing data, transforms it in some manner, and uses the result to create a new collection of data. For example, a data migration process is an example of an ETL job”) the target tables (Savage, Column 22, lines 43-44 “Following the relevant processing by the ETLTOOL, the job can then be run to create the target table as specified”) of the target system as the target database (Savage, Column 23 lines 15-16 “the target database can be created and the source data migrated to it”). Savage does not explicitly teach wherein the user interface …providing object selector edit boxes and browse buttons. vonKaenel teaches wherein the user interface… providing object selector edit boxes (vonKaenel, ¶720 “The Edit Favorites pop up dialog box 8410 is used whenever the user wants to edit a data layer favorites list to appear within the Favorites tab”; Figure 84, 8410) and browse (vonKaenel, ¶674 “Figure 78 illustrates a Data Layers UI screen 7810… The Data layers UI screen 7810 allows users to select one or more data layers to layer on top of the base map”; Figure 78, 7810) buttons (vonKaenel, ¶544 “In creating user interface standards… the types of buttons that will be used in the design are divided into two categories, command buttons and hyperlink buttons”). It would have been obvious to one of ordinary skill in the art to which said subject matter pertains at the time the invention was made to have implemented the device UI taught by Savage using the specific user input techniques taught by vonKaenel as it yields the predictable results of retrieving the user’s manual entries (Savage, Column 22, lines 45-51) via the interface (Savage, Column 5, lines 53-55) using known interface techniques.
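To make the claim-chart mapping above easier to follow, the cited T=f(S) form (each target column produced by a transformation expression over one or more source columns, starting from a viewpoint table and its join path) and the "code generator" step that emits an ETL job can be sketched as follows. This is an editorial sketch only: every table, column, and function name, and the generated SQL shape, are hypothetical assumptions, not drawn from Savage, vonKaenel, or the claims.

```python
# Hypothetical sketch of a migration-specification-style column mapping,
# T = f(S): each target column is produced by a transformation expression
# over one or more source columns. All names are illustrative only.
from dataclasses import dataclass

@dataclass
class ColumnMapping:
    target_column: str
    transform_expr: str  # SQL fragment over source columns, i.e. f(S)

def generate_etl_script(target_table: str, viewpoint_table: str,
                        joins: list[str], mappings: list[ColumnMapping]) -> str:
    # "Code generator" step: read the mappings and emit an INSERT-SELECT job.
    select_list = ", ".join(f"{m.transform_expr} AS {m.target_column}" for m in mappings)
    join_clause = "".join(f" JOIN {j}" for j in joins)
    return (f"INSERT INTO {target_table} "
            f"SELECT {select_list} FROM {viewpoint_table}{join_clause}")

script = generate_etl_script(
    target_table="tgt_orders",
    viewpoint_table="src_orders",  # first table in the join path
    joins=["src_customers ON src_orders.cust_id = src_customers.id"],
    mappings=[ColumnMapping("total_usd", "src_orders.total * 1.0"),
              ColumnMapping("cust_name", "src_customers.name")],
)
print(script)
```

Running the emitted statement against the source tables would perform the extract/transform/load in one step; the sketch is offered only as a concrete reading of the mapping-expression and code-generator limitations discussed above.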
With regard to claim 27 Savage teaches A system as an apparatus (Savage, Column 2, line 49 “One object of the present invention is to provide method and apparatus for acquiring the metadata of a source database in a manner which allows the acquired metadata to be used to generate diverse type EDM applications”) for performing an extract, transform and load (ETL) process (Savage, Column 3, lines 28-31 “The present invention allows users to migrate multiple disparate systems by providing a complete understanding of the metadata and generating the ETL programs to merge the data”), comprising: (a) a computer having one or more microprocessors as signal processors 38 and 40 (Savage, Column 6, lines 9-10 “The servers 34, 36 each include a known type signal processor 38, 40 and optional viewable display 42, 44”), and a business intelligence (BI) (Savage, Column 1, lines 30-33 “EDM applications include: … business intelligence (BI)”) server (Savage, Column 6, lines 9-10 “The servers 34, 36 each include a known type signal processor 38, 40 and optional viewable display 42, 44”) executing thereon, wherein the BI server connects (Savage, Column 23, lines 13-16 “Once the target schema and associated migration specification have been generated, the target database can be created and the source data migrated to it”; Column 26, lines 9-10 “(v) a specification describing how to migrate (transform) data from one or more sources to a target”) source tables as multiple source tables (Savage, Column 22, lines 12-15 “Migration specifications are often used to produce target data using a process that must combine data from multiple source tables”) on a source system as the source database (Savage, Column 4 lines 59-63 “It analyzes the source database in an elemental sequence that is defined by the logical and structural model of a data repository having defined entity relationships, so as to provide a comprehensive metadata model of the source data.”) to target tables as target
tables (Savage, Column 22, lines 43-44 “Following the relevant processing by the ETLTOOL, the job can then be run to create the target table as specified”) on a target system as the target database (Savage, Column 23 lines 15-16 “the target database can be created and the source data migrated to it”); (b) a plurality of objects as the migration specifications (Savage, Column 21, lines 17-19 “Fig. 16, which is a summary entity relationship diagram, illustrates all of the entities and relationships used with the migration specifications”; lines 25-27 “A migration specification is focused upon the concept of defining the procedures required to produce the data values for a single column”) on the BI (Savage, Column 1, lines 30-33 “EDM applications include: … business intelligence (BI)”) server (Savage, Column 6, lines 9-10 “The servers 34, 36 each include a known type signal processor 38, 40 and optional viewable display 42, 44”), that are used by the extract, transform and load (ETL) process (Savage, Column 3, lines 28-31 “The present invention allows users to migrate multiple disparate systems by providing a complete understanding of the metadata and generating the ETL programs to merge the data”; Column 21 line 10-12 “This provides the repository with the ability to produce target tables for many types of extract, transform, and load (ETL) jobs”) to extract as extract (Id), transform as transform (Id), and to load data as load (Id) from the source system as the source database (Savage, Column 4 lines 59-63 “It analyzes the source database in an elemental sequence that is defined by the logical and structural model of a data repository having defined entity relationships, so as to provide a comprehensive metadata model of the source data.”) to the target system as the target database (Savage, Column 23 lines 15-16 “the target database can be created and the source data migrated to it”), wherein the plurality of objects as the migration specifications (Savage, Column 21,
lines 17-19 and lines 25-27 “A migration specification is focused upon the concept of defining the procedures required to produce the data values for a single column”) comprises: (b.1) a plurality of transparent view objects as the set tables in the viewpoint table (Savage, Column 22, line 13-15 “Migration specifications are often used to produce target data using a process that must combine data from multiple source tables”; line 36 “The first table in the join path is referred to as the “viewpoint table””), wherein each transparent view object, of the plurality of transparent view objects, comprises metadata as the possible sequence of joins (Savage, Column 22, lines 21-36 “Each possible sequence of joining source tables is noted as an instance of the JoinPath entity 230 (with primary key 231)… The first table in the join path is referred to as the “viewpoint table””; Please note this claim limitation has been construed in light of Paragraph [0027] which recites “The data transformation logic metadata can specify the data transformations, such as joins between participating entities, expressions etc.,”) that represents a data shape of a joined set (Savage, Column 21, lines 51-54 “the transformation expression defines the manner in which corresponding values of the referenced source columns are processed to produce a single value of the target column”; Savage, Column 22, lines 21-36 “Each possible sequence of joining source tables is noted as an instance of the JoinPath entity 230 (with primary key 231)… The first table in the join path is referred to as the “viewpoint table””) of source tables as the source tables (Id) using a transformation as the transformation expression (Id), and (b.2) an ETL mapping association object as the mapping expressions (Savage, Column 21, lines 5-8 “The migration specification is a complete and detailed collection of mapping expressions (i.e., statements of the functional relationship between a source column and a target column”) 
that maps as mapping (Id) the data shape of the joined set of source tables as the column of values produced by the transformation expression (Savage, Column 21, lines 51-54 “the transformation expression defines the manner in which corresponding values of the referenced source columns are processed to produce a single value of the target column”), as provided by one or more of the transparent view objects as the set tables in the viewpoint table (Savage, Column 22, line 13-15 “Migration specifications are often used to produce target data using a process that must combine data from multiple source tables”; line 36 “The first table in the join path is referred to as the “viewpoint table””) to one or more of the target tables as the target column (Savage, Column 21, lines 51-54 “the transformation expression defines the manner in which corresponding values of the referenced source columns are processed to produce a single value of the target column”; For examination purposes this claim limitation has been construed to refer to --the target table--, as per the 112 rejection above.); (c) display labels associated with as presenting the entities 216, 218, and 220 and allowing the user to modify the relations (Savage, Column 21, lines 20-25 “The MigrationSpec entity 216 (with primary key 217) represents migration specifications.
Specifications can include additional customer properties as defined in the MigrationSpecPropertyList entity 218 (with primary key 219) and MigrationSpecProperty entity 220 (with primary keys 221).”; Savage, Column 15, lines 19-32 “the process pauses at step 150 to provide a review phase in which the operator may review the inferences made by the present method in its column analysis phase… the operator may either accept or modify the inferences… The table analysis process 152 is one of examining the relationships between all of the columns of a source database table, and identifying all of the functional dependencies between them”; Column 18, lines 27-30 “following completion of the relational analysis procedure 174, the method provides for another pause 186 to permit the operator to review the inferences made in the relational analysis and to accept or modify them on an individual basis”) the plurality of objects as the migration specifications (Savage, Column 21, lines 17-19 and lines 25-27 “A migration specification is focused upon the concept of defining the procedures required to produce the data values for a single column”) including a set of source as accessing the source tables (Savage, Column 22, line 13-15 “Migration specifications are often used to produce target data using a process that must combine data from multiple source tables”; line 36 “The first table in the join path is referred to as the “viewpoint table””) and target links as accessing the target data (Id) associated with the transparent view objects as the viewpoint tables (Id), wherein the user interface supports (Savage, Column 5, lines 53-55 “In the best mode embodiment, the human operator is given access to the method through a graphical user interface (GUI) 24, comprising a known type signal processor 26, a GUI display 28, and an operator pointer/command entry device 30 (e.g., keyboard, mouse)”) a number of ETL mapping types, each ETL mapping type as the mapping expression (Savage,
Column 21 lines 5-10 “The migration specification is a complete and detailed collection of mapping expressions (i.e., statements of the functional relationship between a source column and a target column). Within the transformation mapping environment (TME), the mapping expression conforms to T=f(S), where T is a target column, S is a source column(s) and f is some function.” ) exposing a subset of data manipulation language options as transformation expression that conforms to the mapping expression T=f(S) (Column 21, lines 28-37 “The process by which every target table column is produced is defined by the ColumnMapping entity 222 (with primary key 223). The transformation expression (TransformExpr- see Appendix I) is an attribute of the ColumnMapping entity 222. The transformation expression can include any number of source column references, along with additional specifications that transform data, including (but not limited to): invocations of functions; simple or complex arithmetic operators; and string manipulation.”; Please note this claim limitation has been interpreted in light of Paragraph [0060] of the original specification as transformation logic), the user interface (Savage, Column 5, lines 53-55 “In the best mode embodiment, the human operator is given access to the method through a graphical user interface (GUI) 24, comprising a known type signal processor 26, a GUI display 28, and an operator pointer/command entry device 30 (e.g., keyboard, mouse)”) constructed dynamically (Column 5, lines 48-52 “In the best mode embodiment, the invention provides the human operator, such as the EDM application developer, the opportunity to review the inferences made and either accept or modify them before proceeding to the next phase.”), and [[as the user electing to manually enter the specification (Savage, Column 22, lines 45-51 “The data repository architecture supports the manual entry of migration specifications, since this may continue to be the primary form
of creation in the near term. However, as shown in Fig. 17, upon a request 240 by the user, the migration specification 242 may be automatically generated from the data analysis results for the source database, which is stored in the data repository 200”) the transparent view objects as the viewpoint table (Savage, Column 22, lines 13-15 “Migration specifications are often used to produce target data using a process that must combine data from multiple source tables”; line 36 “The first table in the join path is referred to as the “viewpoint table””) with a staging table as storing the transformed retrieved data, commonly known as data staging (Savage, Column 24, lines 35-40 “Generally the ETL job retrieves existing data, transforms it in some manner, and uses the result to create a new collection of data. For example, a data migration process is an example of an ETL job”; Column 1, lines 60-67 “the major step in data migration is the transformation of the source data… This step typically involves “data cleansing” or “data staging” processes”; Savage, Column 22, lines 43-44 “Following the relevant processing by the ETLTOOL, the job can then be run to create the target table as specified”; Column 5, lines 13-18 “At the conclusion of each phase the invention makes inferences regarding the source data construct. The inferences are dependent on the analysis results, and are stored with the analysis results within the data repository 37. Succeeding phases are also, to a degree, dependent on the inferences made in the preceding phase”), and to specify column mappings (Column 10, line 27 “DictKeyColumnMap”; line 33 “DictRelation”; lines 57-61 “The columns that include a key of any type are represented using the DictKeyColumnMap entity 116, which relates the key to a set of one or more DictColumn 60 occurrences, each of which is defined for the containing DictTable instance.”; Column 15, lines 19-23 “Referring again to FIG. 
4, following completion of the column analysis procedure 136 the process pauses at step 150 to provide a review phase in which the operator may review the inferences made by the present method in its column analysis phase.”; Figure 4, 151, 160, 164, 172, 186); (d) receive a user input as user specifying the configuration settings, which grants control over the algorithms, such as the user electing the source tables (e.g. the location of the data repositories) or specifying the SQL used to retrieve the data (Savage, Column 6, line 66 – Column 7, line 4 “The server 34 communicates with the source DBMS 22 using a structured query language (SQL) format and the ODBC library of function calls. The analysis server uses the "catalog" and "retrieval" function calls to have the source DBMS 22 download the available source metadata.”; Column 6, lines 32-33 “In other words, the user may elect the location of the data repository as deemed suitable for a given application.”; Column 11, lines 36-40 “Many of the attributes of these entities are utilized as configuration settings that govern one or more phases of data analysis. All such attributes are configurable by the user, which grants considerable control over the algorithms to the user.”; Figure 4, 151, 160, 164, 172, 186) via the user interface as the human operator being given access to the GUI to specify the SQL (Savage, Column 5, lines 53-55 “In the best mode embodiment, the human operator is given access to the method through a graphical user interface (GUI) 24, comprising a known type signal processor 26, a GUI display 28, and an operator pointer/command entry device 30 (e.g., keyboard, mouse)”), the user input selecting and manipulating as user specifying the configuration settings, which grants control over the algorithms, such as the user electing the source tables (e.g. 
the location of the data repositories) or specifying the SQL used to retrieve the data (Savage, Column 6, line 66 – Column 7, line 4; Column 6, lines 32-33; Column 11, lines 36-40) the one or more transparent view objects and transformations associated therewith, as the set tables in the viewpoint table, e.g. the source tables (Savage, Column 22, lines 13-15 “Migration specifications are often used to produce target data using a process that must combine data from multiple source tables”; line 36 “The first table in the join path is referred to as the “viewpoint table””) and specifying that the one or more of the transparent view objects (Savage, Column 6, line 66 – Column 7, line 4; Column 6, lines 32-33; Column 11, lines 36-40) are to be used to project columns of data onto the target tables as part of the ETL process (Column 23, lines 8-13 “The specification can be used to move the original source data into the new target database schema, by generating an ETL job based on the specification. This completes the preparation for projects in which the source database is being transformed into an analyzed/revised, perhaps normalized, target database.”; Please note this claim limitation has been identified as an intended use of the transparent view objects.); (e) a code generator that reads (Savage, Column 39-31 “When a migration specification is complete for a target table, the specification can be used as the basis for generating an ETL job”) the transparent view objects as the viewpoint tables directly referenced in the migration specifications (Savage, Column 22, lines 23-24 “Note that source tables may be directly referenced in a migration specification (in the transformation expression)”; Column 22, lines 35-38 “The first table in the join path is referred to as the "viewpoint table". 
The selection of a different viewpoint table often yields a very different result within the target table.”; Please see the 112 rejection above regarding claim interpretation for this claim language as a whole), specified by the user input as user specifying the configuration settings, which grants control over the algorithms, such as the user electing the source tables (e.g. the location of the data repositories) or specifying the SQL used to retrieve the data (Savage, Column 6, line 66 – Column 7, line 4; Column 6, lines 32-33; Column 11, lines 36-40), and the ETL mapping association object as the mapping expressions (Savage, Column 21, lines 5-8 “The migration specification is a complete and detailed collection of mapping expressions (i.e., statements of the functional relationship between a source column and a target column”), and generating, by the code generator, one or more ETL scripts as generating an ETL job (Id) that perform the ETL process as the job can create the target tables as specified (Column 22, lines 42-44 “Following the relevant processing by the ETLTool, the job can then be run to create the target table as specified.”); and (f) wherein the system executes the one or more ETL scripts as running the ETL job (Column 22, lines 42-44 “Following the relevant processing by the ETLTool, the job can then be run to create the target table as specified.”) to move a set of data from the source system to the target system as data migration process (Savage, Column 24, lines 35-40 “Generally the ETL job retrieves existing data, transforms it in some manner, and uses the result to create a new collection of data. 
For example, a data migration process is an example of an ETL job”), based on the transparent view objects as the set tables in the viewpoint table (Savage, Column 22, line 13-15 “Migration specifications are often used to produce target data using a process that must combine data from multiple source tables”; line 36 “The first table in the join path is referred to as the “viewpoint table””) and the ETL mapping association object as the mapping expressions (Savage, Column 21, lines 5-8 “The migration specification is a complete and detailed collection of mapping expressions (i.e., statements of the functional relationship between a source column and a target column”), by: (f.1) extracting the data as retrieving existing data (Savage, Column 24, lines 35-40 “Generally the ETL job retrieves existing data, transforms it in some manner, and uses the result to create a new collection of data. For example, a data migration process is an example of an ETL job”) from the source tables (Savage, Column 22, lines 12-15 “Migration specifications are often used to produce target data using a process that must combine data from multiple source tables”) of the source system as the source database (Savage, Column 4 lines 59-63 “It analyzes the source database in an elemental sequence that is defined by the logical and structural model of a data repository having defined entity relationships, so as to provide a comprehensive metadata model of the source data.”); (f.2) transforming the extracted data as transforming the retrieved data (Savage, Column 24, lines 35-40 “Generally the ETL job retrieves existing data, transforms it in some manner, and uses the result to create a new collection of data. 
For example, a data migration process is an example of an ETL job”); and (f.3) loading the transformed data into as using the result to create the new collection of data (Savage, Column 24, lines 35-40 “Generally the ETL job retrieves existing data, transforms it in some manner, and uses the result to create a new collection of data. For example, a data migration process is an example of an ETL job”) the target tables (Savage, Column 22, lines 43-44 “Following the relevant processing by the ETLTOOL, the job can then be run to create the target table as specified”) of the target system as the target database (Savage, Column 23, lines 15-16 “the target database can be created and the source data migrated to it”). Savage does not explicitly teach wherein the user interface …providing object selector edit boxes and browse buttons. vonKaenel teaches wherein the user interface… providing object selector edit boxes (vonKaenel, ¶720 “The Edit Favorites pop up dialog box 8410 is used whenever the user wants to edit a data layer favorites list to appear within the Favorites tab”; Figure 84, 8410) and browse (vonKaenel, ¶674 “Figure 78 illustrates a Data Layers UI screen 7810… The Data layers UI screen 7810 allows users to select one or more data layers to layer on top of the base map”; Figure 78, 7810) buttons (vonKaenel, ¶544 “In creating user interface standards… the types of buttons that will be used in the design are divided into two categories, command buttons and hyperlink buttons”). It would have been obvious to one of ordinary skill in the art to which said subject matter pertains at the time the invention was made to have implemented the device UI taught by Savage using the specific user input techniques taught by vonKaenel as it yields the predictable results of retrieving the user’s manual entries (Savage, Column 22, lines 45-51) via the interface (Savage, Column 5, lines 53-55) using known interface techniques. 
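[Editorial illustration, not part of the record.] The mapping-expression model relied on above (Savage, Column 21: a migration specification is a collection of mapping expressions conforming to T=f(S)) can be sketched in a few lines of Python. All table, column, and function names below are hypothetical and appear in neither cited reference:

```python
# Hypothetical sketch of the T = f(S) mapping-expression model: each target
# column T is produced by a transformation expression f applied to one or
# more source columns S. Names and data are illustrative only.

def full_name(row):
    # a transformation expression combining two source columns (string manipulation)
    return f"{row['first_name']} {row['last_name']}"

# a "migration specification": target column -> transformation expression f
migration_spec = {
    "full_name": full_name,
    "salary_usd": lambda row: row["salary_cents"] / 100,  # arithmetic operator
}

def apply_spec(source_rows, spec):
    """Evaluate every mapping expression T = f(S) against each source row."""
    return [{target: f(row) for target, f in spec.items()} for row in source_rows]

source_rows = [{"first_name": "Ada", "last_name": "Lovelace", "salary_cents": 123450}]
print(apply_spec(source_rows, migration_spec))
# → [{'full_name': 'Ada Lovelace', 'salary_usd': 1234.5}]
```

The sketch is offered only as a readability aid; the cited teaching is controlled by the quoted passages themselves.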
With regard to claim 34 Savage teaches A non-transitory computer-readable storage medium having instructions stored thereon (Savage, Column 2, line 49 “One object of the present invention is to provide method and apparatus for acquiring the metadata of a source database in a manner which allows the acquired metadata to be used to generate diverse type EDM applications”) for supporting an extract, transform and load (ETL) process (Savage, Column 3, lines 28-31 “The present invention allows users to migrate multiple disparate systems by providing a complete understanding of the metadata and generating the ETL programs to merge the data”) which instructions, when executed, cause a system executing on one or more microprocessors (Savage, Column 6, lines 9-10 “The servers 34, 36 each include a known type signal processor 38, 40 and optional viewable display 42, 44”) to perform steps comprising: (a) providing a business intelligence (BI) (Savage, Column 1, lines 30-33 “EDM applications include: … business intelligence (BI)”) server (Savage, Column 6, lines 9-10 “The servers 34, 36 each include a known type signal processor 38, 40 and optional viewable display 42, 44”), wherein the BI server connects (Savage, Column 23, lines 13-16 “Once the target schema and associated migration specification have been generated, the target database can be created and the source data migrated to it”; Column 26, lines 9-10 “(v) a specification describing how to migrate (transform) data from one or more sources to a target”) source tables as multiple source tables (Savage, Column 22, lines 12-15 “Migration specifications are often used to produce target data using a process that must combine data from multiple source tables”) on a source system as the source database (Savage, Column 4, lines 59-63 “It analyzes the source database in an elemental sequence that is defined by the logical and structural model of a data repository having defined entity relationships, so as to provide a 
comprehensive metadata model of the source data.”) to target tables as target tables (Savage, Column 22, lines 43-44 “Following the relevant processing by the ETLTOOL, the job can then be run to create the target table as specified”) on a target system as the target database (Savage, Column 23, lines 15-16 “the target database can be created and the source data migrated to it”); (b) providing a plurality of objects as the migration specifications (Savage, Column 21, lines 17-19 “Fig. 16, which is a summary entity relationship diagram, illustrates all of the entities and relationships used with the migration specifications”; lines 25-27 “A migration specification is focused upon the concept of defining the procedures required to produce the data values for a single column”) on the BI (Savage, Column 1, lines 30-33 “EDM applications include: … business intelligence (BI)”) server (Savage, Column 6, lines 9-10 “The servers 34, 36 each include a known type signal processor 38, 40 and optional viewable display 42, 44”), that are used by the extract, transform and load (ETL) process (Savage, Column 3, lines 28-31 “The present invention allows users to migrate multiple disparate systems by providing a complete understanding of the metadata and generating the ETL programs to merge the data”; Column 21, lines 10-12 “This provides the repository with the ability to produce target tables for many types of extract, transform, and load (ETL) jobs”) to extract as extract (Id), transform as transform (Id), and to load data as load (Id) from the source system as the source database (Savage, Column 4, lines 59-63 “It analyzes the source database in an elemental sequence that is defined by the logical and structural model of a data repository having defined entity relationships, so as to provide a comprehensive metadata model of the source data.”) to the target system as the target database (Savage, Column 23, lines 15-16 “the target database can be created and the source data migrated to it”), 
wherein the plurality of objects as the migration specifications (Savage, Column 21, lines 17-19 and lines 25-27 “A migration specification is focused upon the concept of defining the procedures required to produce the data values for a single column”) comprises: (b.1) a plurality of transparent view objects as the set tables in the viewpoint table (Savage, Column 22, line 13-15 “Migration specifications are often used to produce target data using a process that must combine data from multiple source tables”; line 36 “The first table in the join path is referred to as the “viewpoint table””), wherein each transparent view object, of the plurality of transparent view objects, comprises metadata as the possible sequence of joins (Savage, Column 22, lines 21-36 “Each possible sequence of joining source tables is noted as an instance of the JoinPath entity 230 (with primary key 231)… The first table in the join path is referred to as the “viewpoint table””; Please note this claim limitation has been construed in light of Paragraph [0027] which recites “The data transformation logic metadata can specify the data transformations, such as joins between participating entities, expressions etc.,”) that represents a data shape of a joined set (Savage, Column 21, lines 51-54 “the transformation expression defines the manner in which corresponding values of the referenced source columns are processed to produce a single value of the target column”; Savage, Column 22, lines 21-36 “Each possible sequence of joining source tables is noted as an instance of the JoinPath entity 230 (with primary key 231)… The first table in the join path is referred to as the “viewpoint table””) of source tables as the source tables (Id) using a transformation as the transformation expression (Id), and (b.2) an ETL mapping association object as the mapping expressions (Savage, Column 21, lines 5-8 “The migration specification is a complete and detailed collection of mapping expressions (i.e., 
statements of the functional relationship between a source column and a target column”) that maps as mapping (Id) the data shape of the joined set of source tables as the column of values produced by the transformation expression (Savage, Column 21, lines 51-54 “the transformation expression defines the manner in which corresponding values of the referenced source columns are processed to produce a single value of the target column”) provided by one or more of the transparent view objects as the set tables in the viewpoint table (Savage, Column 22, lines 13-15 “Migration specifications are often used to produce target data using a process that must combine data from multiple source tables”; line 36 “The first table in the join path is referred to as the “viewpoint table””) to one or more of the target tables as the target column (Savage, Column 21, lines 51-54 “the transformation expression defines the manner in which corresponding values of the referenced source columns are processed to produce a single value of the target column”; For examination purposes this claim limitation has been construed to refer to --the target table--, as per the 112 rejection above.); (c) a user interface (Savage, Column 5, lines 53-55 “In the best mode embodiment, the human operator is given access to the method through a graphical user interface (GUI) 24, comprising a known type signal processor 26, a GUI display 28, and an operator pointer/command entry device 30 (e.g., keyboard, mouse)”) to said BI (Savage, Column 1, lines 30-33 “EDM applications include: … business intelligence (BI)”) server (Savage, Column 6, lines 9-10 “The servers 34, 36 each include a known type signal processor 38, 40 and optional viewable display 42, 44”) that is configured to: display labels associated with as presenting the entities 216, 218, and 220 and allowing the user to modify the relations (Savage, Column 21, lines 20-25 “The MigrationSpec entity 216 (with primary key 217) represents migration 
specifications. Specifications can include additional customer properties as defined in the MigrationSpecPropertyList entity 218 (with primary key 219) and MigrationSpecProperty entity 220 (with primary keys 221).”; Savage, Column 15, lines 19-32 “the process pauses at step 150 to provide a review phase in which the operator may review the inferences made by the present method in its column analysis phase… the operator may either accept or modify the inferences… The table analysis process 152 is one of examining the relationships between all of the columns of a source database table, and identifying all of the functional dependencies between them”; Column 18, lines 27-30 “following completion of the relational analysis procedure 174, the method provides 78 for another pause 186 to permit the operator to review the inferences made in the relational analysis and to accept or modify them on an individual basis”) the plurality of objects as the migration specifications (Savage, Column 21, lines 17-19 and lines 25-27 “A migration specification is focused upon the concept of defining the procedures required to produce the data values for a single column”) including a set of source as accessing the source tables (Savage, Column 22, line 13-15 “Migration specifications are often used to produce target data using a process that must combine data from multiple source tables”; line 36 “The first table in the join path is referred to as the “viewpoint table””) and target links as accessing the target data (Id) associated with the transparent view objects as the viewpoint tables (Id), wherein the user interface supports (Savage, Column 5, lines 53-55 “In the best mode embodiment, the human operator is given access to the method through a graphical user interface (GUI) 24, comprising a known type signal processor 26, a GUI display 28, and an operator pointer/command entry device 30 (e.g., keyboard, mouse)”) a number of ETL mapping types, each ETL mapping type as the mapping 
expression (Savage, Column 21, lines 5-10 “The migration specification is a complete and detailed collection of mapping expressions (i.e., statements of the functional relationship between a source column and a target column). Within the transformation mapping environment (TME), the mapping expression conforms to T=f(S), where T is a target column, S is a source column(s) and f is some function.”) exposing a subset of data manipulation language options as transformation expression that conforms to the mapping expression T=f(S) (Column 21, lines 28-37 “The process by which every target table column is produced is defined by the ColumnMapping entity 222 (with primary key 223). The transformation expression (TransformExpr- see Appendix I) is an attribute of the ColumnMapping entity 222. The transformation expression can include any number of source column references, along with additional specifications that transform data, including (but not limited to): invocations of functions; simple or complex arithmetic operators; and string manipulation.”; Please note this claim limitation has been interpreted in light of Paragraph [0060] of the original specification as transformation logic), the user interface (Savage, Column 5, lines 53-55 “In the best mode embodiment, the human operator is given access to the method through a graphical user interface (GUI) 24, comprising a known type signal processor 26, a GUI display 28, and an operator pointer/command entry device 30 (e.g., keyboard, mouse)”) constructed dynamically (Column 5, lines 48-52 “In the best mode embodiment, the invention provides the human operator, such as the EDM application developer, the opportunity to review the inferences made and either accept or modify them before proceeding to the next phase.”), and associate as the user electing to manually enter the specification (Savage, Column 22, lines 45-51 “The data repository architecture supports the manual entry of migration specifications, since this may 
continue to be the primary form of creation in the near term. However, as shown in Fig. 17, upon a request 240 by the user, the migration specification 242 may be automatically generated from the data analysis results for the source database, which is stored in the data repository 200”) the transparent view objects as the viewpoint table (Savage, Column 22, lines 13-15 “Migration specifications are often used to produce target data using a process that must combine data from multiple source tables”; line 36 “The first table in the join path is referred to as the “viewpoint table””) with a staging table as storing the transformed retrieved data, commonly known as data staging (Savage, Column 24, lines 35-40 “Generally the ETL job retrieves existing data, transforms it in some manner, and uses the result to create a new collection of data. For example, a data migration process is an example of an ETL job”; Column 1, lines 60-67 “the major step in data migration is the transformation of the source data… This step typically involves “data cleansing” or “data staging” processes”; Savage, Column 22, lines 43-44 “Following the relevant processing by the ETLTOOL, the job can then be run to create the target table as specified”; Column 5, lines 13-18 “At the conclusion of each phase the invention makes inferences regarding the source data construct. The inferences are dependent on the analysis results, and are stored with the analysis results within the data repository 37. Succeeding phases are also, to a degree, dependent on the inferences made in the preceding phase”), and to specify column mappings (Column 10, line 27 “DictKeyColumnMap”; line 33 “DictRelation”; lines 57-61 “The columns that include a key of any type are represented using the DictKeyColumnMap entity 116, which relates the key to a set of one or more DictColumn 60 occurrences, each of which is defined for the containing DictTable instance.”; Column 15, lines 19-23 “Referring again to FIG. 
4, following completion of the column analysis procedure 136 the process pauses at step 150 to provide a review phase in which the operator may review the inferences made by the present method in its column analysis phase.”; Figure 4, 151, 160, 164, 172, 186); (d) receive a user input as user specifying the configuration settings, which grants control over the algorithms, such as the user electing the source tables (e.g. the location of the data repositories) or specifying the SQL used to retrieve the data (Savage, Column 6, line 66 – Column 7, line 4 “The server 34 communicates with the source DBMS 22 using a structured query language (SQL) format and the ODBC library of function calls. The analysis server uses the "catalog" and "retrieval" function calls to have the source DBMS 22 download the available source metadata.”; Column 6, lines 32-33 “In other words, the user may elect the location of the data repository as deemed suitable for a given application.”; Column 11, lines 36-40 “Many of the attributes of these entities are utilized as configuration settings that govern one or more phases of data analysis. All such attributes are configurable by the user, which grants considerable control over the algorithms to the user.”; Figure 4, 151, 160, 164, 172, 186) via the user interface as the human operator being given access to the GUI to specify the SQL (Savage, Column 5, lines 53-55 “In the best mode embodiment, the human operator is given access to the method through a graphical user interface (GUI) 24, comprising a known type signal processor 26, a GUI display 28, and an operator pointer/command entry device 30 (e.g., keyboard, mouse)”), the user input as user specifying the configuration settings, which grants control over the algorithms, such as the user electing the source tables (e.g. 
the location of the data repositories) or specifying the SQL used to retrieve the data (Savage, Column 6, line 66 – Column 7, line 4; Column 6, lines 32-33; Column 11, lines 36-40) selecting and manipulating the one or more transparent view objects and transformations associated therewith as the set tables in the viewpoint table, e.g. the source tables (Savage, Column 22, lines 13-15 “Migration specifications are often used to produce target data using a process that must combine data from multiple source tables”; line 36 “The first table in the join path is referred to as the “viewpoint table””), representing the data shape of the joined set of source tables (Savage, Column 21, lines 51-54 “the transformation expression defines the manner in which corresponding values of the referenced source columns are processed to produce a single value of the target column”; Savage, Column 22, lines 21-36 “Each possible sequence of joining source tables is noted as an instance of the JoinPath entity 230 (with primary key 231)… The first table in the join path is referred to as the “viewpoint table””) and specifying that the one or more transparent view objects as the viewpoint table (Savage, Column 22, lines 13-15 “Migration specifications are often used to produce target data using a process that must combine data from multiple source tables”; line 36 “The first table in the join path is referred to as the “viewpoint table””) are to be used to project columns of data onto the target tables as part of the ETL process (Column 23, lines 8-13 “The specification can be used to move the original source data into the new target database schema, by generating an ETL job based on the specification. 
This completes the preparation for projects in which the source database is being transformed into an analyzed/revised, perhaps normalized, target database.”; Please note this claim limitation has been identified as an intended use of the transparent view objects.); (e) a code generator that reads (Savage, Column 39-31 “When a migration specification is complete for a target table, the specification can be used as the basis for generating an ETL job”) the transparent view objects as the viewpoint tables directly referenced in the migration specifications (Savage, Column 22, lines 23-24 “Note that source tables may be directly referenced in a migration specification (in the transformation expression)”; Column 22, lines 35-38 “The first table in the join path is referred to as the "viewpoint table". The selection of a different viewpoint table often yields a very different result within the target table.”; Please see the 112 rejection above regarding claim interpretation for this claim language as a whole), specified by the user input as user specifying the configuration settings, which grants control over the algorithms, such as the user electing the source tables (e.g. 
the location of the data repositories) or specifying the SQL used to retrieve the data (Savage, Column 6, line 66 – Column 7, line 4; Column 6, lines 32-33; Column 11, lines 36-40), and the ETL mapping association object as the mapping expressions (Savage, Column 21, lines 5-8 “The migration specification is a complete and detailed collection of mapping expressions (i.e., statements of the functional relationship between a source column and a target column”), and generating, by the code generator, one or more ETL scripts as generating an ETL job (Id) that perform the ETL process as the job can create the target tables as specified (Column 22, lines 42-44 “Following the relevant processing by the ETLTool, the job can then be run to create the target table as specified.”); and (f) wherein the system executes the one or more ETL scripts as running the ETL job (Column 22, lines 42-44 “Following the relevant processing by the ETLTool, the job can then be run to create the target table as specified.”) to move a set of data from the source system to the target system as data migration process (Savage, Column 24, lines 35-40 “Generally the ETL job retrieves existing data, transforms it in some manner, and uses the result to create a new collection of data. 
For example, a data migration process is an example of an ETL job”), based on the transparent view objects as the set tables in the viewpoint table (Savage, Column 22, line 13-15 “Migration specifications are often used to produce target data using a process that must combine data from multiple source tables”; line 36 “The first table in the join path is referred to as the “viewpoint table””) and the ETL mapping association object as the mapping expressions (Savage, Column 21, lines 5-8 “The migration specification is a complete and detailed collection of mapping expressions (i.e., statements of the functional relationship between a source column and a target column”), by: (f.1) extracting the data as retrieving existing data (Savage, Column 24, lines 35-40 “Generally the ETL job retrieves existing data, transforms it in some manner, and uses the result to create a new collection of data. For example, a data migration process is an example of an ETL job”) from the source tables (Savage, Column 22, lines 12-15 “Migration specifications are often used to produce target data using a process that must combine data from multiple source tables”) of the source system as the source database (Savage, Column 4 lines 59-63 “It analyzes the source database in an elemental sequence that is defined by the logical and structural model of a data repository having defined entity relationships, so as to provide a comprehensive metadata model of the source data.”); (f.2) transforming the extracted data as transforming the retrieved data (Savage, Column 24, lines 35-40 “Generally the ETL job retrieves existing data, transforms it in some manner, and uses the result to create a new collection of data. 
For example, a data migration process is an example of an ETL job”); and (f.3) loading the transformed data into as using the result to create the new collection of data (Savage, Column 24, lines 35-40 “Generally the ETL job retrieves existing data, transforms it in some manner, and uses the result to create a new collection of data. For example, a data migration process is an example of an ETL job”) the target tables (Savage, Column 22, lines 43-44 “Following the relevant processing by the ETLTOOL, the job can then be run to create the target table as specified”) of the target system as the target database (Savage, Column 23 lines 15-16 “the target database can be created and the source data migrated to it”). Savage does not explicitly teach wherein the user interface …providing object selector edit boxes and browse buttons. vonKaenel teaches wherein the user interface… providing object selector edit boxes (vonKaenel, ¶720 “The Edit Favorites pop up dialog box 8410 is sued whenever the user wants to edit a data layer favorites list to appear within the Favorites tab”; Figure 84, 8410) and browse (vonKaenel, ¶674 “Figure 78 illustrates a Data Layers UI screen 7810… The Data layers UI screen 7810 allows users to select one or more data layers to layer on top of the base map”; Figure 78 7810) buttons (vonKaenel, ¶544 “In creating user interface standards… the types of buttons that will be used in the design are divided into two categories, command buttons and hyperlink buttons”). It would have been obvious to one of ordinary skill to which said subject matter pertains at the time the invention was filed to have implemented the device UI taught by Savage using the specific user input techniques taught by vonKaenel as it yields the predictable results of retrieving the user’s manual entries (Savage, Column 22, lines 45-51) via the interface (Savage, Column 5, lines 53-55) using known interface techniques. 
With regard to claims 35, 39, and 43 Savage further teaches wherein the BI (Savage, Column 1, lines 30-33 “EDM applications include: … business intelligence (BI)”) server (Savage, Column 6, lines 9-10 “The servers 34, 36 each include a known type signal processor 38, 40 and optional viewable display 42, 44”) can associate an entity in a target data warehouse as the data warehouse (Savage, Column 1, lines 48-53 “All of the transferred operational source data must be logically transformed from the operational system model to the logical and physical structure of the data warehouse, which aligns with the business structure. As an example, a data warehouse may combine data from different applications”; Column 22, lines 4-15 “if a source database is being used to produce a data warehouse… Migration specifications are often used to produce target data using a process that must combine data from multiple source tables”) with data objects from multiple data sources as combining data from different applications (Id), by extracting data as retrieving existing data (Savage, Column 24, lines 35-40 “Generally the ETL job retrieves existing data, transforms it in some manner, and uses the result to create a new collection of data. For example, a data migration process is an example of an ETL job”) from the data sources (Savage, Column 22, lines 12-15 “Migration specifications are often used to produce target data using a process that must combine data from multiple source tables”) into a staging area as transforming the retrieved data, commonly known as data staging (Savage, Column 24, lines 35-40 “Generally the ETL job retrieves existing data, transforms it in some manner, and uses the result to create a new collection of data. 
For example, a data migration process is an example of an ETL job”; Column 1, lines 60-67 “the major step in data migration is the transformation of the source data… This step typically involves “data cleansing” or “data staging” processes”), wherein the ETL mapping association object maps as the mapping expressions (Savage, Column 21, lines 5-8 “The migration specification is a complete and detailed collection of mapping expressions (i.e., statements of the functional relationship between a source column and a target column”) the transformation contained in a first transparent view object as a first viewpoint table (Savage, Column 22, lines 13-15 “Migration specifications are often used to produce target data using a process that must combine data from multiple source tables”; line 36 “The first table in the join path is referred to as the “viewpoint table””; Please note this claim limitation has been construed in light of the objection above to mean –a first transparent view object--), of the plurality of transparent view objects as the set tables in the viewpoint table (Savage, Column 22, lines 13-15 “Migration specifications are often used to produce target data using a process that must combine data from multiple source tables”; line 36 “The first table in the join path is referred to as the “viewpoint table””), to a staging table as transforming the retrieved data, commonly known as data staging (Savage, Column 24, lines 35-40 “Generally the ETL job retrieves existing data, transforms it in some manner, and uses the result to create a new collection of data. 
For example, a data migration process is an example of an ETL job”; Column 1, lines 60-67 “the major step in data migration is the transformation of the source data… This step typically involves “data cleansing” or “data staging” processes”; Savage, Column 22, lines 43-44 “Following the relevant processing by the ETLTOOL, the job can then be run to create the target table as specified”) in the target data warehouse (Savage, Column 1, lines 48-53 “All of the transferred operational source data must be logically transformed from the operational system model to the logical and physical structure of the data warehouse, which aligns with the business structure. As an example, a data warehouse may combine data from different applications”; Column 22, lines 4-15 “if a source database is being used to produce a data warehouse… Migration specifications are often used to produce target data using a process that must combine data from multiple source tables”). With regard to claims 36, 40, and 44 Savage further teaches wherein the user interface (Savage, Column 5, lines 53-55 “In the best mode embodiment, the human operator is given access to the method through a graphical user interface (GUI) 24, comprising a known type signal processor 26, a GUI display 28, and an operator pointer/command entry device 30 (e.g., keyboard, mouse)”) provides object selector (vonKaenel, ¶720 “The Edit Favorites pop up dialog box 8410 is used whenever the user wants to edit a data layer favorites list to appear within the Favorites tab”; Figure 84, 8410) edit boxes (vonKaenel, ¶720 “The Edit Favorites pop up dialog box 8410 is used whenever the user wants to edit a data layer favorites list to appear within the Favorites tab”; Figure 84, 8410) and browse (vonKaenel, ¶674 “Figure 78 illustrates a Data Layers UI screen 7810… The Data layers UI screen 7810 allows users to select one or more data layers to layer on top of the base map”; Figure 78, 7810) buttons (vonKaenel, ¶544 “In creating user 
interface standards… the types of buttons that will be used in the design are divided into two categories, command buttons and hyperlink buttons”) to associate as the user electing to manually enter the specification (Savage, Column 22, lines 45-51 “The data repository architecture supports the manual entry of migration specifications, since this may continue to be the primary form of creation in the near term. However, as shown in Fig. 17, upon a request 240 by the user, the migration specification 242 may be automatically generated from the data analysis results for the source database, which is stored in the data repository 200”) the transparent view object as the viewpoint table (Savage, Column 22, line 13-15 “Migration specifications are often used to produce target data using a process that must combine data from multiple source tables”; line 36 “The first table in the join path is referred to as the “viewpoint table””) with the staging table as transforming the retrieved data, commonly known as data staging (Savage, Column 24, lines 35-40 “Generally the ETL job retrieves existing data, transforms it in some manner, and uses the result to create a new collection of data. For example, a data migration process is an example of an ETL job”; Column 1, lines 60-67 “the major step in data migration is the transformation of the source data… This step typically involves “data cleansing” or “data staging” processes”; Savage, Column 22, lines 43-44 “Following the relevant processing by the ETLTOOL, the job can then be run to create the target table as specified”). 
With regard to claims 38, 42, and 46 Savage further teaches wherein a first transparent view object as a first viewpoint table (Savage, Column 22, line 13-15 “Migration specifications are often used to produce target data using a process that must combine data from multiple source tables”; line 36 “The first table in the join path is referred to as the “viewpoint table””; Please note this claim limitation has been construed in light of the objection above to mean –a first transparent view object--), of the plurality of transparent view objects as the set tables in the viewpoint table (Savage, Column 22, line 13-15 “Migration specifications are often used to produce target data using a process that must combine data from multiple source tables”; line 36 “The first table in the join path is referred to as the “viewpoint table””), spans multiple databases (Savage, Column 22, line 13-15 “Migration specifications are often used to produce target data using a process that must combine data from multiple source tables”; line 36 “The first table in the join path is referred to as the “viewpoint table””) and tables (Savage, Column 22, lines 21-36 “the architecture supports automatic identification of all possible sequences in which all required source tables might be linked, or joined. Note that source tables may be directly referenced in a migration specification (in the transformation expression), yet additional source tables often must be involved in the sequence of joining tables. This occurs when the referenced source tables are not all directly related to one another, and must be indirectly joined via other source tables. 
Each possible sequence of joining source tables is noted as an instance of the JoinPath entity 230 (with primary key 231)… The first table in the join path is referred to as the “viewpoint table””), wherein users progressively build as follow the join path (Savage, Column 22, lines 23-38 “… Each possible sequence of joining source tables is noted as an instance of the JoinPath entity 230 (with primary key 231)… The first table in the join path is referred to as the “viewpoint table””) the data shape as the single value produced by the transformation expression for the column (Savage, Column 21, lines 51-54 “the transformation expression defines the manner in which corresponding values of the referenced source columns are processed to produce a single value of the target column”; Savage, Column 22, lines 21-36 “the architecture supports automatic identification of all possible sequences in which all required source tables might be linked, or joined. Note that source tables may be directly referenced in a migration specification (in the transformation expression), yet additional source tables often must be involved in the sequence of joining tables. This occurs when the referenced source tables are not all directly related to one another, and must be indirectly joined via other source tables. Each possible sequence of joining source tables is noted as an instance of the JoinPath entity 230 (with primary key 231)… The first table in the join path is referred to as the “viewpoint table””) by nesting objects within each other (Savage, Column 25, lines 26-28 “a named dimension may consist of a group of other named dimensions, any of which in turn may consist of other dimensions”). 
With regard to claims 47, 52, and 55 Savage further teaches wherein the transparent view objects are defined in the context of one or more physical tables (Savage, Column 8, line 19-22 “At the minimum, this includes the physical and logical name of the column; the physical sequence (placement) of the column within the table”). With regard to claims 48, 53, and 56 Savage further teaches wherein the ETL mapping association object as the mapping expressions (Savage, Column 21, lines 5-8 “The migration specification is a complete and detailed collection of mapping expressions (i.e., statements of the functional relationship between a source column and a target column”) provides one or more ETL mappings as mappings (Id) between one or more of the transparent view objects as the source tables in the viewpoint table (Savage, Column 22, line 13-15 “Migration specifications are often used to produce target data using a process that must combine data from multiple source tables”; line 36 “The first table in the join path is referred to as the “viewpoint table””) and one or more physical tables as the target column within the target table (Savage, Column 21, lines 5-8 “The migration specification is a complete and detailed collection of mapping expressions (i.e., statements of the functional relationship between a source column and a target column”; Column 21, lines 51-54 “the transformation expression defines the manner in which corresponding values of the referenced source columns are processed to produce a single value of the target column”). 
With regard to claims 58 and 59 the proposed combination further teaches wherein the BI (Savage, Column 1, lines 30-33 “EDM applications include: … business intelligence (BI)”) server (Savage, Column 6, lines 9-10 “The servers 34, 36 each include a known type signal processor 38, 40 and optional viewable display 42, 44”) operates to allow BI applications to use high-level analytical queries (Savage, Column 6, lines 58-59 “The driver translates the application data queries into commands that the DBMS understands”) to scan and analyze large volumes of data (Savage, Column 5, lines 17-18 “for use in analyzing the data stored in a source database 22.”) in a data warehouse (Savage, Column 22, lines 6-8 “This can occur, for instance, if a source database is being used to produce a data warehouse in which only a portion of the data will be available”), wherein the data warehouse is sourced from multiple data source systems associated with the BI applications (Savage, Column 22, lines 12-14 “Migration specifications are often used to produce target data using a process that must combine data from multiple source tables.”), wherein the BI server operates to associate an entity in a target data as building target data (Savage, Column 22, lines 14-16 “In such cases, data from the source tables must be combined in an appropriate manner to yield correct results during the process of building the target data”) a warehouse as the data warehouse produced (Savage, Column 22, lines 6-8 “This can occur, for instance, if a source database is being used to produce a data warehouse in which only a portion of the data will be available”) with data objects from multiple data sources (Savage, Column 22, lines 14-16 “In such cases, data from the source tables must be combined in an appropriate manner to yield correct results during the process of building the target data”), by extracting data from the multiple data sources into a single staging area as storing the transformed retrieved data, 
commonly known as data staging (Savage, Column 24, lines 35-40 “Generally the ETL job retrieves existing data, transforms it in some manner, and uses the result to create a new collection of data. For example, a data migration process is an example of an ETL job”; Column 1, lines 60-67 “the major step in data migration is the transformation of the source data… This step typically involves “data cleansing” or “data staging” processes”; Savage, Column 22, lines 43-44 “Following the relevant processing by the ETLTOOL, the job can then be run to create the target table as specified”; Column 5, lines 13-18 “At the conclusion of each phase the invention makes inferences regarding the source data construct. The inferences are dependent on the analysis results, and are stored with the analysis results within the data repository 37. Succeeding phases are also, to a degree, dependent on the inferences made in the preceding phase”), where data conformance is performed before conformed data as transforming (Savage, Column 2, lines 15-16 “…manipulate ("transform") the source data in the intermediate database into the form needed for the EDM application;”) is loaded into the target data warehouse as loading the transformed source data (Savage, Column 2, lines 17-19 “(iv) load the transformed source data into the EDM application, and format it in those schemas necessary for OLAP or other EDM applications.”). Claims 50, 54 and 57 are rejected under pre-AIA 35 U.S.C. 103(a) as being unpatentable over Savage in view of vonKaenel and Porter [7051334]. 
With regard to claims 50, 54 and 57 the proposed combination further teaches wherein the ETL mapping association object as the mapping expressions (Savage, Column 21, lines 5-8 “The migration specification is a complete and detailed collection of mapping expressions (i.e., statements of the functional relationship between a source column and a target column”) provides a many-to-many ETL mapping relationship as a one-to-many relationship (Savage, Column 9, lines 18-20 “The relationships between the entities are generally "one-to-many,"”) between the plurality of transparent view objects as the source tables in the viewpoint table (Savage, Column 22, lines 13-15 “Migration specifications are often used to produce target data using a process that must combine data from multiple source tables”; line 36 “The first table in the join path is referred to as the “viewpoint table””) and a plurality of physical tables as the target column within the target table (Savage, Column 21, lines 5-8 “The migration specification is a complete and detailed collection of mapping expressions (i.e., statements of the functional relationship between a source column and a target column”; Column 21, lines 51-54 “the transformation expression defines the manner in which corresponding values of the referenced source columns are processed to produce a single value of the target column”). Savage does not explicitly teach a many-to-many relationship. Porter teaches a many-to-many ETL mapping relationship as a many-to-many relationship (Porter, Column 3, lines 27-31 “These relationships can be intermixed and multidirectional, thus defining a many-to-many relationship between business information domains”; Column 3, lines 61-62 “The Distributed ETL overcomes the many-to-many relationships that exist between information sources and targets”; Column 12, lines 5-9 “the Distributed ETL architecture of FIG. 9 is depicted in FIG. 10. 
The information domains 10, 40, 120 and 150 link to a logical hub 70 to resolve their many-to-many relationships.”) between the plurality of transparent view objects as information sources (Id) and a plurality of physical tables as targets (Id). It would have been obvious to one of ordinary skill in the art to which said subject matter pertains at the time in which the invention was filed to have implemented the relationships taught by the proposed combination to include many-to-many relationships as depicted by Porter, as it is a traditional relationship found and managed within the field of art, as detailed by Porter. “Some warehouse information can be summarized, a characteristic of OLAP summary reporting, while OLAP tools sometimes drill down into detailed warehouse tables. Because of the differing data structures required to support the differing business uses for the data, the same information may be used for multiple business functions. Therefore, the information domains are interdependent upon each other.” (Porter, Column 3, lines 17-25). “Traditional EAI applications direct the flow of information between domains through a center point called a hub, or more commonly a message broker to manage many-to-many relationships in a scalable manner.” (Porter, Column 3, lines 42-45). It is noted that the device taught by Savage is a distributed system (Savage, Column 14, lines 45-46 “They represent partial distributions of the source data.”) to which the distributed ETL process presented by Porter may be implemented to address the many-to-many relationships (Porter, Column 12, lines 5-9 “the Distributed ETL architecture of FIG. 9 is depicted in FIG. 10. The information domains 10, 40, 120 and 150 link to a logical hub 70 to resolve their many-to-many relationships.”). Response to Arguments Applicant's arguments filed January 19, 2026 have been fully considered but they are not persuasive. 
With regard to the 101, Applicant argues (Page 11 of remarks) that the BI server can maintain a plurality of metadata objects to support the ETL process. In response, what the metadata objects are does not change the fact that they are generated and input by the user. The claim is directed to those objects being provided to the BI server. The intelligence of the system is the human that is operating the system. The claimed system is merely receiving the ‘metadata objects’ from the human user, and storing them using standard generic computing devices. With regard to the 101, Applicant argues (Page 11 of remarks) that the UI is constructed dynamically by providing selector boxes for users to input information. In response, this is a generic UI. Selector boxes are the term used for boxes that enable users to select information, e.g. input it. Applicant has provided no facts or information to suggest that the UI is any different than any generic GUI that has been standard within the field of art. The UI is still merely receiving input from the user. With regard to the 101, Applicant argues (Page 11) that the UI enables the user to specify each ETL mapping type. Applicant refers to Figure 4. In response, the actual XML declarations and ETL mappings are intelligence provided by the human user of the device. The device is not performing any function beyond merely displaying a GUI that enables the system to receive the user input. This is what a GUI is known and expected to do within the field of art. The crafting of GUIs is a basic and standard practice within the field of computer science. The GUI as claimed is merely recited as providing data and receiving selections from the user, which one of ordinary skill in the art would recognize as standard GUI operations. With regard to the 101, Applicant argues that the ETL process claimed executes ETL scripts to move the data based on the view objects and ETL mapping. 
In response, the claimed ETL process appears to be a standard conventional ETL process wherein the ETL mapping specifies the translation and the ‘transparent view objects’ are the data being extracted. The claim language is recited at a high level of generality that one of ordinary skill in the art would recognize as conventional ETL processing. With regard to the prior art, Applicant argues that the prior art uses a viewpoint table, rather than a transparent view object. Applicant argues that the viewpoint table describes each possible sequence of joining source tables. In response, it is noted that the ‘transparent view object’ as recited in the claims comprises ‘metadata that represents a data shape of a joined set’. One of ordinary skill in the art would recognize the possible sequences of joins as a form of metadata that represents the ‘data shape of a joined set’, as it represents the ‘shape’ of how the sets are joined together. Applicant makes no effort to explain or clarify the meaning of the term ‘transparent view object’ or ‘data shape’. One of ordinary skill in the art would reasonably read the sets of tables in the viewpoint table as being within the broadest reasonable interpretation of ‘a plurality of transparent view objects’. Applicant asserts that they are not the same, but has been unable to articulate the scope of a ‘transparent view object’ or ‘data shape’ which would not read on the prior art’s viewpoint tables. Conclusion Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a). A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. 
In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action. Any inquiry concerning this communication or earlier communications from the examiner should be directed to AMANDA WILLIS whose telephone number is (571)270-7691. The examiner can normally be reached Monday-Friday 8am-2pm. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Ajay Bhatia can be reached at 571-272-3906. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). 
If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000. /AMANDA L WILLIS/ Primary Examiner, Art Unit 2156

Prosecution Timeline

Sep 23, 2020 — Application Filed
Jul 28, 2022 — Non-Final Rejection (§101, §103)
Feb 03, 2023 — Response Filed
Apr 03, 2023 — Final Rejection (§101, §103)
Oct 06, 2023 — Request for Continued Examination
Oct 11, 2023 — Response after Non-Final Action
May 03, 2024 — Non-Final Rejection (§101, §103)
Oct 08, 2024 — Response Filed
Nov 06, 2024 — Final Rejection (§101, §103)
May 09, 2025 — Request for Continued Examination
May 12, 2025 — Response after Non-Final Action
Jul 16, 2025 — Non-Final Rejection (§101, §103)
Jan 19, 2026 — Response Filed
Feb 02, 2026 — Final Rejection (§101, §103) (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602380 — SUBSUMPTION OF VIEWS AND SUBQUERIES — granted Apr 14, 2026 (2y 5m to grant)
Patent 12585675 — HYBRID POSITIONAL POSTING LISTS — granted Mar 24, 2026 (2y 5m to grant)
Patent 12579206 — AUTOMATIC ARTICLE ENRICHMENT BY SOCIAL MEDIA TRENDS — granted Mar 17, 2026 (2y 5m to grant)
Patent 12461960 — SYSTEMS AND METHODS FOR MACHINE LEARNING-BASED CLASSIFICATION AND GOVERNANCE OF UNSTRUCTURED DATA USING CURATED VIRTUAL QUEUES — granted Nov 04, 2025 (2y 5m to grant)
Patent 12443613 — REDUCING PROBABILISTIC FILTER QUERY LATENCY — granted Oct 14, 2025 (2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.

Prosecution Projections

Expected OA Rounds: 7-8
Grant Probability: 36%
With Interview: 62% (+26.6%)
Median Time to Grant: 4y 8m
PTA Risk: High
Based on 345 resolved cases by this examiner. Grant probability derived from career allow rate.
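The interview-adjusted figure is consistent with a simple additive model: the examiner's career allow rate (123 granted of 345 resolved) plus the 26.6-point interview lift. A minimal sketch of that arithmetic, assuming the dashboard combines the two numbers additively (the helper function and the cap at 100% are illustrative assumptions, not the tool's documented methodology):

```python
# Sketch of the projection arithmetic under an assumed additive model.
# interview_adjusted() is a hypothetical helper, not part of any tool API.

def interview_adjusted(base_rate: float, lift: float) -> float:
    """Base grant probability plus interview lift, capped at 100%."""
    return min(base_rate + lift, 1.0)

base = 123 / 345   # career allow rate: ~35.7%, displayed as 36%
lift = 0.266       # +26.6% lift among resolved cases with an interview

print(f"{base:.0%}")                            # 36%
print(f"{interview_adjusted(base, lift):.0%}")  # 62%
```

Note that adding the lift to the rounded 36% would display as 63%, so the 62% shown suggests the unrounded allow rate feeds the calculation.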
