Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Continued Examination Under 37 CFR 1.114
A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 12/19/2025 has been entered.
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.
Claims 1-7 are rejected under 35 U.S.C. 102(a)(1)/102(a)(2) as being anticipated by Liu et al. (WO 2018/035139).
Regarding Claim 1, Liu discloses a system comprising:
a query router operatively coupled to an embedding generator comprising a neural network, the query router configured to ([0025], Liu):
receive a search query associated with an entity from an end user system ([0025], Liu);
request, in response to the search query, the neural network to generate a query embedding based on the search query ([0037], Liu);
and
an index manager operatively coupled to the query router, a plurality of search provider sources associated with the entity and the embedding generator, wherein the index manager comprises a plurality of structured data (SD) elements associated with the entity stored in an indexed data store ([0078], Liu), the index manager configured to:
request the embedding generator to return a plurality of embedding vectors for the plurality of SD elements ([0078], Liu), wherein the plurality of embedding vectors for the plurality of SD elements are generated by the neural network ([0067], “the training search query is processed by the first machine learned model of the neural network. In operation 530, the first machine learned model outputs a training semantic search vector,” and [0068], “the training publication is processed by the second machine learned model of the neural network. In operation 560, the second machine learned model outputs a training publication semantic vector;” wherein Liu discloses two machine learned models in the same neural network; Liu);
update the indexed data store such that each of the plurality of SD elements is associated with a corresponding embedding vector of the plurality of embedding vectors ([0078] and [00110], Liu);
generate, by an indexing service module, an entity-specific index for the plurality of SD elements, such that each of the plurality of SD elements is associated with a name field associated with the entity and the corresponding embedding vector ([0079], Liu);
identify, in response to a request by the query router, a candidate SD element as a search result in response to the search query, wherein the identify comprises comparing the query embedding of the search query associated with the entity-specific index to the plurality of embedding vectors of the plurality of SD elements using a distance measurement, resulting in a first set of rankings having ([00123], “L1 ranking 1330 is performed after L0 filtration 1310. At 1332, a forward index is generated of semantic vectors for the candidate publications from 1320,” wherein the L1 ranking is an example of the first set of rankings claimed; Liu) an ordered list of SD elements within each search provider source of the plurality of search provider sources corresponding to the search query based on the distance measurement between the query embedding and the plurality of embedding vectors ([0078], [00102], [00104], and [00109], Liu), wherein the identify further comprises generating a second set of rankings associated with the plurality of search provider sources based on search provider source criteria ([00125], “L2 aspect filtration 1340 is performed after L1 ranking 1330. At 1342, a forward aspect index 1342 extends beyond aspects provided by seller to find the most relevant candidate publications. Additional aspects are found from the publication via NER (name entity recognition),” wherein the L2 is an example of the second set of rankings claimed; Liu); and
generating the search result in response to the search query, wherein the search result comprises the candidate SD element within the ordered list having a smallest distance based on the first and the second set of rankings ([0082], [0092], [00106], [00109], [00127], “L3 final ranking 1350 is performed after L2 aspect filtration 1340. The L3 ranking module 1352 is a machine-learned model that takes inputs such as historic user behavior data, L0 score, BM25 score, L1 score, reputation score, value, publication condition, user profile, session context, and demotion signal. Finally, at 1360, the most relevant N publications are returned to the user device, based on L3 ranking 1350,” wherein the L3 is based on L2, which is based on L1; Liu).
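By way of illustration only, the distance-measurement comparison and ordered list mapped above can be sketched as follows. The function names and the choice of cosine distance are assumptions of this sketch, not teachings of Liu and not limitations of the claims:

```python
import math

def cosine_distance(a, b):
    # 1 minus cosine similarity; a smaller value means the vectors are closer
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return 1.0 - dot / (norm_a * norm_b)

def rank_elements(query_vec, indexed_elements):
    # indexed_elements: list of (element_name, embedding_vector) pairs
    scored = [(name, cosine_distance(query_vec, vec))
              for name, vec in indexed_elements]
    scored.sort(key=lambda pair: pair[1])  # ordered list, smallest distance first
    return scored

# Hypothetical SD elements with toy embedding vectors
elements = [("faq_billing", [0.9, 0.1]), ("faq_login", [0.2, 0.8])]
ranked = rank_elements([1.0, 0.0], elements)
best = ranked[0][0]  # candidate SD element with the smallest distance
```

In this toy example the query vector lies closest to the first element's vector, so that element heads the ordered list.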
Regarding Claim 2, Liu discloses a system of claim 1, further comprising:
a cluster generator configured to ([0074]-[0075], Liu):
compare a first embedding vector associated with a first SD element to a second embedding vector associated with a second SD element ([0078]-[0082], Liu); and
identify a cluster comprising the first and the second SD element wherein a vector representation of the first embedding vector is within a threshold in comparison to the vector representation of the second embedding vector ([0078]-[0082], Liu).
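By way of illustration only, the threshold comparison of two embedding vectors recited in claim 2 can be sketched as follows. The use of Euclidean distance and the sample values are assumptions of this sketch, not teachings of Liu:

```python
import math

def within_threshold(vec_a, vec_b, threshold):
    # Euclidean distance between two embedding vectors (illustrative choice)
    dist = math.sqrt(sum((x - y) ** 2 for x, y in zip(vec_a, vec_b)))
    return dist <= threshold

# Two SD elements join one cluster when their vector representations
# fall within the threshold of one another
first = [0.10, 0.20, 0.30]
second = [0.12, 0.19, 0.31]
cluster = [first, second] if within_threshold(first, second, threshold=0.05) else [first]
```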
Regarding Claim 3, Liu discloses a system of claim 1, wherein the query router is further configured to:
generate an interface for a display of the search result that is identified by the index manager, wherein the display of the search result comprises a ranked listing of the plurality of SD elements based on the ordered list ([00122], [00127], Liu).
Regarding Claim 4, Liu discloses a system of claim 1, wherein the index manager is further configured to:
generate, based on the comparing, scores representing a level of matching between the query embedding and the plurality of embedding vectors of the plurality of SD elements ([0072], Liu).
Regarding Claim 5, Liu discloses a system of claim 1, wherein information from the plurality of search provider sources is converted to SD elements associated with the entity ([0072], Liu), and wherein the plurality of search provider sources comprise the entity and one or more third-party search providers (Fig. 1, 130, 110, and 102, [0023], [0028], “the third-party application 132, utilizing information retrieved from the networked system 102, supports one or more features or functions on a website hosted by the third party. The third-party website, for example, provides one or more functions that are supported by the relevant applications of the networked system 102,” Liu).
Regarding Claim 6, Liu discloses a system of claim 1, wherein the embedding generator is configured to:
continuously generate embedding vectors for the plurality of SD elements and store the embedding vectors into the indexed data store, wherein the generation of embedding vectors occurs in advance of receiving the search query ([0078], Liu).
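By way of illustration only, generating and storing embedding vectors in advance of any search query, as mapped above for claim 6, can be sketched as follows. The `embed` stand-in and the dictionary store are hypothetical placeholders for the neural-network embedding generator and the indexed data store:

```python
def embed(text):
    # Toy deterministic stand-in for a neural-network embedding generator:
    # averages character codes into two positional buckets
    buckets = [0.0, 0.0]
    for i, ch in enumerate(text):
        buckets[i % 2] += ord(ch)
    n = max(len(text), 1)
    return [b / n for b in buckets]

indexed_data_store = {}

def index_sd_elements(sd_elements):
    # Precompute and store an embedding for each SD element ahead of any query,
    # so query-time work reduces to a distance comparison against stored vectors
    for name, text in sd_elements.items():
        indexed_data_store[name] = {"text": text, "embedding": embed(text)}

index_sd_elements({"faq_1": "how do I reset my password"})
```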
Regarding Claim 7, Liu discloses a system of claim 1, wherein the neural network of the embedding generator can be a bidirectional encoder representations (BERT) system, a fastText system, a Word2Vec system, or a Healthcare Word2Vec system ([0078], Liu).
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries set forth in Graham v. John Deere Co., 383 U.S. 1, 148 USPQ 459 (1966), that are applied for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claim 8 is rejected under 35 U.S.C. 103 as being unpatentable over Liu et al. (WO 2018/035139) in view of Mass et al. (US 2021/0390418).
Regarding Claim 8, Liu discloses all the limitations as discussed above but does not expressly disclose a previously answered frequently asked question associated with the query. Mass discloses: a previously answered frequently asked question associated with the query ([0017] and [0029], Mass). It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to modify the system of Liu by incorporating a previously answered frequently asked question associated with the query, as disclosed by Mass, in order to help users find relevant answers to common questions ([0017], Mass). See: KSR International Co. v. Teleflex Inc., 550 U.S. 398, 82 USPQ2d 1385, 1396 (2007); MPEP § 2143.
Response to Arguments
Applicant argues that: “Liu does not describe any ranking of either the plurality of SD elements or the plurality of search provider sources. Since at least one claim feature is not disclosed or adequately suggested by Liu, claim 1 patentably distinguishes from Liu and is in condition for allowance.”
The Examiner respectfully disagrees. The applied art does disclose: resulting in a first set of rankings having ([00123], “L1 ranking 1330 is performed after L0 filtration 1310. At 1332, a forward index is generated of semantic vectors for the candidate publications from 1320,” wherein the L1 ranking is an example of the first set of rankings claimed; Liu) an ordered list of SD elements within each search provider source of the plurality of search provider sources corresponding to the search query based on the distance measurement between the query embedding and the plurality of embedding vectors ([0078], [00102], [00104], and [00109], Liu), wherein the identify further comprises generating a second set of rankings associated with the plurality of search provider sources based on search provider source criteria ([00125], “L2 aspect filtration 1340 is performed after L1 ranking 1330. At 1342, a forward aspect index 1342 extends beyond aspects provided by seller to find the most relevant candidate publications. Additional aspects are found from the publication via NER (name entity recognition),” wherein the L2 is an example of the second set of rankings claimed; Liu).
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to GIOVANNA B COLAN whose telephone number is (571)272-2752. The examiner can normally be reached on Mon - Fri 8:30-5:00.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Aleksandr Kerzhner can be reached on (571) 270-1760. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see https://ppair-my.uspto.gov/pair/PrivatePair. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/GIOVANNA B COLAN/Primary Examiner, Art Unit 2165
March 18, 2026