DETAILED ACTION
This Office action is in response to the amendments filed on 10/17/2025.
Claims 1 and 3-20 are currently pending and have been examined.
Claim 2 is cancelled.
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Arguments
Applicant's arguments filed 10/17/2025 have been fully considered but they are not persuasive.
With respect to applicant’s arguments on pages 11-15 of remarks filed 10/17/2025 that the claims improve technology because the custom attributes improve the relevance of search results and reduce the number of subsequent search queries, which reduces computer resource consumption, Examiner respectfully disagrees.
If it is asserted that the invention improves upon conventional functioning of a computer, or upon conventional technology or technological processes, a technical explanation as to how to implement the invention should be present in the specification. That is, the disclosure must provide sufficient details such that one of ordinary skill in the art would recognize the claimed invention as providing an improvement. The specification need not explicitly set forth the improvement, but it must describe the invention such that the improvement would be apparent to one of ordinary skill in the art. Conversely, if the specification explicitly sets forth an improvement but in a conclusory manner (i.e., a bare assertion of an improvement without the detail necessary to be apparent to a person of ordinary skill in the art), the examiner should not determine the claim improves technology. An indication that the claimed invention provides an improvement can include a discussion in the specification that identifies a technical problem and explains the details of an unconventional technical solution expressed in the claim, or identifies technical improvements realized by the claim over the prior art. See MPEP 2106.05(a).
To show that the involvement of a computer assists in improving the technology, the claims must recite the details regarding how a computer aids the method, the extent to which the computer aids the method, or the significance of a computer to the performance of the method. Merely adding generic computer components to perform the method is not sufficient. Thus, the claim must include more than mere instructions to perform the method on a generic component or machinery to qualify as an improvement to an existing technology. See MPEP 2106.05(a) and (f).
Applicant’s specification makes a bare assertion of an improvement to reducing computing resource consumption by improving the relevancy of search results, which eliminates or reduces repetitive search queries. (Applicant’s specification paragraph [0034]). However, it is unclear to one of ordinary skill in the art how technology is improved by improving the relevancy of search results. The claims do not recite any technical limitations on how repetitive search queries are eliminated or reduced to reduce computing resources. For example, claim 1 only recites that the computing device is used as a tool to implement the abstract idea of searching for items based on a custom attribute. Therefore, the claims do not integrate the judicial exception into a practical application because the claims do not appear to reflect the asserted improvement of reducing resource consumption and merely use the computing device as a tool to implement the abstract idea.
With respect to applicant’s arguments on pages 16-17 of remarks filed 10/17/2025 that Chaturvedi does not teach a user interacting with an image of an item from a product catalog, Examiner respectfully disagrees.
In response to applicant's argument that the references fail to show certain features of the invention, it is noted that the features upon which applicant relies (i.e., a user interacting with an image of an item from a product catalog) are not recited in the rejected claim(s). Although the claims are interpreted in light of the specification, limitations from the specification are not read into the claims. See In re Van Geuns, 988 F.2d 1181, 26 USPQ2d 1057 (Fed. Cir. 1993).
With respect to applicant’s arguments on pages 17-18 of remarks filed 10/17/2025 that the prior art does not teach “selecting a portion of the target item for custom attribute extraction” because Chaturvedi teaches selecting an item from a search result and does not teach determining attributes from an image of an item from an item or product database displayed to the user, since Chaturvedi determines attributes from images of the real environment rather than from an image of an item, Examiner respectfully disagrees.
Chaturvedi teaches selecting a portion of the target item for custom attribute extraction because this reference teaches that the user interacts with the display screen, as shown in Fig. 2, to select a portion of the image (e.g., a portion of the couch) of the scene, which is used to identify and extract a patch of the image to identify scene and color information that is then used to filter the types of items to be searched. Chaturvedi teaches that the physical environment from which the attributes are extracted is not limited to a captured image but can include image data from different sources rather than just images captured by the camera of the computing device (e.g., a catalog of images or images from other third party sources). (Chaturvedi, FIG. 2, [0042]; [0038]; [0039]; [0043]; [0061]; [0062]; [0035]; [0058]; [0121]).
With respect to applicant’s arguments on page 18 of remarks filed 10/17/2025 that applicant’s arguments were not against the references individually and that neither reference, nor their combination, teaches processing images of items from an item listing datastore to determine attribute values for a custom attribute for each item, where the attribute value is not already present in the item data for the items in the item listing datastore, Examiner respectfully disagrees.
Where a rejection of a claim is based on two or more references, a reply that is limited to what a subset of the applied references teaches or fails to teach, or that fails to address the combined teaching of the applied references may be considered to be an argument that attacks the reference(s) individually. See MPEP 2145(IV).
Applicant’s arguments were against the references individually on page 6 of remarks filed 05/13/2025 because the rejection is based on two references, and Applicant’s reply failed to address the combined teaching of the applied references, instead attacking each reference individually as not teaching the entire limitation that had been rejected using the combination of references.
In response to applicant’s argument that there is no teaching, suggestion, or motivation to combine the references, the examiner recognizes that obviousness may be established by combining or modifying the teachings of the prior art to produce the claimed invention where there is some teaching, suggestion, or motivation to do so found either in the references themselves or in the knowledge generally available to one of ordinary skill in the art. See In re Fine, 837 F.2d 1071, 5 USPQ2d 1596 (Fed. Cir. 1988), In re Jones, 958 F.2d 347, 21 USPQ2d 1941 (Fed. Cir. 1992), and KSR International Co. v. Teleflex, Inc., 550 U.S. 398, 82 USPQ2d 1385 (2007). In this case, Chaturvedi teaches analyzing the portions of the image to determine color and scene information and providing the types of items or subsequent items that include one or more colors that are visually similar to a color from the extracted patch of the captured image. Color distances from the colors in the image are processed and compared to colors using product information that includes information about items of that type stored in the database to provide visually similar types of items that may be provided as item data, and related items or products may be subsequently provided. (Chaturvedi, [0038]; [0052]; [0053]; [0120]; [0121]). Chaturvedi teaches the custom attribute, but Chaturvedi is not relied upon to teach a custom attribute that is not currently available. Rather, Erickson is relied upon to teach a custom attribute that is not currently available because this reference teaches determining that a derived attribute is unavailable (Erickson, [0017]; [0057]). Therefore, it would have been obvious to have modified the custom attribute of Chaturvedi with a custom attribute that is not currently available as taught by Erickson because the results of such a modification would be predictable.
With respect to applicant’s arguments on pages 18-19 of remarks filed 10/17/2025 that the claim does recite the feature of generating search results for items based on attribute values of a custom attribute determined by analyzing images of the items because the claim recites: "analyzing, by at least one server of the one or more servers, the image of the target item to determine an attribute corresponding to the portion of the target item as a custom attribute that is not currently available as a search facet for a search engine of the listing platform...," "analyzing, by at least one server of the one or more servers, the image of each of the plurality of other items to determine an attribute value of the custom attribute for each of the plurality of other items that is not available in the item data...,” and "generating, by the search engine on at least one server of the one or more servers, a set of search results based at least in part on the attribute value of the custom attribute for each of the plurality of other items...," Examiner respectfully disagrees.
Applicant’s arguments on pages 6-7 filed 05/13/2025 argued the specific limitations of “generating, by the search engine on at least one server of the one or more servers, a set of search results based at least in part on the attribute value of the custom attribute for each of the plurality of other items; and providing, by at least one server of the one or more servers, a search results user interface presenting the set of search results for presentation on the user device,” which is not the same as the features upon which applicant relies (i.e., generating search results for items based on attribute values of a custom attribute determined by analyzing images of the items). Therefore, Examiner had interpreted the feature upon which Applicant relied to be the recited claim limitations that Applicant was arguing.
With respect to applicant’s arguments on page 19 of remarks filed 10/17/2025 that Chaturvedi and Erickson, either alone or in combination, fail to teach the independent claims or dependent claims, Examiner respectfully disagrees.
Applicant's arguments fail to comply with 37 CFR 1.111(b) because they amount to a general allegation that the claims define a patentable invention without specifically pointing out how the language of the claims patentably distinguishes them from the references.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1 and 3-20 are rejected under 35 U.S.C. 101 because the claimed invention is directed to a judicial exception (an abstract idea) without significantly more.
Under Step 1 of the Subject Matter Eligibility Test, it must be considered whether the claims are directed to one of the four statutory classes of invention. See MPEP § 2106. In the instant case, claims 1 and 3-11 are directed to computer storage media (Applicant’s specification [0086]: computer storage media does not comprise signals per se), claims 12-16 are directed to a method, and claims 17-20 are directed to a system, each of which falls within one of the four statutory categories of invention (manufacture, process, and machine, respectively). Accordingly, the claims will be further analyzed under revised Step 2:
Under Step 2A (Prong One) of the Subject Matter Eligibility Test, it must be considered whether the claims recite a judicial exception and, if so, it must be determined in Prong Two whether the recited judicial exception is integrated into a practical application of that exception. If the claim recites a judicial exception (i.e., an abstract idea), the claim requires further analysis in Prong Two. One of the enumerated groupings of abstract ideas is certain methods of organizing human activity, which includes fundamental economic principles or practices (including hedging, insurance, mitigating risk); commercial or legal interactions (including agreements in the form of contracts; legal obligations; advertising, marketing or sales activities or behaviors; business relations); and managing personal behavior or relationships or interactions between people (including social activities, teaching, and following rules or instructions). See MPEP § 2106.04(a)(2).
Regarding representative independent claim 1, the abstract idea includes:
…wherein the target item is a first item type from a plurality of item types available on the listing platform;
receiving,…, user input associated with the image of the target item selecting a portion of the target item for custom attribute extraction;
responsive to the user input, analyzing,…, the image of the target item to determine an attribute corresponding to the portion of the target item as a custom attribute that is not currently available as a search facet for a search engine of the listing platform to search the item listing datastore for item listings for items of the first item type;
querying, …, the item listing datastore to identify a plurality of other item listings for a plurality of other items of the first item type;
accessing, …, an image for each of the plurality of other items from item data for the plurality of other item listings…;
analyzing,…, the image of each of the plurality of other items to determine an attribute value of the custom attribute for each of the plurality of other items that is not available in the item data for the plurality of other item listings…;
generating, …, a set of search results based at least in part on the attribute value of the custom attribute for each of the plurality of other items.
This arrangement amounts to certain methods of organizing human activity associated with sales activities and commercial interactions involving searching for items based on custom attributes determined by analyzing an image for a custom attribute based on user input. Such concepts have been considered by the Courts to be ineligible certain methods of organizing human activity. See MPEP § 2106.
Step 2A (Prong Two) of the Subject Matter Eligibility Test is the next step in the eligibility analysis and looks at whether the abstract idea is integrated into a practical application. This requires an additional element or combination of additional elements in the claims to apply, rely on, or use the judicial exception in a manner that imposes a meaningful limit on the judicial exception, such that the claim is more than a drafting effort designed to monopolize the exception. See MPEP § 2106.
In this instance, the claims recite additional elements such as:
One or more computer storage media storing computer-useable instructions that, when used by a computing device, cause the computing device to perform operations, the operations comprising (Claim 1);
providing, by at least one server of one or more servers of a listing platform, a user interface presenting an image of a target item for presentation on a user device, wherein the image of the target item is accessed from item data for a first item listing stored in an item listing datastore of the listing platform, and …;…via the user interface…;…by at least one server of the one or more servers…;….by the search engine on at least one server of the one or more servers…; …, by at least one server of the one or more servers… stored in the item listing datastore;… by at least one server of the one or more servers… stored in the item listing datastore; … by the search engine on at least one server of the one or more servers…; providing, by at least one server of the one or more servers, a search results user interface presenting the set of search results for presentation on the user device (Claims 1, 12, and 17);
the search results user interface presenting the set of search results (Claims 5, 14, and 18);
providing the search results user interface presenting the set of search results (Claims 6-8, 15-16, and 19-20);
storing data regarding the attribute values for the custom attributes for the other items in an item listing datastore (Claim 9);
a user interface (Claim 10);
A computer-implemented method comprising: … storing in the item listing datastore, by at least one server of the one or more servers (Claim 12);
A computer system of a listing platform comprising: an item listing data store storing item data for a plurality of items of a plurality of item types; and one or more servers having one or more processors and one or more computer storage media storing computer-useable instructions that, when used by one or more processors, causes the computer system to perform operations comprising: … (Claim 17).
However, these elements do not amount to an improvement in the functioning of a computer or any other technology or technical field, apply the judicial exception with, or by use of, a particular machine, or apply or use the judicial exception in some other meaningful way beyond generally linking the use of the judicial exception to a particular technological environment, such that the claim as a whole is more than a drafting effort designed to monopolize the exception.
Independent claims and dependent claims also fail to recite elements which amount to an improvement in the functioning of a computer or any other technology or technical field, apply the judicial exception with, or by use of, a particular machine, or apply or use the judicial exception in some other meaningful way beyond generally linking the use of the judicial exception to a particular technological environment, such that the claim as a whole is more than a drafting effort designed to monopolize the exception. For example, independent claims and dependent claims are directed to the abstract idea itself and do not amount to an integration according to any one of the considerations above.
Step 2B is the next step in the eligibility analysis and evaluates whether the claims recite additional elements that amount to an inventive concept (i.e., “significantly more”) than the recited judicial exception. According to Office procedure, revised Step 2A overlaps with Step 2B, and thus, many of the considerations need not be re-evaluated in Step 2B because the answer will be the same. See MPEP § 2106.
In Step 2A, several additional elements were identified as additional limitations:
One or more computer storage media storing computer-useable instructions that, when used by a computing device, cause the computing device to perform operations, the operations comprising (Claim 1);
providing, by at least one server of one or more servers of a listing platform, a user interface presenting an image of a target item for presentation on a user device, wherein the image of the target item is accessed from item data for a first item listing stored in an item listing datastore of the listing platform, and …;…via the user interface…;…by at least one server of the one or more servers…;….by the search engine on at least one server of the one or more servers…; …, by at least one server of the one or more servers… stored in the item listing datastore;… by at least one server of the one or more servers… stored in the item listing datastore; … by the search engine on at least one server of the one or more servers…; providing, by at least one server of the one or more servers, a search results user interface presenting the set of search results for presentation on the user device (Claims 1, 12, and 17);
the search results user interface presenting the set of search results (Claims 5, 14, and 18);
providing the search results user interface presenting the set of search results (Claims 6-8, 15-16, and 19-20);
storing data regarding the attribute values for the custom attributes for the other items in an item listing datastore (Claim 9);
a user interface (Claim 10);
A computer-implemented method comprising: … storing in the item listing datastore, by at least one server of the one or more servers (Claim 12);
A computer system of a listing platform comprising: an item listing data store storing item data for a plurality of items of a plurality of item types; and one or more servers having one or more processors and one or more computer storage media storing computer-useable instructions that, when used by one or more processors, causes the computer system to perform operations comprising: … (Claim 17).
These additional limitations, including the limitations in the independent claims and dependent claims, do not amount to an inventive concept because the recitations above do not amount to an improvement in the functioning of a computer or any other technology or technical field, apply the judicial exception with, or by use of, a particular machine, or apply or use the judicial exception in some other meaningful way beyond generally linking the use of the judicial exception to a particular technological environment, such that the claim as a whole is more than a drafting effort designed to monopolize the exception. In addition, they were already analyzed under Step 2A and did not amount to a practical application of the abstract idea.
For these reasons, the claims are rejected under 35 U.S.C. 101.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claim(s) 1 and 3-20 are rejected under 35 U.S.C. 103 as being unpatentable over Chaturvedi et al. (US Pub. No. 20200258144 A1, hereinafter “Chaturvedi”) in view of Erickson et al. (US Pub. No. 20230128276 A1, hereinafter “Erickson”).
Regarding claims 1, 12, and 17
Chaturvedi discloses one or more computer storage media storing computer-useable instructions that, when used by a computing device, cause the computing device to perform operations, the operations comprising (Chaturvedi, [0122]: computer-readable storage medium):
providing, by at least one server of one or more servers of a listing platform, a user interface presenting an image of a target item for presentation on a user device, wherein the image of the target item is accessed from item data for a first item listing stored in an item listing datastore of the listing platform, and wherein the target item is a first item type from a plurality of item types available on the listing platform (Chaturvedi, [0061] The computing device 304 displays the types of items 308 on a display screen 306 of the computing device 304; [0058]: select from the types of items or products of an item or product database; [0037]: a database 222 of items or products that may include items or products sold directly or by a third party or provider that submits its inventory information of items and products; [0095]: The set of images may be from an image database that stores images; [0086]: images received from previously stored data; [0035]: server and image from catalog of images; [0121]: datastore includes databases storing data related to product data);
receiving, via the user interface, user input associated with the image of the target item, selecting a portion of the target item for custom attribute extraction; responsive to the user input, analyzing, by at least one server of the one or more servers, the image of the target item to determine an attribute corresponding to the portion of the target item as a custom attribute…as a search facet for a search engine on the listing platform to search the item listing datastore for item listings for items of the first item type (Chaturvedi, FIG. 2, [0042]: the user 202 can interact with a display screen of the computing device 204 to manually define the portion or region, e.g., a circular or rectangular portion 226A, 226B (e.g. includes a portion of the couch), in the image 224 from which scene and/or color information may be identified; [0045]: scene information may include a room with furniture; [0038]: the user 202 interacts with the display screen of the computing device 204, through a software application executing on the computing device 204 to select one or more portions 226A, 226B of the image 224 for determining spatial information (e.g., dimensions), color information, and scene information as to the portions 226A, 226B; [0039]: a portion of the image 226 may be selected by using a zoom mode and when the zoom mode is used during capture, more features and clarity is obtained to help the system identify objects, available space, and colors; [0043] Extracting a patch from the image 224 can also allow for identification of a spectrum of colors that are present in the image 224 or in the patch of image 224. 
Further, the user 202 may seek to use colors from one portion 226B of the image, and the scene information and colors extracted are used to filter the types of items; [0061]: user defines aspects for item; [0062]: selected or determined aspects are used to search for items with associated aspects as a visual search to the server for analysis and determination of items from stored information for each type of item in the types of items; [0035]: server and image from catalog of images; [0058]: select from the types of items or products of an item or product database; [0121]: datastore includes databases storing data related to product data);
querying, by the search engine on at least one server of the one or more servers, the item listing datastore to identify a plurality of other item listings for a plurality of other items of the first item type; accessing, by at least one server of the one or more servers, an image for each of the plurality of other items from item data for the plurality of other item listings stored in the item listing datastore (Chaturvedi, [0062]: visual search from stored information for each type of item in the types of items to match items with visual aspects of image and display and generate item listing with matching items; [0120]: server 908 can include any appropriate hardware and software for integrating with the data store 918 and handling the image data and/or visual queries to access graphics from datastore; [0121]: datastore includes databases storing data related to product data and in response to user submitting a search, access stored information about items; [0035]: server and image from catalog of images; [0058]: select from the types of items or products of an item or product database);
analyzing, by at least one server of the one or more servers, the image of each of the plurality of other items to determine an attribute value of the custom attribute for each of the plurality of other items …in the item data for the plurality of other item listings stored in the item listing datastore (Chaturvedi, [0038]: Thereafter, the types of items 208 may be provided as item data 212, based on the portion(s) 226A, 226B of the image 224, and related items or products may be subsequently provided based in part on a selection of one or more types of items from the types of items 208; [0052]: The product search system or server 220 determines a color family histogram for at least a portion of the pixels of the image data based on color information scaled with weighed factor value; [0053]: generating or providing the types of items or items with visually similar colors from the image data that is compared with known color information; [0120]: server 908 access the data store 918 and handling the image data and/or visual queries to access graphics from datastore; [0121]: datastore includes databases storing data related to product data and in response to user submitting a search, access stored information about items); and
generating, by the search engine on at least one server of the one or more servers, a set of search results based at least in part on the attribute value of the custom attribute for each of the plurality of other items; and providing, by at least one server of the one or more servers, a search results user interface presenting the set of search results for presentation on the user device (Chaturvedi, [0062]: server generates and displays items on the display of the computing device based on performing a visual search based on features of the image; [0121]: submit a search request for a certain type of item. In this case, the data store might access the user information to verify the identity of the user and can access the catalog detail information to obtain information about items of that type. The information can then be returned to the user, such as in a results listing on a web page that the user is able to view via a browser on the computing device 902).
Chaturvedi does not teach:
a custom attribute that is not currently available… (emphasis added);
the custom attribute … that is not available…(emphasis added);
However, Erickson teaches:
a custom attribute that is not currently available (emphasis added); the custom attribute … that is not available (emphasis added) (Erickson, [0017]: determines the unavailability of the derived attribute in the derived attribute cache; [0057]: determines that the derived attribute 314 is unavailable).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to have modified the custom attribute to search for items of Chaturvedi with a custom attribute that is not currently available as taught by Erickson because the results of such a modification would be predictable. Specifically, Chaturvedi would continue to teach the custom attribute to search for items, except that now a custom attribute that is not currently available is used according to the teachings of Erickson in order to determine if the derived attribute is unavailable. This is a predictable result of the combination. (Erickson, [0057]).
Regarding claims 12 and 17
Claims 12 and 17 are substantially similar to independent claim 1. However, claim 12 recites an additional limitation: storing, in the item listing datastore, by at least one server of the one or more servers, data associating the attribute values for the custom attributes with the other items (Chaturvedi, [0099]; [0026]; [0121]). Additionally, claim 17 recites an additional limitation: an item listing data store storing item data for a plurality of items of a plurality of item types (Chaturvedi, [0099]; [0026]; [0121]).
Regarding claim 3
The combination of Chaturvedi and Erickson teaches the one or more computer storage media of claim 1, wherein analyzing the image of the target item to determine the attribute of the target item comprises analyzing the portion of the target item (Chaturvedi, [0038]: processing of the one or more portion(s) 226A, 226B (from one or more different locations or directions of the camera) in the image data 210, instead of the whole image data from the image 224, when the user 202 selects one or more portions 226A, 226B of the image 224 for determining spatial information (e.g., dimensions), color information, and scene information as to the portions 226A, 226B; [0039]: a portion of the image 226 may be selected, and more features and clarity are obtained to help the system identify objects, available space, and colors).
Regarding claim 4
The combination of Chaturvedi and Erickson teaches the one or more computer storage media of claim 3, wherein the attribute of the target item comprises a dimension of the portion of the target item determined using one or more known dimensions of the target item (Chaturvedi, [0139]: the relative sizes of an object's features are known; [0038]: processing of the one or more portion(s) 226A, 226B (from one or more different locations or directions of the camera) in the image data 210, instead of the whole image data from the image 224, when the user 202 selects one or more portions 226A, 226B of the image 224 for determining spatial information (e.g., dimensions); [0061]: the system will be able to use the previously detailed AR measurement features to scale the spatial area and to consider items (with the selected one or more types of items) to fit those spaces).
Regarding claims 5, 14, and 18
The combination of Chaturvedi and Erickson teaches the one or more computer storage media of claim 1, wherein the search results user interface presenting the set of search results comprises presenting an indication of the attribute value of the custom attribute corresponding with each search result (Chaturvedi, [0062]: The selected or the determined aspects and the selected types of items may be provided as a visual search and determination of items or products that are associated with these two features and display best matches of item listings on display; FIG. 4E, [0077]: display a slider filter 486 to further filter the items available by various aspects. The slide filter 486 provides fine tuning to the various aspects 486A-C, including dimensions, colors, etc., determined to generate the at least two items 484A, 484B. However, the slide filter 486 may also be used to add additional aspects to further filter the items in the Item Listing; [0035]: display screen).
Regarding claim 6
The combination of Chaturvedi and Erickson teaches the one or more computer storage media of claim 5, wherein the custom attribute comprises a dimension of a portion of the other items, and wherein providing the search results user interface presenting the set of search results comprises presenting, within an image for each search result, a size reference image providing an indication of a relative size of a corresponding item for the search result (Chaturvedi, [0084]: filter by size and an input via an input feature (similar to the prior examples of FIGS. 4A, 4C) confirms the selection of the item 482A provided by the user and causes the alternate item 484B to be displayed in the space as an updated overlay to the curated environment; [0090]: generates items that fit the available spaces based on item size information; [0061]: the system will be able to use the previously detailed AR measurement features to scale the spatial area and to consider items (with the selected one or more types of items) to fit those spaces; [0121]: submit a search request for a certain type of item and display item in overlay; [0038]: processing of the one or more portion(s) 226A, 226B (from one or more different locations or directions of the camera) in the image data 210, instead of the whole image data from the image 224, when the user 202 selects one or more portions 226A, 226B of the image 224 for determining spatial information (e.g., dimensions); [0035]: display screen).
Regarding claims 7, 15, and 19
The combination of Chaturvedi and Erickson teaches the one or more computer storage media of claim 1, wherein providing the search results user interface presenting the set of search results comprises ordering the search results based at least in part on the attribute values of the custom attribute for at least a portion of the other items (Chaturvedi, [0062]: visual search to find items based on features. The order of identification of the items or the products may be reversed—first to the types of items and then the aspects—in an implementation. In either implementation, the items or the products may be filtered to a best match (e.g., an item matching as many of the selected or the determined aspects ranks the item at the top or places the item in prominence in the overlay for the AR view and displayed to user on display; [0057]: display items in a higher numerical order; [0035]: display screen).
Regarding claims 8, 16, and 20
The combination of Chaturvedi and Erickson teaches the one or more computer storage media of claim 1, wherein providing the search results user interface presenting the set of search results comprises selecting the search results based at least in part on the attribute values of the custom attribute for at least a portion of the other items (Chaturvedi, [0062]: display results on display after searching image based on features of image; [0054]: when identifying colors that are visually similar to colors in the extracted patch, the product search system or server 220 only evaluates colors that have been determined to be popular. Popular colors may be colors that are associated with certain items, e.g., items or products that have been identified as being popular for a particular physical environment. Evaluating sales data to identify a popular product can include determining whether the item or the product satisfies a threshold sales volume or threshold revenue. Items or products that are in demand for a physical environment, e.g., trending, can also be identified as being popular; [0042]: the patch may be associated with the portions 226A, 226B; [0038]: processing of the one or more portion(s) 226A, 226B (from one or more different locations or directions of the camera) in the image data 210, instead of the whole image data from the image 224, when the user 202 selects one or more portions 226A, 226B of the image 224 for determining spatial information (e.g., dimensions); [0035]: display screen).
Regarding claim 9
The combination of Chaturvedi and Erickson teaches the one or more computer storage media of claim 1, wherein the operations further comprise: storing data regarding the attribute values for the custom attributes for the other items in an item listing datastore (Chaturvedi, [0099]: memory for storing output values indicative of one or more attributes, entities, or concepts that the trained NN can identify, such as product attributes shown on web pages or catalogs; [0026]: save items; [0121]: database for storing data related to aspects).
Regarding claim 10
The combination of Chaturvedi and Erickson teaches the one or more computer storage media of claim 1, wherein the operations further comprise: updating a user interface to add a filter option for the custom attribute (Chaturvedi, [0051]: the user 202 operating the computing device 204 can select a color, either as an initial aspect from the live camera view or as an aspect filter from a sliding filter after a first (or initial) list of items is provided. When the sliding filter is used, a response may be provided from the server 220 to update the first list of items; [0044]: filter 486 in FIG. 4E on the display screen of the computing device 204; [0077]: add items to AR view after filter is applied).
Regarding claim 11
The combination of Chaturvedi and Erickson teaches the one or more computer storage media of claim 1, wherein the operations further comprise: generating a recommendation for the custom attribute (Chaturvedi, [0053]: generating or providing the types of items or items with visually similar colors from the image data may be based in part on processing color distances from image data of pixels. The image data is compared with known color information from the database 222 using distance measurements; [0054]: the color samples from which visually similar colors are identified are restricted to popular colors. For example, when identifying colors that are visually similar to colors in the extracted patch, the product search system or server 220 only evaluates colors that have been determined to be popular; [0117]: a user may be recommended virtual products based on color in the frame).
Regarding claim 13
The combination of Chaturvedi and Erickson teaches the computer-implemented method of claim 12, wherein the user input comprises a selection of a portion of the target item in the image of the target item, and wherein analyzing the image of the target item to determine the attribute of the target item comprises analyzing the portion of the target item (Chaturvedi, [0038]: processing of the one or more portion(s) 226A, 226B (from one or more different locations or directions of the camera) in the image data 210, instead of the whole image data from the image 224, when the user 202 selects one or more portions 226A, 226B of the image 224 for determining spatial information (e.g., dimensions), color information, and scene information as to the portions 226A, 226B; [0039]: a portion of the image 226 may be selected and more features and clarity are obtained to help the system identify objects, available space, and colors).
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. D'Souza et al. (US 20210125251 A1) relates to providing a user with recommendations when generating an image of a product; Kraft et al. (US 9424598 B1) relates to a visual search in which a customer can request additional information about a specific product by submitting an image of that product from a computing device; and the non-patent literature, Visual Search at Alibaba, relates to a visual search algorithm that uses deep learning to search for images by visual features in order to provide users with a relevant image list.
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to LATASHA DEVI RAMPHAL whose telephone number is (571)272-2644. The examiner can normally be reached 11 AM - 7:30 PM (EST).
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Jeffrey A. Smith, can be reached at 571-272-6763. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/LATASHA D RAMPHAL/Examiner, Art Unit 3688
/Jeffrey A. Smith/Supervisory Patent Examiner, Art Unit 3688