DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Amendments
The action is responsive to the Applicant’s Amendment filed on 12/10/2025.
Claims 1-21 are pending in the application. Claims 2, 8, 11 and 13 are amended.
Response to Arguments
Applicant’s arguments with respect to the rejections previously made and the amended claims filed on 12/10/2025 have been fully considered but they are not persuasive. In view of the claim amendments, the rejections are being updated accordingly.
With regard to independent claim 1, Applicant argued that the cited reference Horn does not teach or suggest "generating, by the device, a second set of system-wide metadata distinct from the first set of application-generated metadata by processing the first image file and the first set of application-generated metadata".
In response to the arguments, it is submitted that the cited limitations are properly addressed by Horn, based at least on the following disclosures:
First, neither the claim nor the specification defines the terms “system-wide metadata” or “application-generated metadata”. Moreover, the specification does not teach a second set of metadata “distinct” from the first set. Under the broadest reasonable interpretation, Horn discloses generating, by the device, a second set of system-wide metadata distinct from the first set of application-generated metadata by processing the first image file and the first set of application-generated metadata in Fig. 1, the catalog metadata database 107, and paragraphs [0043]-[0046], [0109], and [0124], as outlined in the Office Action.
Second, Applicant also argues that “Horn does not disclose any mechanism for creating metadata that is distinct from application-generated metadata”. However, the instant specification does not explicitly define or even contain the term “distinct”. Paragraph [0028] states, “The process in search module 150 continues by processing the image file and associated first set of metadata to generate a second set of metadata associated with the image file (block 230).” It does not teach that this second set of metadata is “distinct” from the first set. Applicant further asserts that Horn “cannot satisfy the claim's recitation of ‘application-generated metadata.’” However, that term does not appear anywhere in the specification, nor is it defined by the claims.
Third, Applicant states that, “Furthermore, the UUID and UID also do not teach the claimed ‘second set of system-wide metadata.’” As stated above, the instant specification does not explicitly define the term “system-wide metadata”. Although the claims are interpreted in light of the specification, limitations from the specification are not read into the claims (see MPEP 2145(VI)). The term is not defined by the claims, and one of ordinary skill in the art would not be reasonably apprised of the scope of the invention.
Thus, for at least the reasons set forth above, it is submitted that the amended limitations are properly addressed by the cited references.
With regard to independent claims 11 and 16, the emphasized limitations that Applicant argues in claims 11 and 16 are similar to the emphasized limitations of claim 1, which have been addressed above. See the response to claim 1 above for explanation.
Furthermore, it is submitted that all limitations in the pending claims, including those not specifically argued, are properly addressed, for the reasons set forth in the rejections. See the claim analysis below for details.
Claim Rejections - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claims 1-2, 8, 11, 13 and 16 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.
Regarding claims 1-2, 8, 11, 13 and 16, the term "system-wide metadata" is not defined by the claim, the specification does not provide a standard for ascertaining the requisite degree, and one of ordinary skill in the art would not be reasonably apprised of the scope of the invention.
Any claim not specifically addressed is rejected as incorporating the indefinite subject matter of the claim from which it depends.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-15 are rejected under 35 U.S.C. 103 as being unpatentable over Dua et al. (US 20190220708 A1, hereinafter Dua) in view of Horn (US 20160117071 A1) and Morris et al. (US 20150381534 A1).
Regarding Claim 1, Dua discloses a method, comprising:
receiving, from a first application running on a device, a first image file (Fig. 1; [0033]: The image application 103 may include a thin-client image application 103b stored on the user device 115a… For example, the user 125a may capture images using the user device 115a and transmit the images to the image server 101 for the image application 103a) and a first set of application-generated metadata associated with the first image file (Fig. 1; [0020]: Metadata may be based on data generated by a user device, such as an image capture device used to capture an image; [0052]: The images may be associated with metadata);
updating an on-device index based on the first image file and the first set of application-generated metadata to include the first image file in the on-device index (Figs. 1-2; [0006]: The system may further include an indexing module stored in the memory… The indexing module may be further operable to… update the mapping by adding the identifying information to the one or more of the images; [0019]: the images may be captured by a user device associated with the user, stored on the user device associated with the user; [0033]: For example, the image application 103a may generate an index for the user based on the images; [0102]: The image assistant 208 may… instruct the indexing module 204 to update the index by adding the identifying information to corresponding images);
However, Dua does not explicitly teach “generating, by the device, a second set of system-wide metadata distinct from the first set of application-generated metadata by processing the first image file and the first set of application-generated metadata; updating the on-device index based on the second set of system-wide metadata; receiving an image search query; identifying a plurality of candidate image files from the on-device index based on the image search query and the on-device index updated based on the first set of application-generated metadata and the second set of system-wide metadata, wherein the plurality of candidate image files originate from at least one of the first application or the second application; and providing the plurality of candidate image files for display in response to the image search query, and the on-device index including a second image file previously received from a second application running on the device”.
On the other hand, in the same field of endeavor, Horn teaches
generating, by the device, a second set of system-wide metadata distinct from the first set of application-generated metadata by processing the first image file and the first set of application-generated metadata (Fig. 1, the catalog metadata database 107; [0043]-[0046]: This metadata is attached to the reference objects, which are stored in the catalog… As reference objects are created or are updated by MFS, they are collected into system and user-defined collections, which provide a logical grouping of objects based on one or more of three criteria… system-defined metadata query specification(s); [0124]: The MFS system tags objects of various kinds with the special attributes, links, and general descriptive metadata… and contain objects that are logically-grouped by… system-defined metadata query; [0109]: In addition, for images, MFS stores and maintains up to date in the catalog metadata);
updating the on-device index based on the second set of system-wide metadata (Fig. 1, updater 104; [0046]: Updates and changes to the reference objects also update the metadata in the catalog with the changes rippling throughout all the images in all collections instantly and simultaneously; [0092]: FIG. 33 is a schematic describing in detail the updater process: how objects' properties are updated and their values stored into the metadata catalog);
receiving an image search query ([0046]: Searching the metadata, via automatic or user selected or created queries recalls the single reference object from the OODB; [0126]: Fig. 14; The collection's metadata query is specified in an information window, which consists of the collection name (1401) and a pane of terms (1402) which must be satisfied for objects to be collected);
identifying a plurality of candidate image files from the on-device index based on the image search query and the on-device index updated based on the first set of application-generated metadata and the second set of system-wide metadata (Fig. 1, Fig. 3; [0046]: MFS creates an internal representation of the object and stores the representation in the MFS object oriented database (OODB)… and if selected, the external object is retrieved; [0110]: Objects may be quickly retrieved by any expression denoting desired property values stored in the MFS metadata… See Fig. 3; [0126]: User-defined metadata queries, as shown in FIG. 14, provide automatic grouping of objects that share certain property values),
wherein the plurality of candidate image files originate from at least one of the first application ([0028]- [0036]: Reference Object: means an object internally created and stored in the catalog and object store, which represents data originating externally… Working Set: means the set of sources of information, either created internally or imported from or received from external originators, that the inventive MFS, metadata filing system, manages; [0109]: In addition, for images, MFS stores and maintains up to date in the catalog metadata representing the image dimension… For Adobe documents, special Adobe-specific properties called XMP (Extended Metadata Protocol) is read from each document and stored in the metadata database catalog as well. These properties may be available by examining the images); and
providing the plurality of candidate image files for display in response to the image search query ([0037]: metadata filing system, includes… a metadata database structure, or catalog, for the management and rendering of these objects to a display viewable by a user in response to user input, regardless of the source or nature of the object; [0045]: The inventive MFS-configured system provides an organizational structure and methodology for information management, including… logically grouping, and display of informational objects of all kinds; [0064]: FIG. 5 is a display showing the preview viewing mode for images).
Additionally, Morris teaches the on-device index including a second image file previously received from a second application running on the device (Fig. 1A; [0017]-[0019]: The image from a local image store could be a picture taken previously, an image downloaded from an image archive, or an image the user created using some other application. Additionally, a user can choose to copy an image from a message or web page… a user may upload a previously created selfiecon set from, for example…from a storage location of the computing device…. In some embodiments, the computing device may add the image uploaded at 123 or captured by the camera at 125. [This is nonfunctional descriptive material and is not functionally involved in the step recited]).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the teaching of Dua to incorporate the teachings of Horn and Morris to generate a second set of system-wide metadata distinct from the first set of application-generated metadata, provide image files for display in response to the image search query, and to include a second image file previously received from a second application running on the device.
The motivation for doing so would be to manage data items, regardless of the source, for grouping and retrieval from any collection as recognized by Horn ([Abstract] of Horn: The system provides… a metadata database structure which stores only one instance of each object while linking the object to multiple collections and domains for grouping into and retrieval from any of the collections), and to utilize existing images in a local image store created from other applications, as recognized by Morris ([0019] of Morris: The image from a local image store could be… an image the user created using some other application).
Regarding Claim 2, the combined teachings of Dua, Horn, and Morris disclose the method of claim 1.
Dua further teaches wherein the first set of application-generated metadata is generated by the first application based on a first knowledge graph comprising information accessible to the first application (Fig. 1; [0064]: The search module 206 may retrieve the information from a search engine, a third-party server 120, such as a third-party server 120 that generates a server-hosted knowledge graph, etc.), and
wherein the second set of system-wide metadata is generated based on a second knowledge graph, different from the first knowledge graph, comprising system-wide information (Figs. 1-2; [0084]-[0088]: In some implementations, the search module 206 accesses a third-party server 120 from FIG. 1 to obtain additional information in real-time to categorize user queries… The search module 206 may use the additional information to translate the one or more user-generated search terms into categorized search terms… The search module 206 may access multiple third-party servers 120 to obtain the additional information [nonfunctional descriptive material describing the data]).
Regarding Claim 3, the combined teachings of Dua, Horn, and Morris disclose the method of claim 2.
Dua further teaches the method further comprising:
receiving, from the second application, the second image file and a third set of metadata associated with the second image file ([0035]-[0036]: In some implementations, the third-party server 120 sends and receives data to and from one or more of the image server 101 and the user devices 115a-115n via the network 105… For example, the third-party server 120 may include a social network application that manages social network profiles, an email application that includes profile images of users; [0052]: The images may be associated with metadata);
updating the on-device index based on the second image file and the third set of metadata (Figs. 1-2; [0006]: The system may further include an indexing module stored in the memory… The indexing module may be further operable to… update the mapping by adding the identifying information to the one or more of the images; [0033]: For example, the image application 103a may generate an index for the user based on the images; [0102]: The image assistant 208 may… instruct the indexing module 204 to update the index by adding the identifying information to corresponding images);
processing the second image file and the third set of metadata to generate a fourth set of metadata ([0033]: The image application 103a stored on the image server 101 may process the images and send additional information back to the image application 103b stored on the user device 115a; Fig. 2; [0051]: The image processing module 202 may determine one or more labels for an image where the one or more labels may include metadata); and
updating the on-device index based on the fourth set of metadata (Figs. 1-2; [0033]: the image application 103a may generate an index for the user based on the images; Fig. 2; [0051]: once the types of labels are recognized or generated by the image processing module 202, the image processing module 202 treats them as labels that are associated with an image; [0072]: In some implementations, the indexing module 204 generates an index from images that are associated with one or more labels where the labels include metadata; See also paras [0071]-[0075]).
Regarding Claim 4, the combined teachings of Dua, Horn, and Morris disclose the method of claim 3.
Dua further teaches wherein the plurality of candidate image files comprises one or more image files received from the first application ([0109]: For example, the user may state: “Show me pictures from Samantha's birthday party.”… Once the image assistant 208 receives an answer, the image assistant 208 may generate labels that include “Samantha's birthday party” for the matching images) and
one or more image files received from the second application ([0109]: For example, the user may be able to specify a data source that may include the information (e.g., “Check my calendar.”); See also paras [0037], [0058]).
Regarding Claim 5, the combined teachings of Dua, Horn, and Morris disclose the method of claim 3.
Dua further teaches wherein the third set of metadata is generated by the second application based on a third knowledge graph, different from the first knowledge graph and the second knowledge graph, comprising information accessible to the second application (Figs. 1-2; [0064]-[0065]: The search module 206 may retrieve the information from a search engine, a third-party server 120, such as a third-party server 120 that generates a server-hosted knowledge graph, etc.).
Additionally, Horn teaches knowledge graphs different from the first knowledge graph ([0020] Catalog: means a special database built upon the object store that… performs queries on objects by specified metadata property selection or designation; notifies other processes of the metadata property changes; and maintains a dependency graph of object; [0239]: For each object to be reclassified, all collection specifications are evaluated, resulting in a new set of collections for the changed object. For key phrases, the classifier can return all matching collections in one pass: key phrases are compiled into a graph, with terminal nodes listing all matching collections).
Regarding Claim 6, the combined teachings of Dua, Horn, and Morris disclose the method of claim 3.
Dua further teaches the method further comprising:
providing a first user interface affordance for display with the plurality of candidate image files in a first result screen ([0103]: Turning to FIG. 5, a graphic representation 500 of a user interface… The user interface module 210 provides… images 510, 515);
receiving a selection of the first user interface affordance ([0103]: In this example, the user may confirm or reject the user's identification using a “yes” button 520 and a “no” button 525); and
providing the plurality of candidate image files for display in a second result screen, wherein the plurality of candidate image files are organized into a first set corresponding to image files received from the first application and a second set corresponding to image files received from the second application (Figs. 2, 5; [0104]: The image assistant 208 may organize the images by instructing the user interface module 210 to provide images of people that frequently appear in images associated with the user).
Regarding Claim 7, the combined teachings of Dua, Horn, and Morris disclose the method of claim 6.
Dua further teaches the method further comprising: providing a second user interface affordance for display with the first set of the plurality of candidate image files ([0103]: Turning to FIG. 5, a graphic representation 500 of a user interface… The user interface module 210 provides… images… 515);
receiving a selection of the second user interface affordance ([0103]: In this example, the user may confirm or reject the user's identification using a “yes” button 520 and a “no” button 525); and
launching the first application in response to the selection of the second user interface affordance ([0103]: If the user identifies either of the images as properly identifying the user in the image, the image assistant 208 may instruct the index module 204 to add a label to the image that identifies the user. The label may include at least one of the user's name, “me,” and “I” so that the search module 206 may identify images when the user asks for, for example, “Show me images of me.”).
Regarding Claim 8, the combined teachings of Dua, Horn, and Morris disclose the method of claim 1.
Dua further teaches wherein processing the first image file comprises performing optical character recognition on the first image file, and wherein the second set of system-wide metadata comprises results from the optical character recognition (Fig. 2; [0053]: The image processing module 202 may also identify text, such as by applying optical character recognition (OCR) or another text recognition algorithm to identify text related to the objects, such as text on book covers or signs… For example, the image processing module 202 may identify a title of a book from the book cover and add a label that includes the book title).
Regarding Claim 9, the combined teachings of Dua, Horn, and Morris disclose the method of claim 1.
Dua further teaches wherein the first set of metadata comprises optical character recognition processing results generated by the first application (Fig. 2; [0053]: The image processing module 202 may also identify text, such as by applying optical character recognition (OCR) or another text recognition algorithm to identify text related to the objects, such as text on book covers or signs).
Regarding Claim 10, the combined teachings of Dua, Horn, and Morris disclose the method of claim 1.
Dua further teaches processing the received image search query using natural language processing to generate a refined image search query (Fig. 2; [0094]: The suggested search terms may include keywords or natural language expressions that are generated automatically based on the user-generated search terms and the index),
wherein the plurality of candidate image files is identified based on the refined image search query (Fig. 2; [0094]: For example, if the user inputs the partial user query “Pictures of m” the search module 206 may determine that the index includes the following labels that begin with “m” in order of decreasing numbers: mom, Martha, and monkey. As a result, the search module 206 may suggest “mom” to autocomplete the search query; [0097]-[099]: In some implementations, the search module 206 may perform filtering of images based on indicators in search queries. The filtering may include performing subsequent narrowing of the search results).
Regarding Claim 11, Dua discloses a non-transitory computer-readable medium storing instructions ([0048]: The storage device 247 may be a non-transitory computer-readable storage medium that stores data that provides the functionality described herein) which, when executed by one or more processors, cause the one or more processors to perform operations comprising:
receiving, from a first application running on a device, a first image file (Fig. 1; [0033]: The image application 103 may include a thin-client image application 103b stored on the user device 115a… For example, the user 125a may capture images using the user device 115a and transmit the images to the image server 101 for the image application 103a) and a first set of application-generated metadata associated with the first image file (Fig. 1; [0020]: Metadata may be based on data generated by a user device, such as an image capture device used to capture an image; [0052]: The images may be associated with metadata);
wherein the first set of application-generated metadata is generated by the first application based on a first knowledge graph comprising information accessible to the first application (Fig. 1; [0064]-[0084]: The search module 206 may retrieve the information from a search engine, a third-party server 120, such as a third-party server 120 that generates a server-hosted knowledge graph, etc. [This is nonfunctional descriptive material and is not functionally involved in the step recited]);
updating an on-device index based on the first image file and the first set of application-generated metadata to include the first image file in the on-device index (Figs. 1-2; [0006]: The system may further include an indexing module stored in the memory… The indexing module may be further operable to… update the mapping by adding the identifying information to the one or more of the images; [0019]: the images may be captured by a user device associated with the user, stored on the user device associated with the user; [0033]: For example, the image application 103a may generate an index for the user based on the images; [0102]: The image assistant 208 may… instruct the indexing module 204 to update the index by adding the identifying information to corresponding images);
However, Dua does not explicitly teach “generating, by the device, a second set of system-wide metadata distinct from the first set of application-generated metadata by processing the first image file and the first set of application-generated metadata, wherein the second set of system-wide metadata is generated based at least in part on a second knowledge graph, different from the first knowledge graph; updating the on-device index based on the second set of system-wide metadata; receiving an image search query; identifying a plurality of candidate image files from the on-device index based on the image search query and the on-device index updated based on the first set of application-generated metadata and the second set of system-wide metadata, wherein the plurality of candidate image files originate from at least one of the first application or the second application; and providing the plurality of candidate image files for display in response to the image search query, and the on-device index including a second image file previously received from a second application running on the device”.
On the other hand, in the same field of endeavor, Horn teaches
generating, by the device, a second set of system-wide metadata distinct from the first set of application-generated metadata by processing the first image file and the first set of application-generated metadata (Fig. 1, the catalog metadata database 107; [0043]-[0046]: This metadata is attached to the reference objects, which are stored in the catalog… As reference objects are created or are updated by MFS, they are collected into system and user-defined collections, which provide a logical grouping of objects based on one or more of three criteria… system-defined metadata query specification(s); [0124]: The MFS system tags objects of various kinds with the special attributes, links, and general descriptive metadata… and contain objects that are logically-grouped by… system-defined metadata query; [0109]: In addition, for images, MFS stores and maintains up to date in the catalog metadata),
wherein the second set of system-wide metadata is generated based at least in part on a second knowledge graph, different from the first knowledge graph ([0124]: The MFS system tags objects of various kinds with the special attributes, links, and general descriptive metadata… and contain objects that are logically-grouped by… system-defined metadata query [This is nonfunctional descriptive material and is not functionally involved in the step recited]);
updating the on-device index based on the second set of system-wide metadata (Fig. 1, updater 104; [0046]: Updates and changes to the reference objects also update the metadata in the catalog with the changes rippling throughout all the images in all collections instantly and simultaneously; [0092]: FIG. 33 is a schematic describing in detail the updater process: how objects' properties are updated and their values stored into the metadata catalog);
receiving an image search query ([0046]: Searching the metadata, via automatic or user selected or created queries recalls the single reference object from the OODB; [0126]: Fig. 14; The collection's metadata query is specified in an information window, which consists of the collection name (1401) and a pane of terms (1402) which must be satisfied for objects to be collected);
identifying a plurality of candidate image files from the on-device index based on the image search query and the on-device index updated based on the first set of application-generated metadata and the second set of system-wide metadata (Fig. 1, Fig. 3; [0046]: MFS creates an internal representation of the object and stores the representation in the MFS object oriented database (OODB)… and if selected, the external object is retrieved; [0110]: Objects may be quickly retrieved by any expression denoting desired property values stored in the MFS metadata… See Fig. 3; [0126]: User-defined metadata queries, as shown in FIG. 14, provide automatic grouping of objects that share certain property values),
wherein the plurality of candidate image files originate from at least one of the first application ([0028]- [0036]: Reference Object: means an object internally created and stored in the catalog and object store, which represents data originating externally… Working Set: means the set of sources of information, either created internally or imported from or received from external originators, that the inventive MFS, metadata filing system, manages; [0109]: In addition, for images, MFS stores and maintains up to date in the catalog metadata representing the image dimension… For Adobe documents, special Adobe-specific properties called XMP (Extended Metadata Protocol) is read from each document and stored in the metadata database catalog as well. These properties may be available by examining the images); and
providing the plurality of candidate image files for display in response to the image search query ([0037]: metadata filing system, includes… a metadata database structure, or catalog, for the management and rendering of these objects to a display viewable by a user in response to user input, regardless of the source or nature of the object; [0045]: The inventive MFS-configured system provides an organizational structure and methodology for information management, including… logically grouping, and display of informational objects of all kinds; [0064]: FIG. 5 is a display showing the preview viewing mode for images).
Additionally, Morris teaches the on-device index including a second image file previously received from a second application running on the device (Fig. 1A; [0017]-[0019]: The image from a local image store could be a picture taken previously, an image downloaded from an image archive, or an image the user created using some other application. Additionally, a user can choose to copy an image from a message or web page… a user may upload a previously created selfiecon set from, for example… from a storage location of the computing device… In some embodiments, the computing device may add the image uploaded at 123 or captured by the camera at 125). [This is nonfunctional descriptive material and is not functionally involved in the step recited.]
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the teaching of Dua to incorporate the teachings of Horn and Morris to generate a second set of system-wide metadata distinct from the first set of application-generated metadata, provide image files for display in response to the image search query, and to include a second image file previously received from a second application running on the device.
The motivation for doing so would be to manage data items, regardless of the source, for grouping and retrieval from any collection as recognized by Horn ([Abstract] of Horn: The system provides… a metadata database structure which stores only one instance of each object while linking the object to multiple collections and domains for grouping into and retrieval from any of the collections), and to utilize existing images in a local image store created from other applications, as recognized by Morris ([0019] of Morris: The image from a local image store could be… an image the user created using some other application).
Regarding Claim 12, the combined teachings of Dua, Horn, and Morris disclose the non-transitory computer-readable medium of claim 11.
Dua further teaches wherein the operations further comprise:
providing a first user interface affordance for display with the plurality of candidate image files in a first result screen ([0103]: Turning to FIG. 5, a graphic representation 500 of a user interface… The user interface module 210 provides… images 510, 515);
receiving a selection of the first user interface affordance ([0103]: In this example, the user may confirm or reject the user's identification using a “yes” button 520 and a “no” button 525); and
providing, in response to the selection of the first user interface affordance, the plurality of candidate image files for display in a second result screen, wherein the plurality of candidate image files are organized into a first set corresponding to image files having scene analysis results at least partially matching the image search query (Figs. 2, 5; [0104]: The image assistant 208 may organize the images by instructing the user interface module 210 to provide images of people that frequently appear in images associated with the user), and
a second set corresponding to image files having optical character recognition results at least partially matching the image search query (Fig. 2, Fig. 5; [0053]: The image processing module 202 may also identify text, such as by applying optical character recognition (OCR) or another text recognition algorithm to identify text related to the objects, such as text on book covers or signs… For example, the image processing module 202 may identify a title of a book from the book cover and add a label that includes the book title).
Regarding Claim 13, the combined teachings of Dua, Horn, and Morris disclose the non-transitory computer-readable medium of claim 12.
Dua further teaches wherein processing the first image file comprises performing optical character recognition on the first image file, and wherein the second set of system-wide metadata comprises results from the optical character recognition (Fig. 2; [0053]: The image processing module 202 may also identify text, such as by applying optical character recognition (OCR) or another text recognition algorithm to identify text related to the objects, such as text on book covers or signs… For example, the image processing module 202 may identify a title of a book from the book cover and add a label that includes the book title).
Regarding Claim 14, the combined teachings of Dua, Horn, and Morris disclose the non-transitory computer-readable medium of claim 12.
Dua further teaches wherein the first set of metadata comprises optical character recognition results generated by the first application (Fig. 2; [0053]: The image processing module 202 may also identify text, such as by applying optical character recognition (OCR) or another text recognition algorithm to identify text related to the objects, such as text on book covers or signs… For example, the image processing module 202 may identify a title of a book from the book cover and add a label that includes the book title).
Regarding Claim 15, the combined teachings of Dua, Horn, and Morris disclose the non-transitory computer-readable medium of claim 12.
Dua further teaches wherein the operations further comprise:
providing a second user interface affordance for display with the first set of the plurality of candidate image files ([0103]: Turning to FIG. 5, a graphic representation 500 of a user interface… The user interface module 210 provides… images 510, 515);
receiving a selection of the second user interface affordance ([0103]: In this example, the user may confirm or reject the user's identification using a “yes” button 520 and a “no” button 525); and
launching the first application in response to the selection of the second user interface affordance ([0103]: If the user identifies either of the images as properly identifying the user in the image, the image assistant 208 may instruct the index module 204 to add a label to the image that identifies the user. The label may include at least one of the user's name, “me,” and “I” so that the search module 206 may identify images when the user asks for, for example, “Show me images of me.”).
Claims 16-21 are rejected under 35 U.S.C. 103 as being unpatentable over Dua et al. (US 20190220708 A1, hereinafter Dua) in view of Horn (US 20160117071 A1).
Regarding Claim 16, Dua discloses a device ([0012]: FIG. 2 illustrates a block diagram of an example computing device that organizes images), comprising:
a display (Fig. 2, display 241); a memory storing: a plurality of computer programs (Fig. 2, memory 237); and
an on-device index (Fig. 2, indexing module 204); and
one or more processors configured to execute instructions of the plurality of computer programs (Fig. 2, processor 235) to:
receive, from a first application, a first image file (Fig. 1; [0033]: The image application 103 may include a thin-client image application 103b stored on the user device 115a… For example, the user 125a may capture images using the user device 115a and transmit the images to the image server 101 for the image application 103a) and a first set of application-generated metadata associated with the first image file (Fig. 1; [0020]: Metadata may be based on data generated by a user device, such as an image capture device used to capture an image; [0052]: The images may be associated with metadata);
update the on-device index based on the first image file and the first set of application-generated metadata (Figs. 1-2; [0006]: The system may further include an indexing module stored in the memory… The indexing module may be further operable to… update the mapping by adding the identifying information to the one or more of the images; [0019]: the images may be captured by a user device associated with the user, stored on the user device associated with the user; [0033]: For example, the image application 103a may generate an index for the user based on the images; [0102]: The image assistant 208 may… instruct the indexing module 204 to update the index by adding the identifying information to corresponding images);
However, Dua does not explicitly teach “generate a second set of system-wide metadata by processing the first image file and the first set of application-generated metadata; update the on-device index based on the second set of system-wide metadata; receive an image search query; identify a plurality of candidate image files from the on-device index based on the image search query and the on-device index updated based on the first set of application-generated metadata and the second set of system-wide metadata; display a first set of the plurality of candidate image files in a first result screen on the display in response to the image search query; display a first user interface affordance with the plurality of candidate image files in the first result screen on the display; receive a selection of the first user interface affordance; and display, in response to the selection and in a second result screen, a second set of the plurality of candidate image files in a first group and a third set of the plurality of candidate image files in a second group different from the first group, wherein the second set of the plurality of candidate image files is received from the first application running on the device and the third set of the plurality of candidate image files is received from a second application running on the device”.
On the other hand, in the same field of endeavor, Horn teaches
generate a second set of system-wide metadata by processing the first image file and the first set of application-generated metadata (Fig. 1, the catalog metadata database 107; [0043]-[0046]: This metadata is attached to the reference objects, which are stored in the catalog… As reference objects are created or are updated by MFS, they are collected into system and user-defined collections, which provide a logical grouping of objects based on one or more of three criteria… system-defined metadata query specification(s); [0124]: The MFS system tags objects of various kinds with the special attributes, links, and general descriptive metadata… and contain objects that are logically-grouped by… system-defined metadata query; [0109]: In addition, for images, MFS stores and maintains up to date in the catalog metadata);
update the on-device index based on the second set of system-wide metadata (Fig. 1, updater 104; [0046]: Updates and changes to the reference objects also update the metadata in the catalog with the changes rippling throughout all the images in all collections instantly and simultaneously; [0092]: FIG. 33 is a schematic describing in detail the updater process: how objects' properties are updated and their values stored into the metadata catalog);
receive an image search query ([0046]: Searching the metadata, via automatic or user selected or created queries recalls the single reference object from the OODB; [0126], Fig. 14: The collection's metadata query is specified in an information window, which consists of the collection name (1401) and a pane of terms (1402) which must be satisfied for objects to be collected);
identify a plurality of candidate image files from the on-device index based on the image search query and the on-device index updated based on the first set of application-generated metadata and the second set of system-wide metadata (Fig. 1, Fig. 3; [0046]: MFS creates an internal representation of the object and stores the representation in the MFS object oriented database (OODB)… and if selected, the external object is retrieved; [0110]: Objects may be quickly retrieved by any expression denoting desired property values stored in the MFS metadata… See Fig. 3; [0126]: User-defined metadata queries, as shown in FIG. 14, provide automatic grouping of objects that share certain property values);
display a first set of the plurality of candidate image files in a first result screen on the display in response to the image search query ([0045]: The inventive MFS-configured system provides an organizational structure and methodology for information management, including… logically grouping, and display of informational objects of all kinds; [0064]: FIG. 5 is a display showing the preview viewing mode for images);
display a first user interface affordance with the plurality of candidate image files in the first result screen on the display ([Abstract]: The system employs configurable, extensible attribute/properties of data objects in metadata format, and a user-configurable interface that facilitates information management; [0063]: FIG. 4 is a display depicting the MFS inventive system desktop interface);
receive a selection of the first user interface affordance ([0046]: Searching the metadata, via automatic or user selected or created queries recalls the single reference object from the OODB, and if selected, the external object is retrieved from the external source (hard drive or other data storage), permitting a comprehensive desktop interface; [0049]: Users can select predetermined collections provided in a basic menu); and
display, in response to the selection and in a second result screen, a second set of the plurality of candidate image files in a first group and a third set of the plurality of candidate image files in a second group different from the first group (Figs. 4-12b; [0111]-[0122]: Comprehensive Desktop Interface…MFS presents information in a familiar desktop-style interface, with windows that show objects as icons or list views, among others… providing new and innovative viewing mechanisms that leverage the ability of MFS to store and retrieve arbitrary metadata… [0116]: In the icon views of FIGS. 7a, 7b and 7c, arbitrary layouts of icons and their related properties are possible; this can be done programmatically, or laid out by user preference),
wherein the second set of the plurality of candidate image files is received from the first application running on the device and the third set of the plurality of candidate image files is received from a second application running on the device ([0038]: The inventive MFS computer data processing system for… viewing of information objects from multiple sources; [0044]: The inventive MFS-configured CPU(s) streamline information management by providing a view of information objects of all domain natures (varieties) from different sources; [0112]: FIG. 4 shows a list of folders (401), an icon view of Photoshop files (402), a list of user-defined collections and the counts of objects within (403)… Views may also be sorted by a variety of properties that are shared by most objects: by name, by date, by size, by count (for folders and other containers); and by kind. This is also extensible by WS to new property types).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the teaching of Dua to incorporate the teachings of Horn to generate a second set of system-wide metadata distinct from the first set of application-generated metadata, and provide image files for display in response to the image search query.
The motivation for doing so would be to manage data items, regardless of the source, for grouping and retrieval from any collection as recognized by Horn ([Abstract] of Horn: The system provides… a metadata database structure which stores only one instance of each object while linking the object to multiple collections and domains for grouping into and retrieval from any of the collections).
Regarding Claim 17, the combined teachings of Dua and Horn disclose the device of claim 16.
Dua further teaches wherein the one or more processors are further configured to execute instructions to:
display a second user interface affordance with the second set of the plurality of candidate image files in the second result screen (Fig. 2; [0097]: the search module 206 may perform filtering of images… The search module 206 may provide the user with second search results that are filtered from the first search results and that match the second search query);
receive a selection of the second user interface affordance ([0097]: For example, the second search query may be: “Just show me the ones from last month.”); and
bring the first application to the foreground on the display in response to the selection of the second user interface affordance ([0097]: The search module 206 may provide the user with second search results that are filtered from the first search results and that match the second search query. In this example, the second search results may include pictures from San Francisco taken last month);
Regarding Claim 18, the combined teachings of Dua and Horn disclose the device of claim 16.
Horn further teaches wherein the one or more processors are further configured to execute instructions to:
display, in the second result screen, a fourth set of the plurality of candidate image files in a third group and a fifth set of the plurality of candidate image files in a fourth group, wherein the fourth set of the plurality of candidate image files is received from the first application and the fifth set of the plurality of candidate image files is received from the second application, and wherein each image in the fourth and fifth sets of the plurality of candidate image files contains recognized text comprising at least a portion of the image search query (Figs. 4-12b; [0038]: The inventive MFS computer data processing system for automatic organization, indexing and viewing of information objects from multiple sources; [0111]-[0122]: Comprehensive Desktop Interface… For example, preview images are created and stored by MFS as annotations, and can be very quickly displayed in a slide view… In the icon views of FIGS. 7a, 7b and 7c, arbitrary layouts of icons and their related properties are possible; this can be done programmatically, or laid out by user preference).
Regarding Claim 19, the combined teachings of Dua and Horn disclose the device of claim 16.
Dua further teaches wherein the one or more processors are further configured to execute instructions to:
receive, from the second application, a second image file and a third set of metadata associated with the second image file ([0035]-[0036]: In some implementations, the third-party server 120 sends and receives data to and from one or more of the image server 101 and the user devices 115a-115n via the network 105… For example, the third-party server 120 may include a social network application that manages social network profiles, an email application that includes profile images of users);
update the on-device index based on the second image file and the third set of metadata (Figs. 1-2; [0006]: The system may further include an indexing module stored in the memory… The indexing module may be further operable to… update the mapping by adding the identifying information to the one or more of the images; [0033]: For example, the image application 103a may generate an index for the user based on the images; [0102]: The image assistant 208 may… instruct the indexing module 204 to update the index by adding the identifying information to corresponding images);
process the second image file and the third set of metadata to generate a fourth set of metadata ([0033]: The image application 103a stored on the image server 101 may process the images and send additional information back to the image application 103b stored on the user device 115a; Fig. 2; [0051]: The image processing module 202 may determine one or more labels for an image where the one or more labels may include metadata); and
update the on-device index based on the fourth set of metadata (Figs. 1-2; [0033]: the image application 103a may generate an index for the user based on the images; Fig. 2; [0051]: once the types of labels are recognized or generated by the image processing module 202, the image processing module 202 treats them as labels that are associated with an image; [0072]: In some implementations, the indexing module 204 generates an index from images that are associated with one or more labels where the labels include metadata; See also paras [0071]-[0075]).
Regarding Claim 20, the combined teachings of Dua and Horn disclose the device of claim 16.
Horn further teaches wherein the plurality of candidate image files comprises one or more image files received from the first application and one or more image files received from the second application ([0038]: The inventive MFS computer data processing system for automatic organization, indexing and viewing of information objects from multiple sources; [0111]-[0122]: Comprehensive Desktop Interface).
Regarding Claim 21, the combined teachings of Dua and Horn disclose the device of claim 16.
Dua further teaches wherein the first set of application-generated metadata comprises results of application-specific operations performed by the first application on the first image file, and wherein the application-specific operations comprise one or more of performing a scene analysis of the first image file or referencing a knowledge graph comprising information associated with the first application (Fig. 2; [0053]: The image processing module 202 may perform image recognition to identify an entity (e.g., people, objects, or places) in an image… The image processing module 202 may identify characteristics that represent attributes of an image, such as “on the beach,” “in the rain,” “fog,” “sunny,” “snowing,” “inside,” “outside,” “in front,” etc.).
Conclusion
THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any extension fee pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to SHIRLEY D. HICKS whose telephone number is (571)272-3304. The examiner can normally be reached Mon - Fri 7:30 - 4:00.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Charles Rones can be reached on (571) 272-4085. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/S D H/Examiner, Art Unit 2168
/CHARLES RONES/Supervisory Patent Examiner, Art Unit 2168