Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Arguments
Applicant’s arguments with respect to claim(s) 1-9, 20-30 have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.
Continued Examination Under 37 CFR 1.114
A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on January 5, 2026 has been entered.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claim(s) 1-4, 20, 22, 23, 26, 27 is/are rejected under 35 U.S.C. 103 as being unpatentable over US 20140120887 to Huang in view of US 20230169625 to Kumar.
Regarding claim 1, Huang discloses a method, comprising:
providing user information regarding a user location and a user preference of image information (paragraph 209-214, 357-359; position info of terminal (location) and user personalization (user preference of image) including screen resolution, type of AR content is provided), wherein the user preference includes at least one from the group consisting of an image quality factor (paragraph 212; user personalization includes screen resolution (quality factor));
receiving, at a client device, one or more sets of image information associated with one or more items of interests (paragraph 209-211; user personalization preference includes point of interest) in an AR environment based on the user information (paragraph 164, 297, 338, 371-379; in step F24, the AR terminal (client device) receives AR content that includes images filtered based on user personalization preference and context (location));
generating, at the client device, a user interface for displaying the one or more sets of image information (paragraph 135; user interface function generates interface to display AR content), wherein the user interface is generated at least based on the user preference (paragraph 209-214, 375; user interface is generated based on user preference such as whether type of AR content is image or video or the resolution of terminal); and
receiving input to enable an action associated with the one or more sets of image information via the user interface (paragraph 135-139; user can select by clicking the displayed AR content on interface and detail of the AR content is acquired as an action).
However, Huang does not disclose wherein the user preference includes at least one from the group consisting of a timing factor, an event factor.
Kumar discloses wherein the user preference includes at least one from the group consisting of a timing factor, an event factor (paragraph 38; for target image 20, user preference as selected by user such as date 110a (timing factor), event 110c factor can be used to filter images used to make a larger image 24 from the target image).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the system of Huang as taught by Kumar to provide a user preference associated with events/dates.
The motivation to combine the references is to provide the user with images similar to a target image on the user device, based on user-selected preferences associated with events/dates related to the target image, such that the user can generate a larger image from the target image and the acquired related images (paragraph 38).
Regarding claim 2, Huang discloses the method of claim 1, wherein the user location is a location from a portable device carried by a user (paragraph 4, 118, 187; GPS module of mobile terminal of user provides location).
Regarding claim 3, Huang discloses the method of claim 1, wherein the user preference of image information includes a criterion defining at least one characteristic of images (paragraph 209-212, 375; user personalization preference includes resolution of image (characteristic) as criteria for filtering AR content).
Regarding claim 4, Kumar discloses the method of claim 1, wherein the one or more items of interests include a historic building (paragraph 16; region of interest includes historical buildings in the target image).
Regarding claim 20, Huang discloses a computing system comprising (paragraph 115; system shown in Fig. 1):
cause the computing system to perform a process (paragraph 355; flowchart for performing mobile AR service) comprising:
providing user information regarding a user location and a user preference of image information (paragraph 209-214, 357-359; position info of terminal (location) and user personalization (user preference of image) including screen resolution, type of AR content is provided), wherein the user preference includes at least one from the group consisting of image quality factor (paragraph 212; user personalization includes screen resolution (quality factor));
receiving, at a client device, one or more sets of image information associated with one or more items of interests (paragraph 209-211; user personalization preference includes point of interest) in an AR environment based on the user information (paragraph 164, 297, 338, 371-379; in step F24, the AR terminal (client device) receives AR content that includes images filtered based on user personalization preference and context (location));
generating, at the client device, a user interface for displaying the one or more sets of image information (paragraph 135; user interface function generates interface to display AR content), wherein the user interface is generated at least based on the user preference (paragraph 209-214, 375; user interface is generated based on user preference such as whether type of AR content is image or video or the resolution of terminal); and
receiving input to enable an action associated with the one or more sets of image information via the user interface (paragraph 135-139; user can select by clicking the displayed AR content on interface and detail of the AR content is acquired as an action).
However, Huang does not disclose at least one processor; and at least one memory storing instructions that, when executed by the at least one processor, cause the computing system to perform a process;
wherein the user preference includes at least one from the group consisting of a timing factor, an event factor.
Kumar discloses at least one processor; and at least one memory storing instructions that, when executed by the at least one processor, cause the computing system to perform a process (paragraph 53-54, 56, 59; storage 406 stores program instructions to be executed by processor 402 to perform process);
wherein the user preference includes at least one from the group consisting of a timing factor, an event factor (paragraph 38; for target image 20, user preference as selected by user such as date 110a (timing factor), event 110c factor can be used to filter images used to make a larger image 24 from the target image).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the system of Huang as taught by Kumar to provide a user preference associated with events/dates.
The motivation to combine the references is to provide the user with images similar to a target image on the user device, based on user-selected preferences associated with events/dates related to the target image, such that the user can generate a larger image from the target image and the acquired related images (paragraph 38).
Regarding claim 22, see rejection of claim 3.
Regarding claim 23, see rejection of claim 4.
Regarding claim 26, Huang discloses cause the computing system to perform operations comprising (paragraph 115; system shown in Fig. 1; paragraph 355; flowchart for performing mobile AR service):
providing user information regarding a user location and a user preference of image information (paragraph 209-214, 357-359; position info of terminal (location) and user personalization (user preference of image) including screen resolution, type of AR content is provided), wherein the user preference includes at least one from the group consisting of image quality factor (paragraph 212; user personalization includes screen resolution (quality factor));
receiving, at a client device, one or more sets of image information associated with one or more items of interests (paragraph 209-211; user personalization preference includes point of interest) in an AR environment based on the user information (paragraph 164, 297, 338, 371-379; in step F24, the AR terminal (client device) receives AR content that includes images filtered based on user personalization preference and context (location));
generating, at the client device, a user interface for displaying the one or more sets of image information (paragraph 135; user interface function generates interface to display AR content), wherein the user interface is generated at least based on the user preference (paragraph 209-214, 375; user interface is generated based on user preference such as whether type of AR content is image or video or the resolution of terminal); and
receiving input to enable an action associated with the one or more sets of image information via the user interface (paragraph 135-139; user can select by clicking the displayed AR content on interface and detail of the AR content is acquired as an action).
However, Huang does not disclose a non-transitory computer-readable medium storing instructions that, when executed by a computing system, cause the computing system to perform operations, wherein the user preference includes at least one from the group consisting of a timing factor, an event factor.
Kumar discloses a non-transitory computer-readable medium storing instructions that, when executed by a computing system, cause the computing system to perform operations (paragraph 53-54, 56, 59; storage 406 stores program instructions to be executed by processor 402 to cause computing system 400 to perform process), wherein the user preference includes at least one from the group consisting of a timing factor, an event factor (paragraph 38; for target image 20, user preference as selected by user such as date 110a (timing factor), event 110c factor can be used to filter images used to make a larger image 24 from the target image).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the system of Huang as taught by Kumar to provide a user preference associated with events/dates.
The motivation to combine the references is to provide the user with images similar to a target image on the user device, based on user-selected preferences associated with events/dates related to the target image, such that the user can generate a larger image from the target image and the acquired related images (paragraph 38).
Regarding claim 27, Huang discloses the non-transitory computer-readable medium of claim 26, wherein the operations further comprise: wherein the user location is a location from a portable device carried by a user (paragraph 4, 118, 187; GPS module of mobile terminal of user provides location), and wherein the user preference of image information includes a criterion defining at least one characteristic of images (paragraph 209-212, 375; user personalization preference includes resolution of image (characteristic) as criteria for filtering AR content).
Claim(s) 6, 9, 21, 29 is/are rejected under 35 U.S.C. 103 as being unpatentable over US 20140120887 to Huang in view of US 20230169625 to Kumar further in view of US 20150248783 to Fayle.
Regarding claim 6, Huang discloses the method of claim 1, but does not disclose wherein the user interface includes a first section and a second section, wherein the first section is configured to display images associated with the one or more items of interest.
Fayle discloses wherein the user interface includes a first section and a second section, wherein the first section is configured to display images associated with the one or more items of interest (paragraph 14, 57, 61; Fig. 3C shows a display interface having a first section for displaying images F1a-F3a having dates associated with the descriptor of interest and a remaining section (second section)).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the system of Huang as taught by Fayle to provide the first and second sections in the user interface.
The motivation to combine the references is to provide thumbnail images as representative images to be displayed in the user interface, thereby reducing the amount of display space needed to display the images and minimizing the bandwidth associated with downloading images by providing thumbnail versions of the images (paragraph 15, 20).
Fayle does not specifically order the images in chronological order.
However, the Examiner takes Official Notice of the fact that ordering images in chronological order was notoriously well known in the art before the effective filing date of the claimed invention. By ordering the images in chronological order on a display, a user can efficiently search the images based on their chronological order, such as the date of capture.
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Fayle to display images in chronological order, since such ordering results in efficient searching and analysis of images in order of capture date.
Regarding claim 9, Fayle discloses the method of claim 1, wherein the action associated with the one or more sets of image information includes at least one of the following: adding a comment or tag, providing an edit, and saving the image information at the client device (paragraph 90; user can attach comment to the images).
Regarding claim 21, Fayle discloses the computing system of claim 20, wherein the user location is a location from a portable device carried by a user (paragraph 50-51; portable communication system carried by user includes module 148 having GPS for providing user location), and wherein the action associated with the one or more sets of image information includes at least one of the following: adding a comment or tag, providing an edit, and saving the image information at the client device (paragraph 90; user can attach comment to the images).
Regarding claim 29, Kumar discloses the non-transitory computer-readable medium of claim 26, wherein the operations further comprise: wherein the one or more items of interests include a historic building (paragraph 16; region of interest includes historical buildings in the target image), and Fayle discloses wherein the action associated with the one or more sets of image information includes at least one of the following: adding a comment or tag, providing an edit, and saving the image information at the client device (paragraph 90; user can attach comment to the images).
Claim(s) 5, 24, 28 is/are rejected under 35 U.S.C. 103 as being unpatentable over US 20140120887 to Huang in view of US 20230169625 to Kumar further in view of JP 2016174290 to Yoshida.
Regarding claim 5, Kumar discloses the method of claim 1, wherein the one or more sets of image information include a set of images in an order (see Fig. 4a, wherein the images P1, P2, P3 are arranged in order from left to right next to target image T; paragraph 28, 31). Huang in view of Kumar does not specifically order the images in chronological order. However, the Examiner takes Official Notice of the fact that ordering images in chronological order was notoriously well known in the art before the effective filing date of the claimed invention. By ordering the images in chronological order on a display, a user can efficiently search the images based on their chronological order, such as the date of capture.
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Huang to display images in chronological order, since such ordering results in efficient searching and analysis of images in order of capture date.
However, Huang does not disclose wherein the method further comprises comparing a current image captured by the client device with the set of images.
Yoshida discloses wherein the method further comprises comparing a current image captured by the client device with the set of images (paragraph 26; comparing of the image captured by the mobile device with images acquired from the server).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the system of Huang as taught by Yoshida to provide a comparison of the set of images with captured images.
The motivation to combine the references is to provide playback of video data in the AR environment when there is a correspondence between the captured images and the image sets received from the server, thereby providing entertainment for the user in the form of a video image overlaid on the image of the printout in the camera view (paragraph 29, 47, 75-76).
Regarding claim 24, Kumar discloses the computing system of claim 20, wherein the one or more sets of image information include a set of images in an order (see Fig. 4a, wherein the images P1, P2, P3 are arranged in order from left to right next to target image T; paragraph 28, 31). Further, Yoshida discloses wherein the process further comprises comparing a current image captured by the client device with the set of images (paragraph 26; comparing of the image captured by the mobile device with images acquired from the server).
Huang in view of Kumar does not specifically order the images in chronological order. However, the Examiner takes Official Notice of the fact that ordering images in chronological order was notoriously well known in the art before the effective filing date of the claimed invention. By ordering the images in chronological order on a display, a user can efficiently search the images based on their chronological order, such as the date of capture.
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Huang to display images in chronological order, since such ordering results in efficient searching and analysis of images in order of capture date.
Regarding claim 28, Kumar discloses the non-transitory computer-readable medium of claim 26, wherein the operations further comprise: wherein the one or more sets of image information include a set of images in an order (see Fig. 4a, wherein the images P1, P2, P3 are arranged in order from left to right next to target image T; paragraph 28, 31). Further, Yoshida discloses wherein the operations further comprise comparing a current image captured by the client device with the set of images (paragraph 26; comparing of the image captured by the mobile device with images acquired from the server). Huang in view of Kumar does not specifically order the images in chronological order. However, the Examiner takes Official Notice of the fact that ordering images in chronological order was notoriously well known in the art before the effective filing date of the claimed invention. By ordering the images in chronological order on a display, a user can efficiently search the images based on their chronological order, such as the date of capture.
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Huang to display images in chronological order, since such ordering results in efficient searching and analysis of images in order of capture date.
Claim(s) 7, 8, 25, 30 is/are rejected under 35 U.S.C. 103 as being unpatentable over US 20140120887 to Huang in view of US 20230169625 to Kumar further in view of US 20150248783 to Fayle further in view of CN 113849734 to Ma.
Regarding claim 7, Huang discloses the method of claim 6, but does not disclose wherein the second section is configured to display information associated with the images in the chronological order.
Ma discloses wherein the second section is configured to display information associated with the images in the chronological order (paragraph 0030, 0071, 0075; third area (second section) includes comments (information associated) with the images in the first area arranged in a timeline (chronological order)).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the system of Huang as taught by Ma to provide an arrangement for displaying image-related information on the display interface.
The motivation to combine the references is to enhance how the user can interact with the displayed images by adding comments, which also allow the user to acquire additional information about the images (paragraph 45, 90).
Regarding claim 8, Ma discloses the method of claim 7, wherein the information associated with the images includes at least one of the following: a description, a link, a source, a comment, and metadata (paragraph 0030, 0071, 0075; third area (second section) includes comments (information associated) with the images in the first area arranged in a timeline (chronological order)).
Regarding claim 25, Fayle discloses the computing system of claim 20, wherein the user interface includes a first section and a second section, wherein the first section is configured to display images associated with the one or more items of interest (paragraph 14, 57, 61; Fig. 3C shows a display interface having a first section for displaying images F1a-F3a having dates associated with the descriptor of interest and a remaining section (second section)). Further, Ma discloses wherein the second section is configured to display information associated with the images in the chronological order (paragraph 0030, 0071, 0075; third area (second section) includes comments (information associated) with the images in the first area arranged in a timeline (chronological order)), and wherein the information associated with the images includes at least one of the following: a description, a link, a source, a comment, and metadata (paragraph 0030, 0071, 0075; third area (second section) includes comments (information associated) with the images in the first area arranged in a timeline (chronological order)).
Fayle does not specifically order the images in chronological order. However, the Examiner takes Official Notice of the fact that ordering images in chronological order was notoriously well known in the art before the effective filing date of the claimed invention. By ordering the images in chronological order on a display, a user can efficiently search the images based on their chronological order, such as the date of capture.
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Fayle to display images in chronological order, since such ordering results in efficient searching and analysis of images in order of capture date.
Regarding claim 30, Fayle discloses the non-transitory computer-readable medium of claim 26, wherein the operations further comprise: wherein the user interface includes a first section and a second section, wherein the first section is configured to display images associated with the one or more items of interest (paragraph 14, 57, 61; Fig. 3C shows a display interface having a first section for displaying images F1a-F3a having dates associated with the descriptor of interest and a remaining section (second section)). Further, Ma discloses wherein the second section is configured to display information associated with the images in the chronological order (paragraph 0030, 0071, 0075; third area (second section) includes comments (information associated) with the images in the first area arranged in a timeline (chronological order)), and wherein the information associated with the images includes at least one of the following: a description, a link, a source, a comment, and metadata (paragraph 0030, 0071, 0075; third area (second section) includes comments (information associated) with the images in the first area arranged in a timeline (chronological order)). Fayle does not specifically order the images in chronological order.
However, the Examiner takes Official Notice of the fact that ordering images in chronological order was notoriously well known in the art before the effective filing date of the claimed invention. By ordering the images in chronological order on a display, a user can efficiently search the images based on their chronological order, such as the date of capture.
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Fayle to display images in chronological order, since such ordering results in efficient searching and analysis of images in order of capture date.
Other Prior Art Cited
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
US 20170256040 to Grauer.
US 20130113827 to Forutanpour.
M. Kasahara, K. Takano and K. F. Li, "A Personalized Learning System with an AR Augmented Reality Browser for Ecosystem Fieldwork," 2014 IEEE 28th International Conference on Advanced Information Networking and Applications, Victoria, BC, Canada, 2014, pp. 89-97.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to BENIYAM MENBERU whose telephone number is (571) 272-7465. The examiner can normally be reached on Monday-Friday, 10:00am-6:30pm.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Akwasi Sarpong can be reached on (571) 270-3438. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Any inquiry of a general nature or relating to the status of this application or proceeding should be directed to the customer service office whose telephone number is (571) 272-2600. The group receptionist number for TC 2600 is (571) 272-2600.
Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
For more information about the PAIR system, see <http://pair-direct.uspto.gov/>. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free).
Patent Examiner
Beniyam Menberu
/BENIYAM MENBERU/Primary Examiner, Art Unit 2681
02/05/2026