DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
This action is in response to the Amendment filed on 11/04/2025.
Claims 1-20 are pending. Claims 1, 3, 4, 7, 8, 9, 10, 11, 13, 14, 17, 18, 19, 20 have been amended.
Continued Examination Under 37 CFR 1.114
A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 11/04/2025 has been entered.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-8 and 11-18 are rejected under 35 U.S.C. 103 as being unpatentable over Applefeld (US 8,606,645 B1) in view of Boncyk et al. (US 7,477,780 B2, hereinafter Boncyk), further in view of Rauschnabel (“Virtually enhancing the real world with holograms: An exploration of expected gratifications of using augmented reality smart glasses”, 20180106).
Regarding Claim 1, Applefeld teaches a method of presenting packaging information to a user comprising (Applefeld, Column 1, Line 29-35, Column 2, Line 15-17, According to one aspect of the invention, a system, method, and computer-readable medium is disclosed for providing an augmented reality application for various retail applications… the augmented reality retail application may be configured to detect the triggering feature on a shopping bag, retail packaging): enabling capture by a portable camera [[ carried by wearable eyeglass frames,]] of an image(s) of a packaging object (Applefeld, Column 9, Line 53-55, a background image may be captured with an image capture unit, such as with image capture unit 103. The image capture unit may be part of a mobile device Column 10, Line 9-11, a background image having one of the hang-tags, shopping bags, or product packaging is captured); with at least one processor, recognizing, in real time, the packaging object from an image(s) captured by the portable camera, the recognizing based only on an area of recognition part of the captured image packaging object and associated captured image(s) [[ thereof without requiring the captured image(s) to contain any 2D bar code, AR marker or other special recognition marker ]] (Applefeld, Column 1, Line 54-60, “The feature detection component or another component of the augmented reality retail application may be configured to detect from the background image a triggering feature that triggers presentation of an augmented-reality-enhanced view. 
According to one implementation, the triggering feature may be associated with a retail product or any other retail item, such as product packaging, a shopping bag”; Column 10, Line 5-15, "a feature detection component may execute an image recognition algorithm that detects a recognizable feature from the background image," such as "a retailer's logo" printed on product packaging or shopping bags; Column 6, Line 53-55, “The computing devices may include a mobile device… digital camera”) matching the recognized packaging object with a data record stored in a database (Applefeld, Column 16, Line 5-10, the marketing information may present to a user retail products having a brand or price that matches a consumer preference or habit of the user. For example, if the database indicates that the user has recently bought a coat, the marketing information may display a matching scarf <read on matching the recognized object> that is available for purchase; Column 8, Line 37-39, “Module 111 may retrieve … information … from a database stored on a server.”); selecting an interactive media item in response to the recognition and/or the matching (Applefeld, Column 11, Line 54-56, a fashion show video <read on interactive media> of dresses from the retailer may be displayed if a purchased dress is recognized in the background image; it is noted the video is selected dynamically based on context and action so it is interactive media); with an electronic display device [[ carried by the wearable eyeglass frames ]] , superimposing the selected interactive media item [[ onto a real world view through the wearable eyeglass frames ]] of the packaging object (Applefeld, Column 6, Line 28-36, “The augmented reality retail application may present on user presentation device 105 of user interface device 100 an augmented-reality-enhanced view that overlays multimedia over a background image captured by image capture unit 103. 
In one example, the background image may contain an image of a retail item, such as a retail product, product packaging, a shopping bag, hang-tag, catalog, magazine, billboard, or gift card. The multimedia overlaid on the background image may be related to the retail item”).
However, Applefeld does not explicitly disclose [[ a portable camera ]] carried by wearable eyeglass frames… without requiring the captured image(s) to contain any 2D bar code, AR marker or other special recognition marker;
But, Boncyk teaches the recognizing based only on an area of recognition part of the captured image packaging object and associated captured image(s) (Boncyk, Column 2, Line 15-17, capturing imagery of the objects and then identifying the objects via image recognition performed on a local or remote computer) thereof without requiring the captured image(s) to contain any 2D bar code, AR marker or other special recognition marker (Boncyk, Column 2, Line 14-17, “capturing imagery of the objects and then identifying the objects via image recognition performed on a local or remote computer”; Column 1, Line 22-28, “The detection, identification, determination of position and orientation, and subsequent information provision and communication must occur without modification or disfigurement of the object, without the need for any marks, symbols, codes, barcodes, or characters on the object”).
Boncyk and Applefeld are analogous since both deal with processing objects captured in an image and providing associated information to a user. Applefeld provided a way of recognizing an object from an image and superimposing content (e.g., a fashion show video, marketing information) on the image in an augmented reality environment. Boncyk provided a way to automatically recognize the object from within the captured image, based on image recognition of the object region, without relying on additional aids such as markers, barcodes, or codes. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the automatic markerless object recognition taught by Boncyk into the invention of Applefeld such that, during the processing of image data in the augmented reality environment, the system is able to automatically identify the object of interest from the captured image, enhancing system functionality and providing a more user-friendly system.
However, the combination of Applefeld and Boncyk does not explicitly disclose [[ a portable camera ]] carried by wearable eyeglass frames.
However, Rauschnabel teaches recognizing an object in real time from an image(s) captured by the portable camera (Rauschnabel, Page 1, 5, “ARSGs can provide relevant information in real time”; “smart glasses (ARSGs) offers users the opportunity to integrate three-dimensional, virtual elements realistically and in real time into their view field”), enabling capture, by a portable camera carried by wearable eyeglass frames, of an image(s) of an object (Rauschnabel, Page 2, Section 2.1 "Augmented and virtual reality, and ARSGs," "ARSGs ... are worn like regular glasses and integrate virtual information realistically into the user's view field ... through various sensors (e.g., cameras, GPS, microphone) that capture the real world," indicating that the camera is carried by the eyeglass frames and captures the real-world scene), with an electronic display device carried by the wearable eyeglass frames, superimposing the selected interactive media item onto a real world view through the wearable eyeglass frames of the object (Rauschnabel, Page 1, Abstract; Page 2, Section 2.1, "Augmented reality smart glasses (ARSG) ... allow users to augment and enhance their subjective perceptions of reality" and "they are worn like regular glasses and integrate virtual information realistically into the user's view field," which corresponds to an electronic display device in the eyeglass frames that superimposes virtual or interactive media onto the user's real-world view through the glasses).
Rauschnabel and Applefeld are analogous since both are directed to augmented reality systems that overlay virtual information on real-world views to provide enhanced content to a user. Applefeld provided a way of capturing images of retail items (including product packaging) on a user device and superimposing multimedia content (such as a fashion show video or marketing information) as an augmented-reality-enhanced view over the captured background image. Rauschnabel provided a detailed description of augmented reality smart glasses worn like regular glasses, including cameras and displays that integrate virtual information realistically into the user's view field, thereby augmenting and enhancing the user's perception of reality. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the AR smart glasses taught by Rauschnabel into the modified invention of Applefeld's augmented reality retail application such that the camera and display are carried by wearable eyeglass frames and the augmented multimedia (interactive media item) is superimposed onto the user's real-world view through those frames, thereby improving hands-free usability and user immersion in a familiar eyeglass form factor.
Regarding Claim 2, the combination of Applefeld and Boncyk and Rauschnabel teaches the invention in Claim 1.
The combination further teaches wherein the superimposing comprises using at least one of augmented reality, mixed reality and virtual reality (Applefeld, Column 2, Line 26-29, the augmented reality retail application may be configured to present an augmented-reality-enhanced view that overlays multimedia, such as a video, on the background image).
Regarding Claim 3, the combination of Applefeld and Boncyk and Rauschnabel teaches the invention in Claim 1.
The combination further teaches wherein recognizing is performed remotely [[ of the wearable eyeglass frames ]] and comprises recognizing an area of recognition disposed on at least one surface of a two-dimensional or three-dimensional packaging object with print on it (Applefeld, Column 3, Line 2-6, Column 10, Line 4-15, a retailer's logo may be used as a marker and may be printed on hangtags, shopping bags, or product packaging… a 3D object such as a cube, sphere, or any other multi-faceted 3D object, and may present on a portion of the 3D object's surface one of the multiple views of the retail product; Column 5, Line 1-5, “Each view may be placed on a portion of a surface of a 3D object”).
Applefeld does not explicitly disclose, but Rauschnabel teaches, the wearable eyeglass frames (Rauschnabel, Page 2, Section 2.1 "Augmented and virtual reality, and ARSGs," "ARSGs ... are worn like regular glasses and integrate virtual information realistically into the user's view field ... through various sensors (e.g., cameras, GPS, microphone) that capture the real world," indicating that the camera is carried by the eyeglass frames and captures the real-world scene).
As explained in the rejection of Claim 1, the rationale for combining the wearable eyeglass frames of Rauschnabel with Applefeld is provided above.
Regarding Claim 4, the combination of Applefeld and Boncyk and Rauschnabel teaches the invention in Claim 1.
The combination further teaches wherein the packaging object has indicia printed thereon, and the recognizing comprises recognizing at least some printed indicia (Applefeld, Column 10, Line 7-11, a retailer's logo <read on printed indicia> may be used as a marker and may be printed on hangtags, shopping bags, or product packaging. When a background image having one of the hang-tags, shopping bags, or product packaging is captured, the logo may be detected).
Regarding Claim 5, the combination of Applefeld and Boncyk and Rauschnabel teaches the invention in Claim 4.
The combination further teaches wherein the recognizing includes recognizing characters printed on the packaging object (Applefeld, Column 2, Line 22-25, An object associated with the triggering feature may include, for example, a retail product, retailer logo, text on a hang-tag, or any other object in the background image from which the triggering feature was detected).
Regarding Claim 6, the combination of Applefeld and Boncyk and Rauschnabel teaches the invention in Claim 5.
The combination further teaches wherein the packaging object comprises a printed product container (Applefeld, Column 10, Line 7-9, a retailer's logo may be used as a marker and may be printed on hang-tags <read on product container>, shopping bags, or product packaging).
Regarding Claim 7, the combination of Applefeld and Boncyk and Rauschnabel teaches the invention in Claim 6.
The combination further teaches wherein the selected interactive media item comprises a digital overlay configured to lead to a specific action selected from the group consisting of providing specific information; a video, tutorial, or any kind of displayable content (Applefeld, Abstract, Column 2, Line 34-37, “the augmented reality retail application is configured to overlay a fashion show video on the background image as part of an augmented-reality-enhanced view… The action performed may include presenting information on the retail product or facilitating a transaction for the retail product”; it is noted the video is selected dynamically based on context and action so it is interactive media).
Regarding Claim 8, the combination of Applefeld and Boncyk and Rauschnabel teaches the invention in Claim 1.
The combination further teaches wherein the superimposing is performed [[ on a smart glasses ]] (Applefeld, Column 6, Line 28-32, The augmented reality retail application may present on user presentation device 105 of user interface device 100 an augmented-reality-enhanced view that overlays multimedia over a background image captured by image capture unit 103).
However, Applefeld does not explicitly disclose that the superimposing is performed on smart glasses.
However, Rauschnabel teaches the superimposing is performed on smart glasses (Rauschnabel, Page 2, Section 2.1, "ARSGs ... are worn like regular glasses and integrate virtual information realistically into the user's view field ... equipped with sensors (e.g., cameras) that capture the real world," <read on smart glasses on which augmented content is superimposed in the user's view>).
Rauschnabel and Applefeld are analogous since both disclose augmented-reality systems overlaying digital information onto a user's view. Applefeld provided a way of capturing images of retail packaging on a handheld device and superimposing multimedia content related to the recognized item. Rauschnabel provided a way of presenting augmented-reality content through smart glasses equipped with cameras and see-through displays. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the smart-glasses platform taught by Rauschnabel into the modified invention of Applefeld such that the superimposing of multimedia information is performed on smart glasses worn by the user. The motivation is to provide a more natural, hands-free augmented-reality experience.
Regarding Claim 11, it recites limitations similar in scope to the limitations of Claim 1, but in a system. As shown in the rejection above, the combination of Applefeld, Boncyk, and Rauschnabel teaches the limitations of Claim 1. Additionally, Applefeld discloses a system as shown in Fig. 1 and Column 6, Line 25-40 (Applefeld, Column 6, Line 25-40, According to one aspect of the invention, FIG. 1 illustrates an example system, method, and computer-readable medium for providing an augmented reality retail application for various retail applications). Thus, Claim 11 is met according to the mapping presented in the rejection of Claim 1, given that the claimed system corresponds to the method.
Regarding Claim 12, it recites limitations similar in scope to the limitations of Claim 2 and therefore is rejected under the same rationale.
Regarding Claim 13, it recites limitations similar in scope to the limitations of Claim 3 and therefore is rejected under the same rationale.
Regarding Claim 14, it recites limitations similar in scope to the limitations of Claim 4 and therefore is rejected under the same rationale.
Regarding Claim 15, it recites limitations similar in scope to the limitations of Claim 5 and therefore is rejected under the same rationale.
Regarding Claim 16, it recites limitations similar in scope to the limitations of Claim 6 and therefore is rejected under the same rationale.
Regarding Claim 17, it recites limitations similar in scope to the limitations of Claim 7 and therefore is rejected under the same rationale.
Regarding Claim 18, it recites limitations similar in scope to the limitations of Claim 8 and therefore is rejected under the same rationale.
Claims 9-10 and 19-20 are rejected under 35 U.S.C. 103 as being unpatentable over Applefeld (US 8,606,645 B1) in view of Boncyk et al. (US 7,477,780 B2, hereinafter Boncyk), further in view of Rauschnabel (“Virtually enhancing the real world with holograms: An exploration of expected gratifications of using augmented reality smart glasses”, 20180106), as applied to Claim 1 above, and further in view of Carre et al. (US 20150348329 A1, hereinafter Carre).
Regarding Claim 9, the combination of Applefeld and Boncyk and Rauschnabel teaches the invention in Claim 1.
The combination further teaches wherein the selected media item comprises a [[ call ]] icon (Applefeld, Column 13, Line 17-21, Column 15, Line 57-59, The gift box multimedia may be an image, video, or animation of a gift box, and may be overlaid on the background image…an augmented-reality-enhanced view displayed on the touch screen…it may be received over where an image of a retail product is displayed or where an image of a button or other icon is displayed; it is noted the button on the touch screen is not a physical button but an icon as well).
But Applefeld does not explicitly disclose [[ wherein the selected media item comprises a ]] call [[ icon ]].
However, Carre teaches the selected media item comprises a call icon (Carre, Paragraph [0044], The ghost image is overlaid in the camera preview of the application…[0039], The content…might also include a transparent user interface that proposes one or more choices of actions or provides actionable buttons to the user…[0040], Augmented Reality Event (ARE) including data for the application to interact with the user…an instruction for a user action…a telephone number to call).
Carre and Applefeld are analogous since both deal with processing data in an augmented reality environment. Applefeld provided a way of recognizing an object from an image and superimposing an action button on the image in the augmented reality environment. Carre provided a way of overlaying a call button on the image while dealing with objects in the image in the augmented reality environment. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the call-button overlay taught by Carre into the modified invention of Applefeld such that, when dealing with data in the augmented reality environment, the system is able to provide a phone-call button, thereby providing a more user-friendly interface when using the system.
Regarding Claim 10, the combination of Applefeld and Boncyk and Rauschnabel teaches the invention in Claim 1.
The combination does not explicitly disclose, but Carre teaches, displaying any or all of the following action items in any combination or subcombination (Carre, Paragraph [0039], The content … might also include a transparent user interface that proposes one or more choices of actions or provides actionable buttons to the user; [0040], data associated with an Augmented Reality Event (ARE) including data for the application to interact with the user):
[[ Price Tag, Photo Gallery, ]] Videos (Carre, Paragraph [0040], data for the application to interact with the user (e.g., a video)), [[ Description, ]] Call (Carre, Paragraph [0040], a telephone number to call), [[ Mail, Shop link, Explanation, Intro, Social Media links, List of ingredients ]] Promo Code (Carre, Paragraph [0040], a coupon), [[ Other Products from this product line, Environmental facts and figures, Contest, ]] Rewards, collection of points (Carre, Paragraph [0040], a coupon), [[ Info about production, Sustainability, Allergies, ]] Reviews (Carre, Paragraph [0059], a consumer with usability content or entertainment/creative animation on an object or a brand logo and the ability to provide reviews), and/or [[ Translation ]].
Carre and Applefeld are analogous since both deal with processing data in an augmented reality environment. Applefeld provided a way of recognizing an object from an image and superimposing an action button on the image in the augmented reality environment. Carre provided a way of overlaying buttons with multiple functions, such as videos, call, and coupon code, on the image while dealing with objects in the image in the augmented reality environment. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the multiple-function action buttons taught by Carre into the modified invention of Applefeld such that, when dealing with data in the augmented reality environment, the system is able to provide additional functionalities, thereby providing a more user-friendly interface when using the system.
Applefeld does not explicitly disclose, but Rauschnabel teaches, that the action items include Translation (Rauschnabel, Page 2, “for example, with an AR translation app a user can hold a smartphone over any foreign language text to read a translation automatically”).
Rauschnabel and Applefeld are analogous since both are directed to augmented reality systems that overlay virtual information on real-world views to provide enhanced content to a user. Applefeld provided a way of capturing images of retail items (including product packaging) on a user device and superimposing multimedia content (such as a fashion show video or marketing information) as an augmented-reality-enhanced view over the captured background image. Rauschnabel provided a detailed description of augmented reality smart glasses worn like regular glasses, including cameras and displays that integrate virtual information realistically into the user's view field and provide additional functions such as translation. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the AR smart glasses with a translation function taught by Rauschnabel into the modified invention of Applefeld's augmented reality retail application such that the camera and display are carried by wearable eyeglass frames and the augmented multimedia (interactive media item) supports multiple functions such as translation through those frames, thereby providing a more user-friendly wearable experience.
Regarding Claim 19, it recites limitations similar in scope to the limitations of Claim 9 and therefore is rejected under the same rationale.
Regarding Claim 20, it recites limitations similar in scope to the limitations of Claim 10 and therefore is rejected under the same rationale.
Response to Arguments
Applicant’s arguments with respect to claim 1 filed on 04/21/2025, with respect to the rejection under 35 USC § 103, have been considered but are moot in view of the new ground(s) of rejection. The limitations at issue are now taught by the combination of Applefeld, Boncyk, and Rauschnabel.
In regard to Claims 2-10 and 12-20, they depend directly or indirectly on independent Claims 1 and 11, respectively. Applicant does not present arguments beyond those directed to independent Claims 1 and 11. The limitations in those claims are taught by the combination previously established, as explained above.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to YUJANG TSWEI whose telephone number is (571)272-6669. The examiner can normally be reached 8:30am-5:30pm EST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Kent Chang can be reached on (571)272-7667. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/YuJang Tswei/Primary Examiner, Art Unit 2614