Prosecution Insights
Last updated: April 19, 2026
Application No. 17/689,906

AUGMENTED, VIRTUAL AND MIXED-REALITY CONTENT SELECTION & DISPLAY FOR CITY LIGHT

Non-Final OA §103
Filed
Mar 08, 2022
Examiner
YANG, YI
Art Unit
2616
Tech Center
2600 — Communications
Assignee
Techinvest Company Limited
OA Round
3 (Non-Final)
71%
Grant Probability
Favorable
3-4
OA Rounds
2y 9m
To Grant
88%
With Interview

Examiner Intelligence

Grants 71% — above average
71%
Career Allow Rate
295 granted / 415 resolved
+9.1% vs TC avg
Strong +17% interview lift
+17.2%
Interview Lift
resolved cases with vs. without interview
Typical timeline
2y 9m
Avg Prosecution
39 currently pending
Career history
454
Total Applications
across all art units

Statute-Specific Performance

§101
7.4%
-32.6% vs TC avg
§103
76.0%
+36.0% vs TC avg
§102
2.7%
-37.3% vs TC avg
§112
3.3%
-36.7% vs TC avg
Black line = Tech Center average estimate • Based on career data from 415 resolved cases

Office Action

§103
DETAILED ACTION

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Continued Examination

A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 9/2/2025 has been entered. Claims 1-10 remain pending in the application.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102 of this title, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1-4 and 6-8 are rejected under 35 U.S.C. 103 as being unpatentable over Bates (U.S. Patent Application 20190205645) in view of Hodge (U.S. Patent Application 20200349666), and further in view of Urashita (U.S. Patent Application 20170011529).
Regarding claim 1, Bates discloses a method of presenting city light related information to a user comprising: enabling capture by a portable camera of a city light comprising a billboard or other two-dimensional lighted display or lighted rendering of an advertisement, a lighted city skyline or part of a lighted city skyline, an electronically illuminated jumbotron or other large lighted display, and/or a lighting pattern lighting a building, monument or other public object, to provide a captured image of the city light (paragraph [0004]: identifying an object within a field of view of an augmented reality device of a user... the field of view may be a room in which the user is present, or a street scene where a user is walking… The field of view could also be an image, such as a billboard, picture, magazine advertisement, etc. The augmented reality device may capture an image of such field of view); recognizing, as two-dimensional printed indicia, a dynamic city light pattern from the captured image; matching the recognized dynamic city light pattern with a data record, including accessing the data record over a network (paragraph [0004]: A media guidance application may process the image to detect an object in the image. The media guidance application may detect on the object a reference that may be related to a participant in an event. For example, the media guidance application may detect that there is a sports team logo in an image, such as on a billboard, vehicle, apparel, or another location in the image, that may be connected to a member of the sports team and the sports team may have a game or another upcoming event scheduled; paragraph [0005]: The media guidance application may search a profile of the user to find a connection between the user and the participant.
For example, the media guidance application may search a social network for a message by the user that relates to the participant); selecting a media item in response to the matching (paragraph [0101]: At step 1325, the control circuitry 1004 for the media guidance application may search a database to identify content that matches at least a portion of the image of the object); and superimposing a selected media item onto a display of the captured image or an image derived therefrom (paragraph [0105]: at step 1350, combine, using the augmented reality device, at a position relative to the identified location, the content of the given message with the reference detected on the object; paragraph [0034]: the media guidance application may generate an overlay of the message to be displayed over the image in the augmented reality environment; paragraph [0013]: a user may have previously posted to a social network about a player on a sports team, and such social media message may be superimposed in the augmented reality field of view over the reference object). Bates discloses all the features with respect to claim 1 as outlined above. However, Bates fails to disclose enabling capture of a city light in a darkened environment, the city light comprising a lighted billboard, or recognizing a light pattern comprising shades of grey. Hodge discloses enabling capture of a city light in a darkened environment, the city light comprising a lighted billboard (paragraph [0162]: forward-facing camera 214b on client device 101 uses image recognition software to detect billboards in the field of view; paragraph [0163]: choosing whether to light a billboard at night or change the content displayed on electronic billboards). Therefore, it would have been obvious before the effective filing date of the claimed invention to modify Bates to detect billboards at night, as taught by Hodge, in order to track the total count of views and time spent viewing a billboard efficiently.
Bates as modified by Hodge discloses all the features with respect to claim 1 as outlined above. However, Bates as modified by Hodge fails to disclose recognizing a light pattern comprising shades of grey. Urashita discloses recognizing a light pattern comprising shades of grey (paragraph [0072]: when the environment information collecting means 3 collects environment information of a night time, the parameter changing means 42 changes a sensitivity threshold so that a change of the minimum unit is recognized at a point when a white color turns to a light gray color (i.e., so that a change is detected by a little disparity). Consequently, it becomes possible to accurately detect a small change and, for example, it becomes possible to detect a person wearing navy clothes (dark gray) without any problem even when the person is moving at night). Therefore, it would have been obvious before the effective filing date of the claimed invention to modify Bates and Hodge to detect shades of grey, as taught by Urashita, in order to detect objects in all kinds of environments.

Regarding claim 2, Bates as modified by Hodge and Urashita discloses the method of claim 1 wherein the superimposing comprises using at least one of augmented reality, mixed reality and virtual reality (Bates’ paragraph [0034]: the media guidance application may generate an overlay of the message to be displayed over the image in the augmented reality environment; paragraph [0013]: a user may have previously posted to a social network about a player on a sports team, and such social media message may be superimposed in the augmented reality field of view over the reference object).
Therefore, it would have been obvious before the effective filing date of the claimed invention to modify Bates to detect billboards at night, as taught by Hodge, in order to track the total count of views and time spent viewing a billboard efficiently; and to modify Bates and Hodge to detect shades of grey, as taught by Urashita, in order to detect objects in all kinds of environments.

Regarding claim 3, Bates as modified by Hodge and Urashita discloses the method of claim 1 wherein recognizing comprises recognizing three-dimensional city light as two-dimensional printed grey scale indicia (Bates’ paragraph [0004]: A media guidance application may process the image to detect an object in the image; Hodge’s paragraph [0162]: forward-facing camera 214b on client device 101 uses image recognition software to detect billboards in the field of view; paragraph [0163]: choosing whether to light a billboard at night or change the content displayed on electronic billboards; Urashita’s paragraph [0072]: when the environment information collecting means 3 collects environment information of a night time, the parameter changing means 42 changes a sensitivity threshold so that a change of the minimum unit is recognized at a point when a white color turns to a light gray color (i.e., so that a change is detected by a little disparity). Consequently, it becomes possible to accurately detect a small change and, for example, it becomes possible to detect a person wearing navy clothes (dark gray) without any problem even when the person is moving at night). Therefore, it would have been obvious before the effective filing date of the claimed invention to modify Bates to detect billboards at night, as taught by Hodge, in order to track the total count of views and time spent viewing a billboard efficiently; and to modify Bates and Hodge to detect shades of grey, as taught by Urashita, in order to detect objects in all kinds of environments.

Regarding claim 4, Bates as modified by Hodge and Urashita discloses the method of claim 1 wherein the city light has no bar code or AR marker specially designed or intended to be recognized by an augmented reality display device (Bates’ paragraph [0004]: identifying an object within a field of view of an augmented reality device of a user... the field of view may be a room in which the user is present, or a street scene where a user is walking… The field of view could also be an image, such as a billboard, picture, magazine advertisement, etc. The augmented reality device may capture an image of such field of view; Hodge’s paragraph [0162]: forward-facing camera 214b on client device 101 uses image recognition software to detect billboards in the field of view). Therefore, it would have been obvious before the effective filing date of the claimed invention to modify Bates to detect billboards at night, as taught by Hodge, in order to track the total count of views and time spent viewing a billboard efficiently; and to modify Bates and Hodge to detect shades of grey, as taught by Urashita, in order to detect objects in all kinds of environments.

Regarding claim 6, Bates as modified by Hodge and Urashita discloses the method of claim 5 wherein the recognizing comprises recognizing a printed rendition of a city light (Bates’ paragraph [0004]: identifying an object within a field of view of an augmented reality device of a user... the field of view may be a room in which the user is present, or a street scene where a user is walking… The field of view could also be an image, such as a billboard, picture, magazine advertisement, etc. The augmented reality device may capture an image of such field of view; Hodge’s paragraph [0162]: forward-facing camera 214b on client device 101 uses image recognition software to detect billboards in the field of view). Therefore, it would have been obvious before the effective filing date of the claimed invention to modify Bates to detect billboards at night, as taught by Hodge, in order to track the total count of views and time spent viewing a billboard efficiently; and to modify Bates and Hodge to detect shades of grey, as taught by Urashita, in order to detect objects in all kinds of environments.

Regarding claim 7, Bates as modified by Hodge and Urashita discloses the method of claim 6 wherein the selected media item comprises a digital overlay that leads to specific action selected from the group consisting of providing specific information; displaying a video, displaying a tutorial, or displaying any kind of displayable content (Bates’ paragraph [0105]: at step 1350, combine, using the augmented reality device, at a position relative to the identified location, the content of the given message with the reference detected on the object; paragraph [0013]: a user may have previously posted to a social network about a player on a sports team, and such social media message may be superimposed in the augmented reality field of view over the reference object). Therefore, it would have been obvious before the effective filing date of the claimed invention to modify Bates to detect billboards at night, as taught by Hodge, in order to track the total count of views and time spent viewing a billboard efficiently; and to modify Bates and Hodge to detect shades of grey, as taught by Urashita, in order to detect objects in all kinds of environments.
Regarding claim 8, Bates as modified by Hodge and Urashita discloses the method of claim 1 wherein the superimposing is performed on a handheld display device, a user’s retina or smart glasses (Bates’ paragraph [0088]: A user may use various content capture devices, such as camcorders, digital cameras with video mode, audio recorders, mobile phones, and handheld computing devices, to record content; Hodge’s paragraph [0109]: client device 101 may project visual cues onto the windshield of the vehicle (similar to a heads-up-display HUD system) to help the driver safely locate his or her awaiting passenger). Therefore, it would have been obvious before the effective filing date of the claimed invention to modify Bates to detect billboards at night, as taught by Hodge, in order to track the total count of views and time spent viewing a billboard efficiently; and to modify Bates and Hodge to detect shades of grey, as taught by Urashita, in order to detect objects in all kinds of environments.

Claim 5 is rejected under 35 U.S.C. 103 as being unpatentable over Bates (U.S. Patent Application 20190205645) in view of Hodge (U.S. Patent Application 20200349666), in view of Urashita (U.S. Patent Application 20170011529), and further in view of MacIntosh (U.S. Patent Application 20170249491).

Regarding claim 5, Bates as modified by Hodge and Urashita discloses all the features with respect to claim 4 as outlined above. However, Bates as modified by Hodge and Urashita fails to disclose recognizing characters formed by the city light. MacIntosh discloses recognizing characters formed by the city light (paragraph [0148]: This image excerpt is passed to a text detector module that identifies at least one prominent alphabetic character. (Known OCR techniques can be used.)). Therefore, it would have been obvious before the effective filing date of the claimed invention to modify Bates, Hodge and Urashita to use OCR, as taught by MacIntosh, in order to assist the user efficiently.

Claims 9-10 are rejected under 35 U.S.C. 103 as being unpatentable over Bates (U.S. Patent Application 20190205645) in view of Hodge (U.S. Patent Application 20200349666), in view of Urashita (U.S. Patent Application 20170011529), and further in view of Wang (U.S. Patent Application 20150036678).

Regarding claim 9, Bates as modified by Hodge and Urashita discloses all the features with respect to claim 1 as outlined above. However, Bates as modified by Hodge and Urashita fails to disclose the media item comprises a call button. Wang discloses the media item comprises a call button (paragraph [0037]: an overlay display of a conference call button can be provided by the network (e.g. by a communication server within the network) to the client as an interactive audio-visual object). Therefore, it would have been obvious before the effective filing date of the claimed invention to modify Bates, Hodge and Urashita to use a call button, as taught by Wang, in order to enable a fast and homogeneous implementation of network interaction services on various different types of communication end devices.

Regarding claim 10, Bates as modified by Hodge, Urashita and Wang discloses the method of claim 1 further including displaying any or all of the following action buttons in any combination or subcombination: Price Tag, Photo Gallery, Videos, Description, Call, Mail, Shoplink, Explanation, Intro, Social Media links, Map, Discount codes, Reviews, Tutorials, Directions, Test drives, and/or Booking opportunities (Wang’s paragraph [0037]: an overlay display of a conference call button can be provided by the network (e.g. by a communication server within the network) to the client as an interactive audio-visual object). Therefore, it would have been obvious before the effective filing date of the claimed invention to modify Bates, Hodge and Urashita to use a call button, as taught by Wang, in order to enable a fast and homogeneous implementation of network interaction services on various different types of communication end devices.
Response to Arguments

Applicant's arguments filed 9/2/2025, page 5, with respect to the rejection(s) of claim(s) 1 under § 103 have been fully considered but are moot in view of the new ground(s) of rejection made under 35 U.S.C. 103 as being unpatentable over Bates (U.S. Patent Application 20190205645) in view of Hodge (U.S. Patent Application 20200349666), and further in view of Urashita (U.S. Patent Application 20170011529), as outlined above.

Applicant argues on page 5 that Bates et al. does not disclose capturing and recognizing a city light... There is no disclosure of how to recognize such a T-shirt and logos in a darkened environment, nor is a T-shirt a city light. In reply, the rejection is based on Bates, Hodge and Urashita combined. Bates discloses recognizing a city light pattern from the captured image (paragraph [0004]: A media guidance application may process the image to detect an object in the image. The media guidance application may detect on the object a reference that may be related to a participant in an event… such as on a billboard, vehicle, apparel). Hodge discloses enabling capture of a city light in a darkened environment, the city light comprising a lighted billboard (paragraph [0162]: forward-facing camera 214b on client device 101 uses image recognition software to detect billboards in the field of view; paragraph [0163]: choosing whether to light a billboard at night or change the content displayed on electronic billboards). Urashita discloses recognizing a light pattern comprising shades of grey (paragraph [0072]: when the environment information collecting means 3 collects environment information of a night time, the parameter changing means 42 changes a sensitivity threshold so that a change of the minimum unit is recognized at a point when a white color turns to a light gray color (i.e., so that a change is detected by a little disparity). Consequently, it becomes possible to accurately detect a small change and, for example, it becomes possible to detect a person wearing navy clothes (dark gray) without any problem even when the person is moving at night).

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to Yi Yang, whose telephone number is (571) 272-9589. The examiner can normally be reached Monday-Friday, 9:00 AM-6:00 PM EST. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Daniel Hajnik, can be reached at 571-272-7642. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free).

/YI YANG/
Primary Examiner, Art Unit 2616
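For readers mapping the rejection onto the claim, the method of claim 1 that the examiner reads onto Bates, Hodge and Urashita reduces to five steps: capture, recognize (as grey-scale indicia), match over a network, select, and superimpose. A minimal sketch of that pipeline follows; every type, function name, and value here is hypothetical, since the claim recites steps, not an implementation:

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical types standing in for the claimed inputs and outputs.
@dataclass
class CapturedImage:
    pixels: list      # grey-scale intensities from the portable camera
    low_light: bool   # darkened-environment capture (the Hodge limitation)

@dataclass
class MediaItem:
    kind: str         # e.g. "video", "call_button", "price_tag"
    payload: str

def recognize_pattern(img: CapturedImage) -> str:
    """Recognize the dynamic city-light pattern as two-dimensional printed
    indicia. Stub: quantize intensities into four shades of grey (standing
    in for the Urashita grey-scale limitation)."""
    return "".join(str(min(p // 64, 3)) for p in img.pixels)

def match_record(pattern: str, database: dict) -> Optional[dict]:
    """Match the recognized pattern with a data record; a local dict
    stands in for the claimed network lookup."""
    return database.get(pattern)

def select_media(record: dict) -> MediaItem:
    """Select a media item in response to the matching."""
    return MediaItem(kind=record["kind"], payload=record["payload"])

def superimpose(img: CapturedImage, item: MediaItem) -> str:
    """Superimpose the selected media item onto a display of the captured
    image (here, just a textual description of the overlay)."""
    return f"overlay[{item.kind}:{item.payload}]"

# Walk the five claimed steps end to end with toy data.
img = CapturedImage(pixels=[250, 180, 90, 10], low_light=True)
db = {"3210": {"kind": "call_button", "payload": "tel:+1-555-0100"}}
pattern = recognize_pattern(img)
record = match_record(pattern, db)
overlay = superimpose(img, select_media(record)) if record else None
```

The overlay step is where the claim 9-10 limitations (call button and other action buttons) would attach, which is why the examiner brings in Wang only at that point in the combination.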

Prosecution Timeline

Mar 08, 2022
Application Filed
Sep 06, 2024
Non-Final Rejection — §103
Mar 10, 2025
Response Filed
Mar 26, 2025
Final Rejection — §103
Sep 02, 2025
Response after Non-Final Action
Sep 30, 2025
Request for Continued Examination
Oct 05, 2025
Response after Non-Final Action
Nov 25, 2025
Non-Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12586304
PROGRAM, INFORMATION PROCESSING METHOD, AND INFORMATION PROCESSING SYSTEM
2y 5m to grant • Granted Mar 24, 2026
Patent 12567129
Image Processing Method and Electronic Device
2y 5m to grant • Granted Mar 03, 2026
Patent 12561276
SYSTEMS AND METHODS FOR UPDATING MEMORY SIDE CACHES IN A MULTI-GPU CONFIGURATION
2y 5m to grant • Granted Feb 24, 2026
Patent 12541902
SIGN LANGUAGE GENERATION AND DISPLAY
2y 5m to grant • Granted Feb 03, 2026
Patent 12541896
COMPUTER-BASED CONTENT PERSONALIZATION OF A VISUAL DISPLAY
2y 5m to grant • Granted Feb 03, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

3-4
Expected OA Rounds
71%
Grant Probability
88%
With Interview (+17.2%)
2y 9m
Median Time to Grant
High
PTA Risk
Based on 415 resolved cases by this examiner. Grant probability derived from career allow rate.
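The headline projections follow directly from the examiner's career record shown above. A quick check of the arithmetic, assuming the interview lift is applied additively to the career allow rate (an assumption; the page does not state its model):

```python
# Figures taken from this page; the additive-lift model is an assumption.
granted, resolved = 295, 415        # examiner's career record
interview_lift = 17.2               # percentage-point lift with interview

allow_rate = 100 * granted / resolved         # career allow rate, in percent
with_interview = allow_rate + interview_lift  # additive-lift assumption

print(round(allow_rate))       # 71  (the "Grant Probability" figure)
print(round(with_interview))   # 88  (the "With Interview" figure)
```

Both rounded values reproduce the dashboard's 71% and 88% figures, so the numbers are at least internally consistent with a simple additive model.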
