Prosecution Insights
Last updated: April 19, 2026
Application No. 18/111,394

ELECTRONIC DEVICE AND METHOD OF DISPLAYING EXTERNAL DEVICE CORRESPONDING TO OPERATION METHOD OF THE ELECTRONIC DEVICE

Final Rejection — §102, §103
Filed
Feb 17, 2023
Examiner
ZHANG, YINGCHUAN
Art Unit
3715
Tech Center
3700 — Mechanical Engineering & Manufacturing
Assignee
Samsung Electronics Co., Ltd.
OA Round
4 (Final)
Grant Probability: 2% (At Risk)
Expected OA Rounds: 5-6
Median Time to Grant: 3y 11m
Grant Probability With Interview: 8%

Examiner Intelligence

Career Allow Rate: 2% — grants only 2% of cases (4 granted / 175 resolved; -67.7% vs TC avg)
Interview Lift: +5.9% among resolved cases with interview (a moderate, roughly +6% lift)
Avg Prosecution: 3y 11m (typical timeline)
Career History: 203 total applications across all art units; 28 currently pending

Statute-Specific Performance

§101: 22.3% (-17.7% vs TC avg)
§103: 37.0% (-3.0% vs TC avg)
§102: 17.7% (-22.3% vs TC avg)
§112: 18.8% (-21.2% vs TC avg)
Tech Center averages are estimates • Based on career data from 175 resolved cases
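A quick consistency check on the deltas: each "vs TC avg" figure is the examiner's rate minus the Tech Center average, and every row above implies the same baseline. A minimal sketch in Python (figures copied from the table; the single-baseline reading is an inference, not a sourced TC statistic):

```python
# Implied Tech Center average per statute: examiner_rate - delta.
# Figures are copied from the table above; the output is what they imply.
examiner_rate = {"101": 22.3, "103": 37.0, "102": 17.7, "112": 18.8}
delta_vs_tc = {"101": -17.7, "103": -3.0, "102": -22.3, "112": -21.2}

for statute in examiner_rate:
    implied_avg = examiner_rate[statute] - delta_vs_tc[statute]
    print(f"§{statute}: implied TC avg {implied_avg:.1f}%")
# Every statute implies the same 40.0% baseline, suggesting a single
# center-wide estimate is used for all four comparisons.
```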

Office Action

§102, §103
DETAILED ACTION

This action is in response to the claim amendments received 02/03/2026. Claims 1, 2, 4, 5, 7-9, 11-13, 15, 16, 18, 20 and 21 are pending, with claims 1, 12, 13, 15 and 20 currently amended, claims 10 and 19 cancelled, and claim 21 newly added.

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless – (a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.

Claims 1, 2, 4, 5, 7, 8, 11-13, 15, 16, 18, 20 and 21 are rejected under 35 U.S.C. 102(a)(2) as being anticipated by Stoyles et al. [US11762620], hereinafter Stoyles.

Regarding claim 1, Stoyles discloses a method of operating an electronic device, the method comprising: identifying, from among a plurality of preset operation modes, a first operation mode of the electronic device that is currently operating on the electronic device (col. 2, lines 30-38, "A computer-generated reality environment (e.g., virtual reality or mixed reality environment) can have varying degrees of virtual content and/or physical content. A computer-generated reality environment can provide an intuitive interface for a user to interact with his/her physical environment. For example, using a reality interface that displays a representation of the user's physical environment, a user can access the functions of one or more external devices in the physical environment"); requesting, to an external device, information about a device type, and receiving, from the external device, the information about the device type; identifying the device type of the external device based on the received information about the device type (col. 14, line 64 - col. 15, line 2, "Specifically, upon detecting the one or more external devices, the user device sends a request to the one or more external devices which, when received by the one or more external devices, causes the one or more external devices to provide information specifying the device type and/or functions to the user device"); determining, based on the identified device type of the external device, whether the external device corresponds to the identified first operation mode (col. 2, lines 45-49, "Specifically, the user device providing the reality interface would need to recognize that a particular object represented in the reality interface corresponds to a respective external device detected in the physical environment" and col. 19, lines 31-37, "In particular, if image recognition is used to determine whether the image data includes a representation of the first external device, then only the stored images corresponding to the identification information (e.g., corresponding to the same device identifier, device type, and/or device function) of the one or more detected external device are compared with the image data."); displaying identification information about the external device, based on the identified device type of the external device corresponding to the identified first operation mode (col. 2, line 64 - col. 3, line 1, "In accordance with determining that the image data includes a representation of the first external device, a representation of the physical environment and an affordance corresponding to a function of the first external device are concurrently displayed"); and accessing the external device in response to an input selecting identification information about the external device (col. 3, lines 1-4, "The displayed affordance is configured such that user activation of the affordance causes the first external device to perform an action corresponding to the function"),

wherein the displaying the identification information about the external device comprises: identifying, from among a plurality of panels respectively corresponding to the plurality of preset operation modes, a panel of a display corresponding to the identified first operation mode (Figs. 3C-3F), and displaying, on the identified panel, the identification information about the external device and second identification information about at least one second external device corresponding to the identified first operation mode (col. 16, lines 21-37, "FIG. 3E is illustrative of block 410. As shown, representation 320 of physical environment 302 is displayed on user device 312 (e.g., according to block 406). In this embodiment, representation 320 has a larger field of view compared to representation 314 depicted in FIG. 3B. In particular, representation 320 includes representations of devices 304 and 306. In this embodiment, process 400 determines that the extrapolated visual axes of the user's eyes intersect with a plane of representation 320 at the region defined by dotted line 322. Thus, in this embodiment, the portion of representation 320 defined by dotted line 322 is the region of interest. In some embodiments, the determined region of interest is used to disambiguate between two or more possible electronic devices in the field of view of representation 320. Specifically, in these embodiments, based on the determined region of interest, it can be determined that the user intends to access the function of device 304, and not device 306" and col. 16, line 61 - col. 17, line 1, "The determination of block 412 serves to map one or more of the detected external devices of block 402 to one or more represented objects in the displayed representation of block 406. In this way, the specific external device(s) associated with functions the user wishes to access can be identified and thus suitable communication can be established with the external device(s) to obtain access to its functions").
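To make the claim 1 mapping easier to follow, the sketch below restates the recited steps in code form. It is purely a reading aid: the mode-to-device-type mapping and all names are hypothetical (the "DISPLAY01"/"TELEVISION" example comes from Stoyles' identification-information passage, and the movie/game/music modes from claim 7), not an implementation from the application or the reference.

```python
# Hypothetical sketch of the claim 1 flow; names and mappings are illustrative.
# One display panel per preset operation mode; a mode "corresponds" to a set
# of device types (the mapping itself is an assumption for this sketch).
MODE_DEVICE_TYPES = {
    "movie": {"TELEVISION", "SOUND_BAR"},
    "game": {"GAME_CONSOLE", "TELEVISION"},
    "music": {"AUDIO_PLAYER", "SOUND_BAR"},
}

def devices_for_mode(current_mode, external_devices):
    """Identify the active preset mode's device types, then keep external
    devices whose requested/received type corresponds to that mode."""
    allowed = MODE_DEVICE_TYPES[current_mode]
    return [dev for dev in external_devices if dev["type"] in allowed]

# Identification info for every matching device is then shown on the panel
# mapped to the active mode; selecting an entry triggers access to that device.
external = [
    {"name": "DISPLAY01", "type": "TELEVISION"},
    {"name": "LIGHT02", "type": "LIGHT_BULB"},
]
print(devices_for_mode("game", external))  # only DISPLAY01 corresponds
```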
Regarding claim 2, Stoyles discloses the method of claim 1, further comprising obtaining information about the external device comprising at least one of information about a device name of the external device, information about a user input related to the device type of the external device, and information about content reproduced in the electronic device (col. 11, lines 33-38, "the identification information also includes information specifying the device type and/or the function(s) offered by the respective external device. In a specific embodiment, the identification information received from external device 304 includes the identifier "DISPLAY01," the device type "TELEVISION," and the function "ON/OFF"").

Regarding claim 4, Stoyles discloses the method of claim 1, wherein the determining whether the external device corresponds to the identified first operation mode comprises identifying the external device, having the device type corresponding to the identified first operation mode, based on the identified device type of the external device (col. 2, lines 45-49, "Specifically, the user device providing the reality interface would need to recognize that a particular object represented in the reality interface corresponds to a respective external device detected in the physical environment").

Regarding claim 5, Stoyles discloses the method of claim 1, wherein the determining whether the external device corresponds to the identified first operation mode further comprises identifying the external device corresponding to a device list previously stored with respect to the identified first operation mode, based on information about a device name of the external device (col. 17, lines 16-23, "The plurality of stored images are stored, for example, in a database. Each stored image of the plurality of stored images corresponds to a respective external device. For example, an index of the database associates each stored image with a respective external device. Specifically, the index maps each stored image to a respective identifier, device type, and/or device function of a respective external device").

Regarding claim 7, Stoyles discloses the method of claim 1, wherein the plurality of preset operation modes include at least one of a movie mode, a game mode, and a music mode (col. 10, lines 2-5, "Electronic devices 228, 230, and 232 can include any type of remotely controlled electronic device, such as a light bulb, garage door, door lock, thermostat, audio player, television, or the like").

Regarding claim 8, Stoyles discloses the method of claim 1, wherein the determining whether the external device corresponds to the identified first operation mode further comprises identifying the external device, corresponding to the identified first operation mode, based on whether a function related to the identified first operation mode is performed in the electronic device (col. 2, line 64 - col. 3, line 4, "In accordance with determining that the image data includes a representation of the first external device, a representation of the physical environment and an affordance corresponding to a function of the first external device are concurrently displayed. The displayed affordance is configured such that user activation of the affordance causes the first external device to perform an action corresponding to the function").
Regarding claim 11, Stoyles discloses the method of claim 1, wherein the displaying the identification information about the external device further comprises displaying at least one of an application, a webpage, and a menu function corresponding to the identified first operation mode, based on history information about the identified first operation mode (Figs. 3C-3F).

Regarding claims 12, 13, 15, 16 and 18, please refer to the claim rejections of claims 1, 2, 4, 5 and 8.

Regarding claim 20, please refer to the claim rejection of claim 1.

Regarding claim 21, Stoyles discloses the electronic device of claim 12, wherein the identified panel comprises a user interface for providing at least one of: a list of external devices usable in the first operation mode (col. 15, lines 20-23, "At block 406, a representation (e.g., representation 314) of the physical environment is displayed (e.g., on the display of the user device) according to the obtained image data of block 404" and col. 16, lines 21-37, "FIG. 3E is illustrative of block 410. As shown, representation 320 of physical environment 302 is displayed on user device 312 (e.g., according to block 406). In this embodiment, representation 320 has a larger field of view compared to representation 314 depicted in FIG. 3B. In particular, representation 320 includes representations of devices 304 and 306. In this embodiment, process 400 determines that the extrapolated visual axes of the user's eyes intersect with a plane of representation 320 at the region defined by dotted line 322. Thus, in this embodiment, the portion of representation 320 defined by dotted line 322 is the region of interest. In some embodiments, the determined region of interest is used to disambiguate between two or more possible electronic devices in the field of view of representation 320. Specifically, in these embodiments, based on the determined region of interest, it can be determined that the user intends to access the function of device 304, and not device 306"); or a menu or items used in the first operation mode.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claim 9 is rejected under 35 U.S.C. 103 as being unpatentable over Stoyles, in view of JEONG [US20220319458].

Regarding claim 9, Stoyles discloses the method of claim 8. However, Stoyles does not explicitly disclose wherein, when the identified first operation mode is a game mode, the identifying the external device corresponding to the identified first operation mode comprises identifying the external device, corresponding to the game mode, based on at least one of whether a type of content received from the external device is game content, and whether at least one of an auto low latency mode (ALLM) function, a variable refresh rate (VRR) function, and a Free Sync function are performed in the electronic device.
Nevertheless, JEONG teaches, in a like invention, that when the identified first operation mode is a game mode, the identifying the external device corresponding to the identified first operation mode comprises identifying the external device, corresponding to the game mode, based on at least one of whether a type of content received from the external device is game content, and whether at least one of an auto low latency mode (ALLM) function, a variable refresh rate (VRR) function, and a Free Sync function are performed in the electronic device ([0062], "An external device connectable to the external device interface 135 can be one of a set-top box, a Blu-ray player, a DVD player, a game console, a sound bar, a smartphone, a PC, a USB Memory, and a home theater system but this is just exemplary" and [0160], "The controller 170 may perform the VRR function under a specific condition. For example, the controller 170 may perform the VRR function in the game mode. In addition, the controller 170 may perform the VRR function, when receiving an image signal from an external device through the external device interface 135. In addition, the controller 170 may perform the VRR function, when determining a frame per second (FPS) of the image signal as not being constant, based on a processing result of the image signal."). Thus, it would have been obvious to one having ordinary skill in the art before the time the invention was effectively filed to have modified the method disclosed by Stoyles to include the game content and related functions from the external device, as taught by JEONG, in order to provide more functions from the generated reality interface, allowing the user multiple different uses and greater enjoyment.

Response to Arguments

Applicant's arguments filed 02/03/2026 have been fully considered but they are not persuasive.

With respect to the claim rejections, applicant first argues: "Applicant submits that the computer-generated reality (CGR) environment is not selected or identified as one of multiple preset operation modes. Thus, Applicant submits Stoyles does not disclose or suggest 'identifying, from among a plurality of preset operation modes, a first operation mode of the electronic device that is currently operating on the electronic device,' as recited in claim 1" (p. 10). Examiner respectfully submits that Stoyles discloses "A computer-generated reality environment (e.g., virtual reality or mixed reality environment) can have varying degrees of virtual content and/or physical content" (col. 2, lines 30-32). These "varying degrees of virtual content and/or physical content" teach "a plurality of preset operation modes," and it is only necessary to identify the external device in the physical environment under a mixed reality environment. Thus, "identifying, from among a plurality of preset operation modes, a first operation mode of the electronic device that is currently operating on the electronic device," as recited in claim 1, is implied.

Applicant further argues: "Stoyles does not disclose or suggest 'determining, based on the identified device type of the external device, whether the external device corresponds to the identified first operation mode,' as recited in claim 1" (p. 10). Examiner respectfully disagrees. Examiner respectfully submits that Stoyles teaches this feature at col. 19, lines 31-37: "In particular, if image recognition is used to determine whether the image data includes a representation of the first external device, then only the stored images corresponding to the identification information (e.g., corresponding to the same device identifier, device type, and/or device function) of the one or more detected external device are compared with the image data."

Applicant then argues: "Stoyles does not disclose or suggest 'wherein the displaying the identification information about the external device comprises: identifying, from among a plurality of panels respectively corresponding to the plurality of preset operation modes, a panel of a display corresponding to the identified first operation mode, and displaying, on the identified panel, the identification information about the external device and second identification information about at least one second external device corresponding to the identified first operation mode,' as recited in claim 1" (pp. 10-11). Examiner respectfully submits that the teaching for this newly amended feature is set out in the rejection above.

Conclusion

THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to YINGCHUAN ZHANG whose telephone number is (571) 272-1375. The examiner can normally be reached 8:00 - 4:30 M-F. Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Xuan Thai, can be reached at (571) 272-7147. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/YINGCHUAN ZHANG/
Primary Examiner, Art Unit 3715
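The reply-period rule in the Conclusion reduces to simple date arithmetic. A minimal sketch, assuming the Feb 23, 2026 mailing date shown in the timeline below (extension-fee mechanics under 37 CFR 1.136(a) are omitted):

```python
from datetime import date

def add_months(d: date, months: int) -> date:
    # Same day-of-month N months later; sufficient for these dates.
    m = d.month - 1 + months
    return date(d.year + m // 12, m % 12 + 1, d.day)

mailed = date(2026, 2, 23)             # final action mailing date (timeline)
ssp_end = add_months(mailed, 3)        # shortened statutory period: 3 months
statutory_cap = add_months(mailed, 6)  # absolute limit: 6 months

# If a first reply is filed within two months and the advisory action issues
# after ssp_end, the period runs to the advisory action's mailing date
# instead, but never beyond statutory_cap.
print(ssp_end, statutory_cap)          # 2026-05-23 2026-08-23
```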

Prosecution Timeline

Feb 17, 2023
Application Filed
Feb 18, 2025
Non-Final Rejection — §102, §103
Apr 22, 2025
Examiner Interview Summary
Apr 22, 2025
Applicant Interview (Telephonic)
May 23, 2025
Response Filed
Jun 16, 2025
Final Rejection — §102, §103
Aug 18, 2025
Request for Continued Examination
Aug 27, 2025
Response after Non-Final Action
Oct 30, 2025
Non-Final Rejection — §102, §103
Feb 03, 2026
Response Filed
Feb 23, 2026
Final Rejection — §102, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12551797 — VIRTUAL OBJECT CONTROL METHOD AND APPARATUS, DEVICE, AND STORAGE MEDIUM
Granted Feb 17, 2026 (2y 5m to grant)

Patent 8657605 — VIRTUAL TESTING AND INSPECTION OF A VIRTUAL WELDMENT
Granted Feb 25, 2014 (2y 5m to grant)

Patent 8398404 — SYSTEM AND METHOD FOR ELEVATED SPEED FIREARMS TRAINING
Granted Mar 19, 2013 (2y 5m to grant)

Patent (number unavailable) — Video display of high contrast graphics for newborns and infants
Granted (date unavailable)

Patent (number unavailable) — Device including a lens array
Granted (date unavailable)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 5-6
Grant Probability: 2%
With Interview: 8% (+5.9%)
Median Time to Grant: 3y 11m
PTA Risk: High
Based on 175 resolved cases by this examiner. Grant probability derived from career allow rate.
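As a cross-check, the headline probabilities follow directly from the career data cited above. A minimal sketch (the additive handling of the interview lift is an assumption about how the page combines the two figures):

```python
granted, resolved = 4, 175               # career data cited on this page
base = granted / resolved                # 0.0229 -> shown as 2%
interview_lift = 0.059                   # +5.9% lift with an interview
with_interview = base + interview_lift   # 0.0819 -> shown as 8%
print(f"base {base:.1%}, with interview {with_interview:.1%}")
```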
