Prosecution Insights
Last updated: April 18, 2026
Application No. 18/787,239

ELECTRONIC APPARATUS AND METHOD FOR CONTROLLING THEREOF

Non-Final OA — §102, §103
Filed: Jul 29, 2024
Examiner: WERNER, DAVID N
Art Unit: 2487
Tech Center: 2400 — Computer Networks
Assignee: Samsung Electronics Co., Ltd.
OA Round: 1 (Non-Final)
Grant Probability: 68% (Favorable)
Expected OA Rounds: 1-2
Median Time to Grant: 3y 3m
Grant Probability with Interview: 84%

Examiner Intelligence

Career Allow Rate: 68% — above average (483 granted / 713 resolved; +9.7% vs Tech Center average)
Interview Lift: +16.2% for resolved cases with an interview
Avg Prosecution: 3y 3m typical timeline (32 applications currently pending)
Total Applications: 745 across all art units (career history)

Statute-Specific Performance

§101: 7.4% (-32.6% vs TC avg)
§103: 44.8% (+4.8% vs TC avg)
§102: 23.1% (-16.9% vs TC avg)
§112: 16.1% (-23.9% vs TC avg)
Tech Center averages are estimates. Based on career data from 713 resolved cases.
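The four deltas can be sanity-checked against the listed rates: subtracting each delta from the examiner's rate recovers the Tech Center baseline it was measured against. A minimal sketch, with the values copied from the table above:

```python
# Examiner's per-statute rate and the reported delta vs the Tech Center average.
rates = {
    "§101": (7.4, -32.6),
    "§103": (44.8, 4.8),
    "§102": (23.1, -16.9),
    "§112": (16.1, -23.9),
}

for statute, (rate, delta) in rates.items():
    # delta = rate - tc_avg, so the implied baseline is rate - delta
    implied_tc_avg = rate - delta
    print(f"{statute}: implied TC average = {implied_tc_avg:.1f}%")
```

All four entries imply the same 40.0% baseline, which suggests the deltas are computed against a single Tech-Center-wide estimate rather than per-statute averages.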

Office Action

§102, §103
DETAILED ACTION

This is the First Action on the Merits for U.S. Patent Application No. 18/787,239, filed 29 July 2024, which is a continuation of International Application No. PCT/KR2024/008437, and claims foreign priority to Korean Application No. KR10-2023-0081456, filed 23 June 2023. Claims 1–15 are pending.

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first-inventor-to-file provisions of the AIA.

Claim Rejections - 35 U.S.C. § 102

In the event the determination of the status of the application as subject to AIA 35 U.S.C. §§ 102 and 103 (or as subject to pre-AIA 35 U.S.C. §§ 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of the appropriate paragraphs of 35 U.S.C. § 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

Claims 9 and 14 are rejected under 35 U.S.C. § 102(a)(1) as being anticipated by U.S. Patent Application Publication No. 2020/0019277 A1 (“Teraoka”)1.
Teraoka, directed to a projector, teaches with respect to claim 9 a method for controlling an electronic apparatus, the method comprising: controlling a projector to output an original image as a picture on a projection surface (¶ 0046, modulating image projection illumination light to generate and project image light); acquiring a first image by setting at least one of a gain value and an exposure value of a camera to a first predetermined value (¶ 0055, exposure control) while the original image is output as the picture on the projection surface (¶ 0059, imaging scattered light and infrared from finger while the image light is being projected) and capturing the first image of the picture on the projection surface with the camera having the at least one of the gain value and the exposure value set to the first predetermined value (id.); acquiring a second image by setting at least one of the gain value and the exposure value of the camera to a second predetermined value while the original image is output as the picture on the projection surface (¶¶ 0070–71, Fig. 8, different exposure timings for different light quantities to vary brightness of captured image) and capturing the second image of the picture on the projection surface with the camera having the at least one of the gain value and the exposure value set to the second predetermined value (¶¶ 0083–84, modulating light continuously to maintain optimal light quantity during exposure period); and identifying a location of a touch of a user on the picture on the projection surface based on the first image and the second image (¶ 0059, identifying finger position), wherein the camera is configured to sense a greater amount of light when the at least one of the gain value and the exposure value of the camera is set to the first predetermined value than when the at least one of the gain value and the exposure value of the camera is set to the second predetermined value (¶ 0071, exposure period and image brightness are proportional).

Regarding claim 14, Teraoka teaches the method as claimed in claim 9, further comprising: identifying at least one coordinate corresponding to the location of the touch among a plurality of predetermined coordinates included in the picture on the projection surface based on the second image (Teraoka ¶¶ 0049, 0059, coordinate data of detected object position); identifying a location corresponding to the at least one coordinate identified in the first image (¶¶ 0083–84, modulating light continuously to maintain optimal light quantity during exposure period); and identifying the location of the touch on the picture on the projection surface based on the identified location corresponding to the at least one coordinate (¶¶ 0049, 0059, again finding coordinate data further in the modulated process).

Claim Rejections - 35 U.S.C. § 103

The following is a quotation of 35 U.S.C.
§ 103, which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1–3, 5–8, 10, 11, 13, and 15 are rejected under 35 U.S.C. § 103 as being unpatentable over Teraoka in view of CN 109558033 A (“Dai”)2.

Teraoka, directed to a projector, teaches with respect to claim 1 an electronic apparatus comprising: a projector (¶ 0042, Fig. 3, image projecting illumination section 1); a camera (id., imaging section 5); . . . control the projector to output an original image as a picture of the original image on a projection surface (¶ 0046, modulating image projection illumination light to generate and project image light); acquire a first image by controlling the camera to set at least one of a gain value and an exposure value of the camera to a first predetermined value (¶ 0055, exposure control) while the original image is output as the picture on the projection surface (¶ 0059, imaging scattered light and infrared from finger while the image light is being projected) and capture the first image of the picture on the projection surface with the at least one of the gain value and the exposure value set to the first predetermined value (id.); acquire a second image by controlling the camera to set at least one of the gain value and the exposure value of the camera to a second predetermined value while the original image is output as the picture on the projection surface (¶¶ 0070–71, Fig. 8, different exposure timings for different light quantities to vary brightness of captured image) and capture the second image of the picture on the projection surface with the at least one of the gain value and the exposure value set to the second predetermined value (¶¶ 0083–84, modulating light continuously to maintain optimal light quantity during exposure period); and identify a location of a touch of a user on the picture on the projection surface based on the first image and the second image (¶ 0059, identifying finger position), and wherein the camera is configured to sense a greater amount of light when the at least one of the gain value and the exposure value of the camera is set to the first predetermined value than when the at least one of the gain value and the exposure value of the camera is set to the second predetermined value (¶ 0071, exposure period and image brightness are proportional).

The claimed invention differs from Teraoka in that the claimed invention specifies that the system is controlled by a computer. Teraoka does not disclose these limitations. However, Dai, directed to an interactive projector, teaches with respect to claim 1 a memory storing at least one instruction (¶ 0039, memory 12 in main control module 1); and one or more processors operatively connected to the projector, the camera, and the memory (id., controller 11), wherein the one or more processors are configured to execute the at least one instruction (¶¶ 0051–52, software). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to implement various components of Teraoka, such as detection image processing section 6 or illumination control section 7 (¶ 0042), as software, due to well-known and predictable advantages such as the use of off-the-shelf components. M.P.E.P. § 2143(I)(D) (obvious to apply a known technique to a known device ready for improvement to yield predictable results).
Regarding claim 2, Teraoka in view of Dai teaches the electronic apparatus as claimed in claim 1, wherein the camera comprises a filter configured to transmit visible light and infrared light therethrough (Dai ¶ 0050, filtering device filters visible light and retains infrared light, yet the image capturing module allows for capturing both visible light and infrared light), and wherein an amount of the infrared light transmitted through the filter is greater than an amount of the visible light transmitted through the filter by at least a predetermined amount (id.).

Regarding claim 3, Teraoka in view of Dai teaches the electronic apparatus as claimed in claim 2, wherein the one or more processors are further configured to: acquire the first image by controlling the camera to sense light including the visible light and the infrared light transmitted through the filter of the camera in a state in which the exposure value of the camera is set to a first exposure value (Teraoka ¶¶ 0070–71, Fig. 7; exposure period with high light value); and acquire the second image by controlling the camera to sense light including the visible light and the infrared light transmitted through the filter of the camera in a state in which the exposure value of the camera is set to a second exposure value (¶¶ 0070–71, Fig. 8; exposure period with low light value), and wherein an amount of the visible light sensed through the camera in acquiring the second image is less than a threshold value (¶¶ 0066–67, average light quantity; some frames are expected to have light quantity below this value but above a global minimum quantity).
Regarding claim 5, Teraoka in view of Dai teaches the electronic apparatus as claimed in claim 2, wherein the one or more processors are further configured to control the camera to set the at least one of the gain value and the exposure value of the camera so that a predetermined number of points of reflected light of the infrared light are identified on the acquired second image (Teraoka at, e.g., ¶ 0059, expected scattered light from finger).

Regarding claim 6, Teraoka in view of Dai teaches the electronic apparatus as claimed in claim 1, wherein the one or more processors are further configured to: identify at least one coordinate corresponding to the location of the touch among a plurality of predetermined coordinates included in the picture on the projection surface based on the second image (Teraoka ¶¶ 0049, 0059, coordinate data of detected object position); identify a location corresponding to the at least one coordinate identified in the first image (¶¶ 0083–84, modulating light continuously to maintain optimal light quantity during exposure period); and identify the location of the touch on the picture on the projection surface based on the identified location corresponding to the at least one coordinate (¶¶ 0049, 0059, again finding coordinate data further in the modulated process).

Regarding claim 7, Teraoka in view of Dai teaches the electronic apparatus as claimed in claim 1, wherein the one or more processors are further configured to: identify a correction value for correcting the picture on the projection surface based on the first image and the original image (Dai ¶ 0045, obtaining projection error for correction); acquire a corrected first image and a corrected second image based on the correction value (¶ 0047, correcting subsequent images); and identify the location of the touch on the picture on the projection surface based on the corrected first image and the corrected second image (Teraoka ¶ 0059, detecting finger position).
Regarding claim 8, Teraoka in view of Dai teaches the electronic apparatus as claimed in claim 1, wherein the one or more processors are further configured to: acquire spatial information corresponding to the projection surface based on the first image (Dai ¶ 0045, error based on the difference between the structure actually captured by the image capture module and the expected result from the projection module); and correct the location of the touch on the picture on the projection surface based on a location of the touch identified in the second image and the spatial information corresponding to the projection surface (¶ 0047, correction).

Regarding claim 10, Teraoka in view of Dai teaches the method as claimed in claim 9, wherein the camera includes a filter configured to transmit visible light and infrared light therethrough (Dai ¶ 0050, filtering device filters visible light and retains infrared light, yet the image capturing module allows for capturing visible light and infrared light), and wherein an amount of the infrared light transmitted through the filter is greater than an amount of the visible light transmitted through the filter by at least a predetermined amount (id.).

Regarding claim 11, Teraoka in view of Dai teaches the method as claimed in claim 10, wherein the acquiring the first image comprises sensing light including the visible light and the infrared light transmitted through the filter of the camera in a state in which the exposure value of the camera is set to a first exposure value (Teraoka ¶¶ 0070–71, Fig. 7; exposure period with high light value), wherein the acquiring the second image comprises sensing light including the visible light and the infrared light transmitted through the filter of the camera in a state in which the exposure value of the camera is set to a second exposure value (¶¶ 0070–71, Fig. 8; exposure period with low light value), and wherein an amount of the visible light sensed through the camera in the acquiring the second image is less than a threshold value (¶¶ 0066–67, average light quantity; some frames are expected to have light quantity below this value but above a global minimum quantity).

Regarding claim 13, Teraoka in view of Dai teaches the method as claimed in claim 10, further comprising setting at least one of the gain value and the exposure value of the camera so that a predetermined number of points of reflected light of the infrared light are identified on the acquired second image (Teraoka at, e.g., ¶ 0059, expected scattered light from finger).

Regarding claim 15, Teraoka in view of Dai teaches the method as claimed in claim 9, further comprising: identifying a correction value for correcting the picture on the projection surface based on the first image and the original image (Dai ¶ 0045, obtaining projection error for correction); acquiring a corrected first image and a corrected second image based on the correction value (¶ 0047, correcting subsequent images); and identifying the location of the touch on the picture on the projection surface based on the corrected first image and the corrected second image (Teraoka ¶ 0059, detecting finger position).

Claim Rejections - 35 U.S.C. § 103

Claims 4 and 12 are rejected under 35 U.S.C. § 103 as being unpatentable over Teraoka in view of Dai and further in view of U.S. Patent Application Publication No. 2021/0314501 A1 (“Chen”)3. Claims 4 and 12 specify adjusting a camera gain value, not an exposure value as in Teraoka.
However, Chen, directed to a visible and infrared camera, teaches with respect to claim 4 the electronic apparatus as claimed in claim 2, wherein the one or more processors are further configured to: acquire the first image by controlling the camera to sense light including the visible light and the infrared light transmitted through the filter of the camera (Dai ¶ 0050, filtering device) in a state in which the gain value of the camera is set to a first gain value (Chen ¶ 0011, first sensor gain); and acquire the second image by controlling the camera to sense light including the visible light and the infrared light transmitted through the filter of the camera (Dai ¶ 0050, filtering device) in a state in which the gain value of the camera is set to a second gain value (Chen ¶ 0011, second sensor gain), and wherein an amount of the visible light sensed through the camera in acquiring the second image is less than a threshold value (e.g., ¶ 0079, controlling sensor gain and shutter value to retain the difference between visible light exposure and target within a threshold; the claimed threshold value is the sum of the Chen target exposure value plus the threshold). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the Teraoka imaging device to modulate gain in addition to or in place of exposure time, as taught by Chen, in order to obtain greater control and consistency of exposure (Chen ¶¶ 0005, 0047).
Regarding claim 12, Teraoka in view of Dai and Chen teaches the method as claimed in claim 10, wherein the acquiring the first image comprises sensing light including the visible light and the infrared light transmitted through the filter of the camera (Dai ¶ 0050, filtering device) in a state in which the gain value of the camera is set to a first gain value (Chen ¶ 0011, first sensor gain), wherein the acquiring the second image comprises sensing light including the visible light and the infrared light transmitted through the filter of the camera (Dai ¶ 0050, filtering device) in a state in which the gain value of the camera is set to a second gain value (Chen ¶ 0011, second sensor gain), and wherein an amount of the visible light sensed through the camera in the acquiring the second image is less than a threshold value (e.g., ¶ 0079, controlling sensor gain and shutter value to retain the difference between visible light exposure and target within a threshold; the claimed threshold value is the sum of the Chen target exposure value plus the threshold).

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. The following prior art was found using an internal Artificial Intelligence (AI) assisted search tool that uses the classification of the application under the Cooperative Patent Classification (CPC) system, as well as the specification, including the claims and abstract, of the application as contextual information. The documents are ranked from most to least relevant. Where possible, English-language equivalents are given, and redundant results within the same patent families are eliminated. See “New Artificial Intelligence Functionality in PE2E Search”, 1504 OG 359 (15 November 2022); “Automated Search Pilot Program”, 90 F.R. 48,161 (8 October 2025).
US 2021/0153310 A1
CN 102508546 A
RU 2798966 C1
US 2012/0242581 A1
US 2022/0132081 A1
US 2011/0216091 A1
US 2019/0324591 A1

Any inquiry concerning this communication or earlier communications from the examiner should be directed to David N Werner, whose telephone number is (571) 272-9662. The examiner can normally be reached M-F 7:30-4:00 Central. Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Dave Czekaj, can be reached at 571-272-7327. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/David N Werner/
Primary Examiner, Art Unit 2487

1 This reference was cited as an ‘A’ reference in the International Search Report for corresponding International Application No. PCT/KR2024/008437, and was listed in the 30 December 2024 Information Disclosure Statement.
2 This reference was cited as an ‘X’ reference in the International Search Report for corresponding International Application No. PCT/KR2024/008437, and was listed in the 30 December 2024 Information Disclosure Statement. A machine translation from the European Patent Office of the specification is provided.
3 This reference was listed in the 29 July 2024 Information Disclosure Statement.

Prosecution Timeline

Jul 29, 2024: Application Filed
Apr 04, 2026: Non-Final Rejection — §102, §103 (current)

Precedent Cases

Applications granted by this same examiner involving similar technology

Patent 12598312: OVERHEAD REDUCTION IN MEDIA STORAGE AND TRANSMISSION (granted Apr 07, 2026; 2y 5m to grant)
Patent 12598297: METHOD AND APPARATUS FOR RECONSTRUCTING 360-DEGREE IMAGE ACCORDING TO PROJECTION FORMAT (granted Apr 07, 2026; 2y 5m to grant)
Patent 12593144: SOLID STATE IMAGING ELEMENT, IMAGING DEVICE, AND SOLID STATE IMAGING ELEMENT CONTROL METHOD (granted Mar 31, 2026; 2y 5m to grant)
Patent 12587754: METHOD FOR DYNAMIC CORRECTION FOR PIXELS OF THERMAL IMAGE ARRAY (granted Mar 24, 2026; 2y 5m to grant)
Patent 12587689: METHOD AND APPARATUS FOR RECONSTRUCTING 360-DEGREE IMAGE ACCORDING TO PROJECTION FORMAT (granted Mar 24, 2026; 2y 5m to grant)
Study what changed to get past this examiner, based on the 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 68%
With Interview: 84% (+16.2%)
Median Time to Grant: 3y 3m
PTA Risk: Low
Based on 713 resolved cases by this examiner. Grant probability derived from career allow rate.
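The projection figures are simple arithmetic on the examiner's career counts. A minimal sketch of the derivation, assuming the grant probability is the raw career allow rate (grants divided by resolved cases) and the interview figure adds the reported lift:

```python
# Career counts reported above for this examiner.
granted, resolved = 483, 713

allow_rate = granted / resolved   # grant probability ~ career allow rate
interview_lift = 0.162            # reported lift for cases with an interview

print(f"Grant probability: {allow_rate:.0%}")                   # 68%
print(f"With interview:    {allow_rate + interview_lift:.0%}")  # 84%
```

The exact allow rate is 67.7%, which the dashboard rounds to 68%; adding the 16.2-point lift gives 83.9%, shown as 84%.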
