Prosecution Insights
Last updated: April 19, 2026
Application No. 18/171,375

ELECTRONIC DEVICE AND DISPLAY METHOD FOR VIDEO CONFERENCE OR VIDEO TEACHING

Final Rejection — §102, §103

Filed: Feb 20, 2023
Examiner: ROWLAND, STEVE
Art Unit: 3715
Tech Center: 3700 — Mechanical Engineering & Manufacturing
Assignee: Pegatron Corporation
OA Round: 2 (Final)
Grant Probability: 78% (Favorable)
Expected OA Rounds: 3-4
Time to Grant: 2y 8m
With Interview: 95%

Examiner Intelligence

Career Allow Rate: 78%, above average (823 granted / 1059 resolved; +7.7% vs TC avg)
Interview Lift: +17.6% among resolved cases with interview
Avg Prosecution: 2y 8m (typical timeline)
Total Applications: 1083 across all art units (24 currently pending)
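The headline figures above follow from simple arithmetic on the raw counts. A minimal sketch, assuming an additive interview-lift model (the function names and the model itself are assumptions, not the tool's documented methodology):

```python
# Sketch reproducing the dashboard's headline metrics from the raw counts
# shown above. The additive interview-lift model is an assumption, not the
# analytics tool's actual implementation.

def allow_rate(granted: int, resolved: int) -> float:
    """Career allow rate as a percentage of resolved cases."""
    return 100.0 * granted / resolved

def with_interview(base_rate: float, lift: float) -> float:
    """Projected grant probability when an interview is held, capped at 100%."""
    return min(base_rate + lift, 100.0)

base = allow_rate(823, 1059)          # ~77.7, displayed as 78%
boosted = with_interview(base, 17.6)  # ~95.3, displayed as 95%
print(round(base), round(boosted))
```

Rounded, these reproduce the 78% career rate and the 95% with-interview projection shown in the report.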

Statute-Specific Performance

§101: 17.2% (-22.8% vs TC avg)
§103: 32.0% (-8.0% vs TC avg)
§102: 28.7% (-11.3% vs TC avg)
§112: 13.5% (-26.5% vs TC avg)

Deltas are relative to the Tech Center average estimate. Based on career data from 1059 resolved cases.
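A quick consistency check on those deltas: subtracting each delta from the examiner's rate recovers the implied Tech Center average, and every statute here works out to the same 40.0% baseline. A sketch (figures copied from the report; the variable names are mine):

```python
# Consistency check on the statute-specific performance deltas: the Tech
# Center average implied by (examiner rate - delta) should agree across
# statutes. Values are copied from the report above; names are mine.
examiner_rate = {"101": 17.2, "103": 32.0, "102": 28.7, "112": 13.5}
delta_vs_tc   = {"101": -22.8, "103": -8.0, "102": -11.3, "112": -26.5}

implied_tc_avg = {s: round(examiner_rate[s] - delta_vs_tc[s], 1)
                  for s in examiner_rate}
print(implied_tc_avg)  # every statute implies the same 40.0% TC average
```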

Office Action — §102, §103

Detailed Action

Response to Amendment

This action is responsive to Applicant's communication filed on 11/17/2025.

Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

(a) A person shall be entitled to a patent unless— (1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention

Claims 1-3, 5, 9 and 10 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Arputharaj et al. (US 2015/0244892 A1).

Regarding claim 1, Arputharaj discloses an electronic device for video conference or video teaching (Fig. 1) comprising a first image capturing unit configured to capture a first document image of a physical document (402-404), a display, an input unit, configured to generate annotation information on the display (712), and a processor, coupled to the first image capturing unit (712), the display, and the input unit, configured to control the display to display the first document image (Fig. 7E), wherein the processor controls the display to simultaneously display the first document image and the annotation information when the processor receives the annotation information from the input unit during a period of time that the first image capturing unit is capturing the first document image and the display is simultaneously displaying the first document image (Fig. 7E).
Regarding claim 2, Arputharaj discloses wherein the first image capturing unit is further configured to capture a second document image of the physical document (504) and the processor further obtains a feature value of the first document image (506), identifies the second document image according to the feature value, and controls the display to simultaneously display the second document image and the annotation information (510).

Regarding claim 3, Arputharaj discloses wherein the processor further obtains a relative position between the first document image and the annotation information, and when the annotation information is displayed on the second document image, the annotation information is positioned on the second document image based on the relative position (Fig. 7E).

Regarding claim 5, Arputharaj discloses wherein when the second document image does not match the feature value, the processor controls the display not to display the annotation information (inherent in steps 508-510: if no matching annotations are found, none are displayed).

Regarding claim 9, Arputharaj discloses a display method adapted for an electronic device in video conference or video teaching (Fig. 1), the electronic device comprising a first image capturing unit, a display, and an input unit (712), the display method comprising capturing, by the first image capturing unit, a first document image of a physical document (402-404), displaying, by the display, the first document image, generating, by the input unit, annotation information on the display (Fig. 7E), and controlling the display to simultaneously display the annotation information and the first document image when the annotation information from the input unit is received during a period of time that the first image capturing unit is capturing the first document image and the display is simultaneously displaying the first document image (Fig. 7E).
Regarding claim 10, Arputharaj discloses obtaining a feature value of the first document image (204-206), capturing, by the first image capturing unit, a second document image of the physical document (502), identifying the second document image according to the feature value (504), and controlling the display to simultaneously display the annotation information and the second document image (510).

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries set forth in Graham v. John Deere Co., 383 U.S. 1, 148 USPQ 459 (1966), that are applied for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:

1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary.
If this application names joint inventors, Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.

Claims 4 and 11-13 are rejected under 35 U.S.C. 103 as being unpatentable over Arputharaj in view of Booman et al. (US 2017/0060908 A1).

Regarding claim 4, Arputharaj discloses wherein the processor further checks whether the annotation information is stored in a database (¶ [0023]). Booman suggests—where Arputharaj does not disclose—that when the database does not have the annotation information, the processor registers the annotation information and stores the annotation information in the database (¶ [0023]). It would have been obvious to a person of ordinary skill in the art prior to the effective filing date of the invention to combine the disclosures of Arputharaj and Booman in order to implement data storage and retrieval in an industry-standard way, thus promoting code security and stability.

Regarding claim 11, Arputharaj discloses checking whether the annotation information is stored in a database (¶ [0023]). Booman suggests—where Arputharaj does not disclose—registering the annotation information and storing the annotation information in the database when the database does not have the annotation information (¶ [0023]). It would have been obvious to a person of ordinary skill in the art prior to the effective filing date of the invention to combine the disclosures of Arputharaj and Booman in order to implement data storage and retrieval in an industry-standard way, thus promoting code security and stability.
Regarding claim 12, Arputharaj discloses obtaining a relative position between the first document image and the annotation information (306) and positioning the annotation information on the second document image based on the relative position when the annotation information is displayed on the second document image (410).

Regarding claim 13, Arputharaj discloses controlling the display not to display the annotation information when the second document image does not match the feature value (inherent in steps 508-510: if no matching annotations are found, none are displayed).

Claims 6, 7, 14 and 15 are rejected under 35 U.S.C. 103 as being unpatentable over Arputharaj in view of How to Change your Background on Zoom (as evidenced by ZoomBackground.pdf and Youtube.com) ("ZB").

Regarding claims 6 and 14, ZB suggests—where Arputharaj does not disclose—a second image capturing unit, coupled to the processor, and configured to capture a user image of a user, wherein the processor controls the display to display the user image (p. 1). It would have been obvious to a person of ordinary skill in the art prior to the effective filing date of the invention to combine the disclosures of Arputharaj and ZB in order to make the system more personalized and collaborative.

Regarding claims 7 and 15, ZB suggests—where Arputharaj does not disclose—wherein the processor removes a background part of the user image (pp. 1-2). It would have been obvious to a person of ordinary skill in the art prior to the effective filing date of the invention to combine the disclosures of Arputharaj and ZB in order to filter out potentially distracting background activity.

Allowable Subject Matter

Claims 8 and 16 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.
Response to Arguments

Applicant's arguments filed on 11/17/2025 have been fully considered but they are not persuasive. Applicant argues:

However, in Arputharaj, the "annotation information" is generated by the remote server from the hand-written marks already presented on the physical document and captured by the first device 106, neither generated by the input unit of the first device nor displayed on the display unit of the first device. At best, the "input unit" in Arputharaj would be external, namely, the physical pen and paper used to write on the document being photographed/scanned. Such an arrangement falls outside the scope of the claimed feature requiring the annotation information generated on the display. Furthermore, paragraph [0035] and Fig. 7E of Arputharaj confirm that annotation data and position information are retrieved from a remote server.

Examiner respectfully disagrees. Although a client/server topology is used in one embodiment of Arputharaj, the reference notes that the invention may be realized as a single device (see, e.g., claims 27-30). Therefore, Examiner respectfully submits that Arputharaj discloses this feature of claim 1.

Conclusion

THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any extension fee pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to STEVE ROWLAND whose telephone number is (469) 295-9129. The examiner can normally be reached M-Th 10-8. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Dmitry Suhol, can be reached at (571) 272-4430. The fax number for the organization where this application or proceeding is assigned is (571) 273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/STEVE ROWLAND/
Primary Examiner, Art Unit 3715

Prosecution Timeline

Feb 20, 2023
Application Filed
Sep 10, 2025
Non-Final Rejection — §102, §103
Nov 17, 2025
Response Filed
Feb 06, 2026
Final Rejection — §102, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12589308 — GENERATIVE NARRATIVE GAME EXPERIENCE WITH PLAYER FEEDBACK
Granted Mar 31, 2026 (2y 5m to grant)

Patent 12586441 — SELECTIVE REDEMPTION OF GAMING ESTABLISHMENT TICKET VOUCHERS
Granted Mar 24, 2026 (2y 5m to grant)

Patent 12582874 — APPARATUS FOR ARTIFICIAL INTELLIGENCE EXERCISE RECOMMENDATION BY ANALYZING DATA COLLECTED BY POSTURE MEASUREMENT SENSOR AND DRIVING METHOD THEREOF
Granted Mar 24, 2026 (2y 5m to grant)

Patent 12579757 — UPDATING A VIRTUAL REALITY ENVIRONMENT
Granted Mar 17, 2026 (2y 5m to grant)

Patent 12569763 — VIRTUAL OBJECT CONTROL METHOD AND RELATED APPARATUS
Granted Mar 10, 2026 (2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 78%
With Interview: 95% (+17.6%)
Median Time to Grant: 2y 8m
PTA Risk: Moderate

Based on 1059 resolved cases by this examiner. Grant probability derived from career allow rate.
