Prosecution Insights
Last updated: April 19, 2026
Application No. 18/186,740

COMPUTER SYSTEM INCLUDING A DISPLAY SCREEN

Non-Final OA: §102, §103, §112
Filed: Mar 20, 2023
Examiner: SINGH, AVIRAJ DONGSOOK
Art Unit: 3645
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: STMicroelectronics
OA Round: 1 (Non-Final)
Grant Probability: Favorable
OA Rounds: 1-2
To Grant: 3y 0m

Examiner Intelligence

Career Allow Rate: 0% (0 granted / 0 resolved; -52.0% vs TC avg)
Interview Lift: +0.0% (minimal lift; based on resolved cases with interview)
Avg Prosecution: 3y 0m (typical timeline)
Total Applications: 7 (7 currently pending, across all art units)

Statute-Specific Performance

§103: 71.4% (+31.4% vs TC avg)
§102: 19.1% (-20.9% vs TC avg)
§112: 9.5% (-30.5% vs TC avg)
Deltas are relative to the Tech Center average estimate. Based on career data from 0 resolved cases.

Office Action

§102 §103 §112
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. Acknowledgment is made of applicant’s claim for foreign priority under 35 U.S.C. 119(a)-(d). The certified copy has been filed in parent Application No. FR2203008, filed on 04/01/2022.

Drawings

The drawings are objected to as failing to comply with 37 CFR 1.84(p)(5) because they do not include the following reference sign(s) mentioned in the description: INT2, MEM. Corrected drawing sheets in compliance with 37 CFR 1.121(d) are required in reply to the Office action to avoid abandonment of the application. Any amended replacement drawing sheet should include all of the figures appearing on the immediate prior version of the sheet, even if only one figure is being amended. Each drawing sheet submitted after the filing date of an application must be labeled in the top margin as either “Replacement Sheet” or “New Sheet” pursuant to 37 CFR 1.121(d). If the changes are not accepted by the examiner, the applicant will be notified and informed of any required corrective action in the next Office action. The objection to the drawings will not be held in abeyance.

The drawings are objected to as failing to comply with 37 CFR 1.84(p)(5) because they include the following reference character(s) not mentioned in the description: 0. Corrected drawing sheets in compliance with 37 CFR 1.121(d), or amendment to the specification to add the reference character(s) in the description in compliance with 37 CFR 1.121(b), are required in reply to the Office action to avoid abandonment of the application. Any amended replacement drawing sheet should include all of the figures appearing on the immediate prior version of the sheet, even if only one figure is being amended.
Each drawing sheet submitted after the filing date of an application must be labeled in the top margin as either “Replacement Sheet” or “New Sheet” pursuant to 37 CFR 1.121(d). If the changes are not accepted by the examiner, the applicant will be notified and informed of any required corrective action in the next Office action. The objection to the drawings will not be held in abeyance.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claims 8 and 18 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention.

Claims 8 and 18 recite the limitation "the individual" in lines 2-3. There is insufficient antecedent basis for this limitation in the claims. Hereafter, the examiner interprets “the individual” as “the user”.

Claim Rejections - 35 USC § 102

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless –

(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.

Claims 1, 2, 5, 7-9, 11, 12, 15, 17, 18 and 20 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Deng and Xu (CN 108647504), hereafter referred to as DENG. The examiner has attached a copy of DENG and the translation used to this action.

Regarding Claim 1, DENG teaches a computer system [Fig. 2: “mobile terminal”] including: a display screen [#21 of Fig. 2: “display”]; a time-of-flight sensor disposed in a vicinity of the screen and configured to acquire distance information of several zones of a scene facing the screen in a field of view of the time-of-flight sensor [#23 of Fig. 2: “depth camera”]; a processor [#20 of Fig. 2: “processor”] configured to: determine a presence of a user of the computer system in the scene [text description of Fig. 2: “proximity sensor for sensing the physical distance between the user and the display … and transmitted to the processor 20, so that the processor 20 indirectly wake up the sleep intelligent mobile terminal”]; detect a presence of at least one individual other than the user from the distance information acquired by the time-of-flight sensor [text description of Step S31 from Fig. 3: “obtaining the face image. obtaining the human face image of the user and the depth information of the user space in a certain range by the depth camera and the related software; It can be understood that, if there is other non-authorized face in the space range, other non-authorized face image should be obtained together;” and text description of Step S32 from Fig. 3: “The face of the non-authorized person is detected.”]; and inform the user when a presence of the at least one individual other than the user is detected [text description of Step S35 from Fig. 3: “for the presence of non-authorized person snoop display condition, the processor performs anti-snoop operation, the specific operation comprises but not limited to: sending the reminding alarm by the audio device, or outputting the reminding mark through the display”].

Regarding Claim 2, DENG also teaches the system according to claim 1, wherein the processor is configured to: determine a presence of elements of the scene in the field of view of the time-of-flight sensor from the distance information acquired by the time-of-flight sensor [text description of Step S31 from Fig. 3: “obtaining the face image. obtaining the human face image of the user and the depth information of the user space in a certain range by the depth camera and the related software; It can be understood that, if there is other non-authorized face in the space range, other non-authorized face image should be obtained together;”]; and determine whether some of these elements correspond to individuals other than the user [text description of Step S32: “In some equivalent embodiments, the processor can also by analyzing the human face number obtained in S31, quickly identifying whether there is non-authorized face, namely when the human face number in the space is more than the authorized face number when there is non-authorized face.”].
Regarding Claim 5, DENG also teaches the system according to claim 2, wherein the processor is configured to determine that an element corresponds to an individual other than the user when the element appeared in the scene at a distance, relative to the time-of-flight sensor, close to a distance from a previously detected individual [Claim 6: “or the eyeball gaze time data is obtained by calculating, analyzing at least two of the non-authorized human eyeball position change amplitude is less than the preset critical value of the continuous human face image”].

Regarding Claim 7, DENG also teaches the system according to claim 2, further comprising a camera, wherein the processor is configured to acquire images of the scene in front of the screen after detecting an element using the time-of-flight sensor [text description of Fig. 1: “depth camera 12 comprises a projection device, an image collecting device or further comprises an RGB camera”].

Regarding Claim 8, DENG also teaches the system according to claim 7, wherein the processor is further configured to check, based on a face recognition algorithm, whether the determined element is actually the individual [“In some embodiments, a human face recognition system based on depth camera, three-dimensional information of matching the target face and the three-dimensional information of the authorized face information, and calculating and analyzing the difference between the two, realizing the recognition function of the face”].

Regarding Claim 9, DENG also teaches the system according to claim 1, wherein the processor is configured to inform the user by displaying an alert message on the screen [text description of Step S35 from Fig. 3: “for the presence of non-authorized person snoop display condition, the processor performs anti-snoop operation, the specific operation comprises but not limited to: sending the reminding alarm by the audio device, or outputting the reminding mark through the display”].
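The anticipation mapping above treats DENG’s depth camera as the claimed time-of-flight sensor: per-zone distance readings are grouped into “elements”, the nearest in-range element is taken as the user, and any further element triggers the notification of claims 1 and 9. A minimal sketch of that flow follows; every name, threshold, and the 0.3 m clustering tolerance is an illustrative assumption, not a value disclosed in DENG or the application.

```python
# Hypothetical sketch: a multizone time-of-flight (ToF) sensor reports one
# distance per zone; the processor clusters zones into "elements", treats
# the nearest in-range element as the user, and flags any further element
# as another individual. All thresholds are assumptions for illustration.

USER_MAX_DISTANCE_M = 1.0  # assumed: the user sits within 1 m of the screen


def detect_elements(zone_distances, max_range_m=3.0):
    """Cluster per-zone ToF distances into scene elements."""
    elements = []
    for distance in sorted(d for d in zone_distances
                           if d is not None and d <= max_range_m):
        for element in elements:
            # Merge zones within 0.3 m of an existing element (assumed tolerance).
            if abs(element["distance"] - distance) < 0.3:
                element["zones"] += 1
                break
        else:
            elements.append({"distance": distance, "zones": 1})
    return elements


def check_scene(zone_distances):
    """Return (user_present, intruder_alert) for one ToF acquisition."""
    elements = detect_elements(zone_distances)
    user_present = any(e["distance"] <= USER_MAX_DISTANCE_M for e in elements)
    # Any in-range element farther than the user is treated as another individual.
    others = [e for e in elements if e["distance"] > USER_MAX_DISTANCE_M]
    return user_present, user_present and len(others) > 0
```

With `check_scene([0.6, 0.6, 2.1, 2.1])`, the two 0.6 m zones cluster into the user and the 2.1 m zones into a second element, so the alert fires; `check_scene([0.6, 0.7])` finds only the user and stays quiet.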
Regarding Claim 11, claim 11 is identical in scope to claim 1 and is rejected for the reasons stated above. Regarding Claim 12, claim 12 is identical in scope to claim 2 and is rejected for the reasons stated above. Regarding Claim 15, claim 15 is identical in scope to claim 5 and is rejected for the reasons stated above. Regarding Claim 17, claim 17 is identical in scope to claim 7 and is rejected for the reasons stated above. Regarding Claim 18, claim 18 is identical in scope to claim 8 and is rejected for the reasons stated above. Regarding Claim 20, claim 20 is identical in scope to claim 9 and is rejected for the reasons stated above.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 3, 4, 6, 13, 14, and 16 are rejected under 35 U.S.C. 103 as being unpatentable over DENG in view of Buck (US 8973149 B2), hereafter referred to as BUCK.
Regarding Claim 3, DENG teaches the system according to claim 2, wherein the processor is configured to determine that an element corresponds to an individual other than the user when: the element is movable between several successive acquisitions of the time-of-flight sensor [Claim 6: “or the eyeball gaze time data is obtained by calculating, analyzing at least two of the non-authorized human eyeball position change amplitude is less than the preset critical value of the continuous human face image”], and the element remained in the scene longer than a user-defined duration [Claim 7: "and the eyeball gaze time of the non-authorized person exceeds the preset critical time"] or the element appeared in the scene at a distance, relative to the time-of-flight sensor, close to a distance from a previously detected individual [Claim 6: “or the eyeball gaze time data is obtained by calculating, analyzing at least two of the non-authorized human eyeball position change amplitude is less than the preset critical value of the continuous human face image”]. DENG does not explicitly teach, but BUCK does teach, determining that an element corresponds to an individual other than the user when: the element is not a closest element to the time-of-flight sensor [61], and the element is located at a distance, relative to the time-of-flight sensor, less than or equal to a maximum distance defined by the user [158], and the element appeared in the scene after the user of the computer system [60]. BUCK also teaches that constantly warning the user about short inadvertent glances can be disruptive [162]. Therefore, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to modify the device of DENG to include the element classification techniques of BUCK to prevent excessive warnings to the user.
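The combined DENG/BUCK criteria recited for claim 3 (not the closest element, within a user-defined maximum distance, appeared after the user, movable between acquisitions, and lingering beyond a user-defined duration) can be sketched as a single predicate. The field names, the default thresholds, and the way the criteria are conjoined here are assumptions for illustration only; the claim itself recites some of these tests in the alternative.

```python
# Hypothetical sketch of the claim-3 classification criteria the Office
# Action pieces together from DENG and BUCK. Field names, defaults, and
# the all-AND combination are illustrative assumptions only.
from dataclasses import dataclass


@dataclass
class Element:
    distance_m: float     # distance from the ToF sensor
    appeared_at_s: float  # time the element first entered the scene
    moved: bool           # changed position between successive acquisitions
    dwell_s: float        # how long the element has stayed in the scene


def is_other_individual(element, closest_distance_m, user_appeared_at_s,
                        max_distance_m=2.0, min_dwell_s=3.0):
    """True when an element is classified as an individual other than the user."""
    not_closest = element.distance_m > closest_distance_m        # cf. BUCK [61]
    within_range = element.distance_m <= max_distance_m          # cf. BUCK [158]
    appeared_later = element.appeared_at_s > user_appeared_at_s  # cf. BUCK [60]
    lingering = element.moved and element.dwell_s >= min_dwell_s  # cf. DENG cls. 6-7
    return not_closest and within_range and appeared_later and lingering
```

For instance, an element at 1.5 m that entered after the user, moved between frames, and stayed five seconds would be classified as another individual, while the same element at 2.5 m would be filtered out by the maximum-distance test.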
Regarding Claim 4, DENG teaches the system according to claim 2, wherein the processor is configured to determine that an element corresponds to an individual other than the user when: the element is movable between several successive acquisitions of the time-of-flight sensor [Claim 6: “or the eyeball gaze time data is obtained by calculating, analyzing at least two of the non-authorized human eyeball position change amplitude is less than the preset critical value of the continuous human face image”], and the element remained in the scene longer than a user-defined duration [Claim 7: "and the eyeball gaze time of the non-authorized person exceeds the preset critical time"]. DENG does not explicitly teach, but BUCK does teach, determining that an element corresponds to an individual other than the user when: the element is not a closest element to the time-of-flight sensor [61], and the element is located at a distance, relative to the time-of-flight sensor, less than or equal to a maximum distance defined by the user [158], and the element appeared in the scene after the user of the computer system [60]. BUCK also teaches that constantly warning the user about short inadvertent glances can be disruptive [162]. Therefore, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to modify the device of DENG to include the element classification techniques of BUCK to prevent excessive warnings to the user.

Regarding Claim 6, DENG teaches the system according to claim 2, wherein the processor is configured to filter the determined elements before determining whether the filtered elements correspond to individuals, the filtered elements corresponding to the elements which are sufficiently separated from the other elements. DENG does not teach filtering the elements which are located at a distance from the time-of-flight sensor which is less than a given distance.
However, in the same field of endeavor, BUCK teaches filtering the elements which are located at a distance from the time-of-flight sensor which is less than a given distance [158]. BUCK also teaches that constantly warning the user about short inadvertent glances can be disruptive [162]. Therefore, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to modify the device of DENG to include the filtering system of BUCK to prevent excessive warnings to the user.

Regarding Claim 13, claim 13 is identical in scope to claim 3 and is rejected for the reasons stated above. Regarding Claim 14, claim 14 is identical in scope to claim 4 and is rejected for the reasons stated above. Regarding Claim 16, claim 16 is identical in scope to claim 6 and is rejected for the reasons stated above.

Claims 10 and 19 are rejected under 35 U.S.C. 103 as being unpatentable over DENG in view of Drader et al., hereafter referred to as DRADER.

Regarding Claim 10, DENG teaches the system according to claim 1. DENG does not explicitly teach, but DRADER does teach, a time-of-flight sensor including an array of avalanche effect diodes, which are triggerable by an individual photon [5-6]. DRADER also teaches that it is known to use an array of SPAD photodiodes for TOF measurement [6]. Therefore, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to use a time-of-flight sensor including an array of avalanche effect diodes for DENG’s invention.

Regarding Claim 19, claim 19 is identical in scope to claim 10 and is rejected for the reasons stated above.

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Sengupta et al. (US 20190266337) teaches a device similar to applicant's that relies solely on distance thresholds. Atkinsons et al. (US 20230281285) teaches a device similar to applicant's that relies on traditional RGB cameras but uses processing techniques similar to applicant's.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to AVIRAJ D SINGH, whose telephone number is (571) 272-9128. The examiner can normally be reached Mon-Fri 7:30am-5pm.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Isam Alsomiri, can be reached at (571) 272-6970. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/A.D.S./
Examiner, Art Unit 3645

/ISAM A ALSOMIRI/
Supervisory Patent Examiner, Art Unit 3645

Prosecution Timeline

Mar 20, 2023
Application Filed
Feb 19, 2026
Non-Final Rejection — §102, §103, §112 (current)


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: Favorable
Median Time to Grant: 3y 0m
PTA Risk: Low
Based on 0 resolved cases by this examiner. Grant probability derived from career allow rate.
