Prosecution Insights
Last updated: April 19, 2026
Application No. 18/578,305

Display Control System for Drawing

Status: Final Rejection (§103)
Filed: Jan 10, 2024
Examiner: MERCEDES, DISMERY E
Art Unit: 2627
Tech Center: 2600 (Communications)
Assignee: Teamlab Inc.
OA Round: 5 (Final)
Grant Probability: 77% (Favorable)
Expected OA Rounds: 6-7
Median Time to Grant: 2y 9m
Grant Probability With Interview: 87%

Examiner Intelligence

Career Allow Rate: 77% (740 granted / 964 resolved), +14.8% vs TC average; grants above average
Interview Lift: +10.6% (moderate); allow rate among resolved cases with an interview versus without
Typical Timeline: 2y 9m average prosecution; 29 applications currently pending
Career History: 993 total applications across all art units
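These headline figures follow from the raw counts by simple arithmetic. The sketch below shows the presumed formulas; the page does not state its methodology, so treating grant probability as the career allow rate and the interview figure as that rate plus the lift in percentage points is an assumption.

```python
# Assumed formulas; the dashboard's exact methodology is not stated.
granted, resolved, pending = 740, 964, 29

allow_rate = granted / resolved               # 0.7676 -> shown as 77%
total_applications = resolved + pending       # 964 + 29 = 993
interview_lift = 0.106                        # +10.6% among interviewed cases
with_interview = allow_rate + interview_lift  # 0.8736 -> shown as 87%

print(f"{allow_rate:.0%}, {total_applications}, {with_interview:.0%}")
# -> 77%, 993, 87%
```

The 77% and 87% tiles elsewhere on the page are consistent with these two formulas.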

Statute-Specific Performance

§101: 2.8% (-37.2% vs TC avg)
§103: 52.1% (+12.1% vs TC avg)
§102: 21.5% (-18.5% vs TC avg)
§112: 17.6% (-22.4% vs TC avg)
Tech Center averages are estimates. Based on career data from 964 resolved cases.
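A quick arithmetic check on the deltas, assuming each delta is the examiner's rate minus the Tech Center average, recovers the baselines:

```python
# Implied TC baselines, assuming delta = examiner_rate - tc_avg (percent).
stats = {"101": (2.8, -37.2), "103": (52.1, 12.1),
         "102": (21.5, -18.5), "112": (17.6, -22.4)}
for statute, (rate, delta) in stats.items():
    print(f"§{statute}: implied TC avg = {rate - delta:.1f}%")
# All four print 40.0%.
```

If that assumption holds, every statute is compared against the same TC-wide baseline of about 40% rather than per-statute baselines.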

Office Action

Final Rejection under §103, mailed Mar 24, 2026 (current)
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Rejections - 35 U.S.C. § 103

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1 and 4-5 are rejected under 35 U.S.C. 103 as being unpatentable over Matsumoto et al. (US 2022/0083127) in view of Bamji et al. (US 2011/0291988), further in view of Ono et al. (US 2016/0041632).

As to Claim 1, Matsumoto et al. discloses a display control system comprising: a display device that displays an image on a display surface three-dimensionally formed with respect to a floor surface (fig. 1-2, 4, 9, 10; para. 0024: the projector or LED module projects (displays) an image generated based on the video signal supplied from the control device 20 onto the wall surface 12 with respect to floor surface 11); a two-dimensional scanning type optical distance measuring sensor that detects a contact position of an object on the display surface of the display device, wherein the optical distance measuring sensor emits inspection light along the display surface; an imaging device disposed to acquire an image within a range that includes the display surface of the display device and the floor surface on a side of a front face of the display surface (fig. 2; para. 0023: camera 15 captures entire space 10, or at least the floor surface 11 and wall surface 12), wherein a length of the floor surface included in the range of imaging by the imaging device is 0.5 m or longer (fig. 2, 10; para. 0023: camera 15 captures entire space 10; para. 0029: control device 20 uses positions of users obtained from camera 15 to determine that a user has come within a predetermined distance from the wall surface; para. 0041: an arbitrary distance in a range of about several tens of centimeters to about 1 meter can be set); and a controller that analyzes the image acquired by the imaging device to discriminate a type of the object captured in that image, determines a position at which the object of the discriminated type has come into contact with the display surface of the display device by integrating (i) information relating to the contact position of the object detected by the optical distance measuring sensor and (ii) analysis information relating to the type of the object derived from the image acquired by the imaging device (fig. 9; para. 0033-0035, 0041-0045, 0061-0063, 0067-0071: the control device 20 analyzes the image captured by camera 15, recognizes the position of the user with respect to the floor, determines whether the detected position of the user is near the wall surface, determines whether the user has touched the wall surface based on information received from the camera 15, and processes an application program according to the touch operation), and controls the display device based on information relating to the type of the object and the information of the determination to change a type or a color of an image displayed at the contact position of the object in accordance with the type of the object (para. 0069-0071: the control device 20 determines whether the user has touched the wall surface based on information received from the camera 15 and processes an application program according to the touch operation).

Matsumoto et al. further discloses that a motion sensor 27 may be used to detect a movement and position of the user: the motion sensor 27 emits light of a predetermined wavelength, such as infrared rays or laser light, and receives the reflected light, so that a movement of the object, the position at which the user 50 is located, how closely the user approaches the wall surface when the user 50 is not in contact with it, and the place on the wall surface the user 50 is touching when in contact can all be acquired. The output of the motion sensor is provided to the control device 20 to recognize the position of the user (fig. 1-2, 4, 9; para. 0022, 0034-0035).

Matsumoto et al. does not expressly disclose a two-dimensional scanning type optical distance measuring sensor that detects a contact position of an object on the display surface of the display device, wherein the optical distance measuring sensor emits inspection light along the display surface; a controller that analyzes the image acquired by the imaging device to discriminate a type of the object captured in the image; determining a position at which the object of the discriminated type has come into contact with the display surface by integrating (i) information relating to the contact position of the object detected by the optical distance measuring sensor and (ii) analysis information relating to the type of the object derived from the image acquired by the imaging device; or controlling the display device, based on information relating to the type of the object and the information of the determination, to change a type or a color of an image displayed at the contact position of the object in accordance with the type of the object.

Bamji et al. discloses a two-dimensional scanning type optical distance measuring sensor that detects a contact position of an object on the display surface of the display device (fig. 7A, 8C, 8E; para. 0033, 0057, 0059, 0062-0063, 0068-0069, 0071, 0076), wherein the optical distance measuring sensor emits inspection light along the display surface (para. 0057, 0061-0062, 0068-0069).

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the device of Matsumoto et al. with the teachings of Bamji et al., such that the control device (of Matsumoto) also acquires data from the two-dimensional camera and TOF systems (as disclosed by Bamji), the motivation being to provide more robust detection of display-surface touching and to allow the system to more rapidly and reliably discern and identify user-object interaction with the display surface.
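The combination proposed above (Matsumoto's camera plus Bamji's two-dimensional time-of-flight scanning) amounts to fusing two independent observations of one touch event. Below is a minimal sketch of that integration step; every name, structure, and the association threshold is invented for illustration and is not drawn from any cited reference.

```python
from dataclasses import dataclass

@dataclass
class TouchEvent:
    x: float          # contact position on the display surface (meters)
    y: float
    object_type: str  # e.g. "finger" or "pen", from image analysis

def fuse(lidar_contacts, camera_detections, max_dist=0.05):
    """Integrate (i) contact positions from the 2D scanning optical
    distance sensor with (ii) object-type labels from the camera image,
    by nearest-neighbor association within max_dist meters."""
    events = []
    for cx, cy in lidar_contacts:
        best = min(camera_detections,
                   key=lambda d: (d["x"] - cx) ** 2 + (d["y"] - cy) ** 2,
                   default=None)
        if best is not None and \
           (best["x"] - cx) ** 2 + (best["y"] - cy) ** 2 <= max_dist ** 2:
            events.append(TouchEvent(cx, cy, best["type"]))
    return events

# One sensor contact associated with a camera detection classified as a pen:
print(fuse([(1.20, 0.80)], [{"x": 1.22, "y": 0.79, "type": "pen"}]))
# -> [TouchEvent(x=1.2, y=0.8, object_type='pen')]
```

A real system would first calibrate the sensor and camera into a shared coordinate frame; nearest-neighbor matching within a small radius is simply the most compact way to show the claimed integration of the two information sources.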
Matsumoto et al. in view of Bamji et al. does not expressly disclose a controller that analyzes the image acquired by the imaging device to discriminate a type of the object captured in the image; determines a position at which the object of the discriminated type has come into contact with the display surface by integrating… analysis information relating to the type of the object derived from the image acquired by the imaging device; or controls the display device based on information relating to the type of the object and the information of the determination to change a type or a color of an image displayed at the contact position of the object in accordance with the type of the object.

Ono et al. discloses a controller that analyzes the image acquired by the imaging device to discriminate a type of the object captured in the image (para. 0077, 0084-0087, 0096, 0110, 0114-0115, 0118: determines whether the pointing tool is a finger or a light pen from the obtained image); determines a position at which the object of the discriminated type has come into contact with the display surface by integrating analysis information relating to the type of the object derived from the image (para. 0077, 0110, 0114-0115, 0118, 0120: type determination unit 713 determines whether the pointing tool is a finger or a light pen and calculates a feature of the pointing tool, which may be the coordinates of the pointing tool); and controls the display device based on information relating to the type of the object and the determination to change a type or a color of an image displayed at the contact position in accordance with the type of the object (para. 0056, 0075, 0080-0081, 0089, 0098, 0103, 0122-0123: when the pointing tool is determined to be a light pen, i.e., an image of the light pen is detected in the captured image, the coordinates of the pen are sent to the drawing unit, which starts drawing from the coordinate position of the light pen; when it is determined to be a finger, i.e., an image area irradiated by the lighting unit 15 is present and no image caused by the light-emitting pen is detected in the captured image, the operation mode is switched to a finger detection mode, in which handwriting may be performed).

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the device disclosed by Matsumoto et al. in view of Bamji et al. with the teachings of Ono et al., the motivation being to provide different modes of operation according to the type of pointing tool used by the user.

As to Claim 4, Matsumoto et al. in view of Bamji et al., as modified by Ono et al., disclose wherein the object includes a light emission unit that emits light in a pattern or a color that differs depending on the type (Ono: para. 0059, 0091: the light pen has a light emission unit that emits light, and when a luminance-increased area exists in the captured image due to the light emitted from the light pen, a contact of the light pen is detected; para. 0075), and wherein the controller analyzes the light emission pattern or the color of the light emission unit captured in the image to discriminate the type of the object captured in the image (Ono: para. 0059, 0075, 0091).

As to Claim 5, Matsumoto et al. in view of Bamji et al., as modified by Ono et al., disclose wherein the controller captures the object before the object comes into contact with the display surface based on the image acquired by the imaging device (Matsumoto: para. 0034, 0063; Ono: para. 0052), estimates that the object has come into contact with the display surface or estimates the contact position of the object on the display surface (Matsumoto: fig. 9, para. 0034, 0067-0068; Ono: para. 0052), and detects that the object captured by the imaging device has come into contact with the display surface and detects the contact position of the object based on the information detected by the optical distance measuring sensor (Matsumoto: para. 0034, 0044, 0067-0068; Bamji: para. 0063, 0068-0069, 0071, 0076; Ono: para. 0052, 0097).

Response to Arguments

Applicant's arguments with respect to claim 1 have been considered but are moot because the new ground of rejection is applied as necessitated by amendment.

Conclusion

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to DISMERY E. MERCEDES, whose telephone number is (571) 272-7558. The examiner can normally be reached Monday-Friday, 9am-5pm EST. Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Ke Xiao, can be reached at 571-272-7776. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/DISMERY MERCEDES/
Primary Examiner, Art Unit 2627
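For orientation, the finger-versus-light-pen determination the rejection takes from Ono, together with claim 1's requirement to change the type or color of the drawn image accordingly, reduces to a decision rule over the captured image. A minimal sketch follows, with hypothetical thresholds, field names, and display API; none of this comes from Ono's actual implementation.

```python
class Display:
    """Stand-in for the display device controller (hypothetical API)."""
    def draw(self, at, tool, color):
        print(f"draw {tool} stroke in {color} at {at}")

def discriminate(region):
    """Classify the pointing object from a captured-image region: a
    bright emission blob suggests a light pen (whose color can drive
    the stroke color); otherwise assume a finger. Threshold illustrative."""
    if region["peak_luminance"] > 200:
        return "pen", region["dominant_color"]
    return "finger", "black"

def on_contact(contact_pos, region, display):
    """Change the type/color of the image drawn at the contact position
    according to the discriminated object type."""
    tool, color = discriminate(region)
    display.draw(at=contact_pos, tool=tool, color=color)

on_contact((1.20, 0.80), {"peak_luminance": 230, "dominant_color": "red"},
           Display())
# -> draw pen stroke in red at (1.2, 0.8)
```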

Prosecution Timeline

Jan 10, 2024
Application Filed
Sep 07, 2024
Non-Final Rejection — §103
Nov 18, 2024
Response Filed
Jan 29, 2025
Final Rejection — §103
Apr 29, 2025
Request for Continued Examination
May 06, 2025
Response after Non-Final Action
May 17, 2025
Non-Final Rejection — §103
Aug 14, 2025
Response Filed
Oct 03, 2025
Non-Final Rejection — §103
Dec 17, 2025
Response Filed
Mar 24, 2026
Final Rejection — §103 (current)

Precedent Cases

Applications with similar technology granted by the same examiner

Patent 12602863
HAND-BASED LIGHT ESTIMATION FOR EXTENDED REALITY
Granted Apr 14, 2026 (2y 5m to grant)
Patent 12602130
ELECTRONIC DEVICE
Granted Apr 14, 2026 (2y 5m to grant)
Patent 12603031
DISPLAY DEVICE AND CONTROLLER
Granted Apr 14, 2026 (2y 5m to grant)
Patent 12597374
FOLDABLE DISPLAY DEVICE AND METHOD FOR COMPENSATING FOR DETERIORATION OF FLEXIBLE DISPLAY PANEL
Granted Apr 07, 2026 (2y 5m to grant)
Patent 12597387
DISPLAY PANEL, DISPLAY DEVICE AND PIXEL REPAIR METHOD
Granted Apr 07, 2026 (2y 5m to grant)
Based on the 5 most recent grants; study what changed in each case to get past this examiner.

Prosecution Projections

Expected OA Rounds: 6-7
Grant Probability: 77%
With Interview: 87% (+10.6%)
Median Time to Grant: 2y 9m
PTA Risk: High
Based on 964 resolved cases by this examiner. Grant probability derived from career allow rate.
