Prosecution Insights
Last updated: April 19, 2026
Application No. 18/862,217

DISPLAY DEVICE AND DISPLAY CONTROL METHOD

Final Rejection — §102, §103
Filed
Nov 01, 2024
Examiner
PARK, SANGHYUK
Art Unit
2623
Tech Center
2600 — Communications
Assignee
Sony Group Corporation
OA Round
2 (Final)
71%
Grant Probability
Favorable
3-4
OA Rounds
2y 6m
To Grant
88%
With Interview

Examiner Intelligence

Grants 71% — above average
71%
Career Allow Rate
509 granted / 717 resolved
+9.0% vs TC avg
Strong +16.5% interview lift
+16.5%
Interview Lift
resolved cases with interview
Typical timeline
2y 6m
Avg Prosecution
25 currently pending
Career history
742
Total Applications
across all art units

Statute-Specific Performance

§101
0.8%
-39.2% vs TC avg
§103
54.1%
+14.1% vs TC avg
§102
25.9%
-14.1% vs TC avg
§112
16.4%
-23.6% vs TC avg
Black line = Tech Center average estimate • Based on career data from 717 resolved cases

Office Action

§102 §103
Detailed Action

Response to Amendment

The amendment filed on 11/12/2025 has been entered and considered by the examiner.

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Specification

The title of the invention is not descriptive. A new title is required that is clearly indicative of the invention to which the claims are directed. The current title, "display device and display control method," is too broad and reads on the entire classification relating to display devices. The title must be amended to include further detail, such as the light-emitting units and the viewer position recognition unit.

Claim Rejections - 35 USC § 102

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action: A person shall be entitled to a patent unless: (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

Claims 1-4, 6, 7, 9, 10, 12, 13, and 20 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Yoon et al. (PGPUB 2013/0088486 A1).

As to claim 1, Yoon (Figs. 5, 8) teaches a display device (3D image display apparatus, Fig. 8), comprising: a backlight (light source substrate 100, point light sources 110, and line light source 10) that includes a plurality of light-emitting units (light emitting diode (LED) panel) (¶ 50, 51); a light beam control unit (lenticular lens sheet 200) configured to control light (i.e., light emitted from point light source 110 to line light source 10) that has entered from each light-emitting unit of the plurality of light-emitting units to travel in a corresponding direction (i.e., toward display panel 300 as shown in Fig. 8) based on a position of the light-emitting unit, wherein the light-emitting unit is a light source and emits the light (Figs. 6, 7, ¶ 61, 63-65); a display panel (display panel 300) that includes a display surface (Fig. 8), wherein the light that has entered from the light beam control unit is transmitted through the display panel and emitted from the display surface (Fig. 8, ¶ 88); a viewer position recognition unit (location tracking system 500) configured to recognize a viewer position, wherein the viewer position (observation position, Fig. 8) is a position of a viewer (i.e., tracking the pupil or face of the viewer) with respect to the display surface (¶ 60); and a light emission control unit (control unit 400) configured to control each light-emitting unit of the plurality of light-emitting units based on the viewer position and a direction in which light from each of the light-emitting units, as the light source (point light source 110), is emitted from the display surface (¶ 60: i.e., receive information on a position of the viewer's face or eyes and turn on the corresponding point light source).

As to claim 2, Yoon (Figs. 5, 8) teaches the display device according to claim 1, wherein the light emission control unit is further configured to: cause a first light-emitting unit that is a light source of light emitted from the display surface toward the viewer position to emit light, where the plurality of light-emitting units comprises the first light-emitting unit (¶ 82: i.e., an arbitrarily selected and controlled point light source 110); and dim or turn off a second light-emitting unit that is a light source of light emitted from the display surface in a direction different from the viewer position, where the plurality of light-emitting units comprises the second light-emitting unit (i.e., non-selected light-emitting units at different depths) (¶ 59, 107: i.e., one point light source, one line, or an arbitrary set of point light sources can be turned on depending on the user position to direct light toward the observation position; Fig. 8 shows the observation position relative to the light source position).

As to claim 3, Yoon (Figs. 5, 8) teaches wherein the light beam control unit is further configured to control, in a control direction that is a direction parallel to the display surface (Fig. 8: i.e., lenticular lens part 210 is parallel to the display panel 300 as shown in Fig. 8), a direction in which light that has entered from each light-emitting unit of the plurality of light-emitting units travels, and the light emission control unit is further configured to control each of the light-emitting units of the plurality of light-emitting units based on a position of each light-emitting unit in the control direction relative to the light beam control unit (Figs. 6, 8: i.e., lenticular lens part 210 changes the direction of light from the point light source 110; ¶ 59: i.e., controls to turn on one arbitrary set of point light sources toward the lenticular lens sheet to direct light toward the observation position).

As to claim 4, Yoon (Fig. 6) teaches wherein the light beam control unit is a lens that has a curvature in the control direction (Fig. 6, lenticular lens part 210 with curved surface).

As to claim 6, Yoon (Figs. 5, 8) teaches wherein the control direction is one direction parallel (i.e., the horizontal direction as shown in Fig. 5) to the display surface, and the light beam control unit is a lens array that includes a lens having a curvature in the one direction (Figs. 5, 6).

As to claim 7, Yoon (Fig. 6) teaches that the backlight is a direct backlight (one point light source, such as an LED) in which the plurality of light-emitting units faces a surface of the display panel opposite to the display surface (¶ 25, Fig. 8), and the light beam control unit is disposed between the backlight and the display panel (Fig. 8: i.e., lenticular lens sheet 200 is between light source substrate 100 and display panel 300).

As to claim 9, Yoon teaches the display device according to claim 1, further comprising a video generation unit (i.e., the circuit for driving display panel 300 of a conventional LCD) configured to generate video (i.e., a 3D image or 2D image displayed on the LCD) to be displayed on the display surface (¶ 57, 60: i.e., determine whether to generate a 3D image and display based on the determination of who views the 3D image), wherein the light emission control unit is configured to control each light-emitting unit of the plurality of light-emitting units based on brightness (i.e., turning on/off) of each region of the video, the emission direction (¶ 50: i.e., individually operating optical fibers for the corresponding light source 110), and the viewer position (i.e., turn on/off based on the viewer position).

As to claim 10, Yoon (Fig. 8) teaches a video generation unit (i.e., the circuit for driving display panel 300 of a conventional LCD) configured to generate video (i.e., a 3D image or 2D image displayed on the LCD) to be displayed on the display surface (¶ 57, 60: i.e., determine whether to generate a 3D image and display based on the determination of who views the 3D image), wherein the video generation unit is further configured to correct the video based on the viewer position and a control state of each light-emitting unit of the plurality of light-emitting units by the light emission control unit (¶ 67: i.e., point light sources turn on based on feedback of the location tracking system such that the 3D image is continuous, regardless of viewer movement, by viewing zone forming position variation).

As to claim 12, Yoon (Fig. 8) teaches a sensor (location tracking system 500) configured to detect a viewer facing the display surface (¶ 60), wherein the viewer position recognition unit is further configured to recognize the viewer position based on the detection of the sensor (¶ 60: i.e., pupil/face tracking).

As to claim 13, Yoon (Fig. 8) teaches that the sensor is further configured to track at least one of a face or an eye of the viewer facing the display surface (pupil tracking) (¶ 60), and the viewer position recognition unit is further configured to recognize the viewer position based on a result of the tracking (¶ 60: i.e., use pupil tracking to determine the viewer's viewing position).

As to claim 20, Yoon (Figs. 5, 8) teaches a control method for a display device (3D image display apparatus, Fig. 8), comprising: recognizing a viewer position (observation position, Fig. 8), wherein the viewer position is a position of a viewer relative to a display surface (i.e., the surface of display panel 300) of a display panel (display panel 300) (¶ 60); and controlling (i.e., via control unit 400) each light-emitting unit of a plurality of light-emitting units (light emitting diode (LED) panel) included in a backlight (light source substrate 100, point light source 110, and line light source 10) based on the viewer position and a direction in which light from each light-emitting unit of the plurality of light-emitting units, as a light source (point light source 110), is emitted from the display surface (¶ 59, 107: i.e., one point light source, one line, or an arbitrary set of point light sources can be turned on depending on the user position to direct light toward the observation position; Fig. 8 shows the observation position relative to the light source position), wherein each light-emitting unit of the plurality of light-emitting units emits light that enters a light beam control unit (lenticular lens sheet 200) and is controlled to travel in a corresponding specific direction (i.e., toward display panel 300) based on a position of the light-emitting unit, the light-emitting unit is the light source and emits the light, and the light is transmitted through the display panel and emitted from the display surface (Figs. 6, 7, ¶ 61, 63-65).

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 5, 14-16, and 19 are rejected under 35 U.S.C. 103 as being unpatentable over Yoon in view of Karafin et al. (PGPUB 2020/0296327 A1).

As to claim 5, Yoon (Figs. 6, 8) teaches that the control direction is parallel to the display surface (¶ 58, Fig. 8: i.e., the surface of lenticular lens sheet 200 is parallel to the surface of display panel 300). Yoon does not specifically teach a lens having a curvature in all directions. Karafin (Fig. 3A) teaches that the light beam control unit is a lens array that includes a lens having a curvature in all directions (spherical lens surface) (¶ 47). It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to incorporate Karafin's display structure into Yoon's display, so as to provide a display device with different lens assembly options than a lenticular lens (¶ 47) and a tailored user experience by utilizing viewer profiles (¶ 122, 123).

As to claim 14, Yoon teaches the display device of claim 13, but does not specifically teach tracking a line of sight. Karafin (Fig. 5) teaches wherein the sensor is further configured to track a line of sight of the viewer facing the display surface (gaze tracking, i.e., where an eye is looking) (¶ 77), and the viewer position recognition unit is further configured not to recognize, based on the line of sight not being directed toward the display surface, the position of the viewer as the viewer position (¶ 77: i.e., tracking information includes gaze tracking; ¶ 80: i.e., display based on tracking information). It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to incorporate Karafin's display structure into Yoon's display, so as to provide a display device with different lens assembly options than a lenticular lens (¶ 47) and a tailored user experience by utilizing viewer profiles (¶ 122, 123).
As to claim 15, Yoon teaches the display device of claim 13, but does not specifically teach a registered viewer. Karafin (Fig. 5) teaches wherein the sensor is further configured to identify the viewer facing the display surface (¶ 127: i.e., identifies a face), and the light emission control unit is further configured not to recognize, based on the viewer facing the display surface not being a viewer registered in advance, the position of the viewer as the viewer position (¶ 123: i.e., authorization to view confidential information; ¶ 124: i.e., selectively not display the holographic content to a specific user). It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to incorporate Karafin's display structure into Yoon's display, so as to provide a display device with different lens assembly options than a lenticular lens (¶ 47) and a tailored user experience by utilizing viewer profiles (¶ 122, 123).

As to claim 16, Yoon teaches the display device of claim 12, but does not specifically teach the sensor type. Karafin (Fig. 5) teaches wherein the sensor is further configured to detect the viewer facing the display surface based on at least one of a temperature, infrared rays (infrared, ¶ 79), sound (sound, ¶ 121), or a radio wave (RFID, ¶ 127). It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to incorporate Karafin's display structure into Yoon's display, so as to provide a display device with different lens assembly options than a lenticular lens (¶ 47) and a tailored user experience by utilizing viewer profiles (¶ 122, 123).

As to claim 19, Yoon (Figs. 5, 8) teaches a non-transitory computer-readable medium having stored thereon computer-executable instructions which, when executed by a computer, cause the computer to execute operations, the operations comprising: recognizing a viewer position (observation position, Fig. 8), wherein the viewer position is a position of a viewer (i.e., tracking the pupil or face of the viewer) relative to a display surface (i.e., the surface of display panel 300) of a display panel (display panel 300) (¶ 60: location tracking system 500); and controlling each light-emitting unit of a plurality of light-emitting units (light emitting diode (LED) panel) included in a backlight (point light source 110) based on the viewer position and a direction (i.e., toward display panel 300 as shown in Fig. 8) in which light from each light-emitting unit of the plurality of light-emitting units, as a light source, is emitted from the display surface (Figs. 6, 7, ¶ 61, 63-65), wherein each light-emitting unit of the plurality of light-emitting units emits light that enters a light beam control unit (lenticular lens sheet 200) and is controlled to travel in a direction based on a position of the light-emitting unit, the light-emitting unit is the light source and emits the light, and the light is transmitted through the display panel and emitted from the display surface (Figs. 6, 7, ¶ 61, 63-65). Yoon does not specifically teach a program. Karafin (Fig. 3A) teaches a program (computer program) that causes a display device to operate (¶ 204). It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to incorporate Karafin's display structure into Yoon's display, so as to provide a display device with different lens assembly options than a lenticular lens (¶ 47) and a tailored user experience by utilizing viewer profiles (¶ 122, 123).

Claim 8 is rejected under 35 U.S.C. 103 as being unpatentable over Yoon in view of Minami et al. (PGPUB 2016/0161659 A1).

As to claim 8, Yoon teaches the display device of claim 1, but does not specifically teach edge backlights. Minami (Fig. 3) teaches that the backlight includes a plurality of backlights that are edge backlights (edge type light source 122) in which the plurality of light-emitting units is arranged along a peripheral edge of the display panel (¶ 39, Fig. 3), and the light beam control unit is disposed between the backlights (¶ 56: i.e., a lenticular lens between the display panel 11 and the light guide plate 121; the lenticular lens would be positioned along the horizontal direction as shown in Fig. 3). It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to incorporate Minami's display structure into Yoon's display, so as to provide uniform distribution of light (¶ 26, 48).

Claim 11 is rejected under 35 U.S.C. 103 as being unpatentable over Yoon in view of Trayner et al. (PGPUB 2012/0013651 A1).

As to claim 11, Trayner (Fig. 1d) teaches wherein the light emission control unit is further configured to: control, based on the number of viewer positions being equal to or less than a specific number (i.e., three viewers as shown in Fig. 1d), each light-emitting unit of the plurality of light-emitting units based on the emission direction and the viewer position, and cause, based on the number of viewer positions exceeding the specific number, each light-emitting unit of the plurality of light-emitting units to emit light (¶ 125: i.e., when all viewing zones 10a, 10b, 10c are presented with a corresponding viewer, all three light sources 1a, 1b, 1c must be on at the same time for all viewers to simultaneously view the displayed images). It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to incorporate Trayner's display structure into Yoon's display, so as to provide stereoscopic images with full parallax with the best achievable quality in the X, Y, and Z directions (¶ 95).
Allowable Subject Matter

Claims 17 and 18 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.

Claim 17 recites the limitation, "when the viewer position is not recognized by the viewer position recognition unit, a light-emitting unit that is a light source of light emitted from the display surface toward an outermost side, of the plurality of light-emitting units, to emit light and dims or turns off a light-emitting unit that is a light source of light emitted from the display surface to a direction that is not the outermost side".

Claim 18 recites the limitation, "a light-emitting unit that is a light source of light emitted from the display surface toward the viewer position and a light-emitting unit that is a light source of light emitted from the display surface toward the outermost side to emit light and dims or turns off a light-emitting unit that is a light source of light emitted from the display surface to a direction that is different from the viewer position and is not the outermost side".

Response to Arguments

Applicant's arguments filed 11/12/2025 have been fully considered but they are not persuasive. Applicant has amended claims 1, 19, and 20 to recite the limitation, "a light beam control unit configured to control light, that has entered from each light-emitting unit of the plurality of light-emitting units, to travel in a corresponding specific direction based on a position of a light-emitting unit, wherein the light-emitting unit is a light source, and the light-emitting unit emits light". Applicant further argues that this limitation is not taught by the Yoon prior art. Examiner respectfully disagrees. Yoon teaches the lenticular lens sheet 200, which is configured to control the direction of light emitted by the point light sources 110 and 110'. The specific direction is the direction of light from the lenticular lens sheet 200 to the line light source 10. The travel direction is particularly dependent on the depth, or position, of the point light source 110, as shown in Fig. 11. Examiner considers that the "control" limitation is taught by Yoon, since there is a manipulation of light direction. Further, the claim language is broad and still reads on the Yoon prior art.

Conclusion

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Inquiry

Any inquiry concerning this communication or earlier communications from the examiner should be directed to SANGHYUK PARK, whose telephone number is (571) 270-7359. The examiner can normally be reached Monday-Friday, 10:00 AM - 6:00 PM. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Chanh Nguyen, can be reached at (571) 272-7772. The fax phone number for the organization where this application or proceeding is assigned is (571) 273-8300. Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system.
Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at (866) 217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call (800) 786-9199 (IN USA OR CANADA) or (571) 272-1000.

/SANGHYUK PARK/
Primary Examiner, Art Unit 2623
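For readers parsing the claim-mapping language above, the architecture the examiner reads onto Yoon (a tracked viewer position driving selective backlight activation through a lens layer that steers each unit's light to a position-dependent exit direction) can be sketched roughly as follows. This is an illustrative sketch only; every name, angle, and tolerance is hypothetical and comes from neither the application nor the cited references.

```python
from dataclasses import dataclass

@dataclass
class LightEmittingUnit:
    """One backlight unit; the lens sheet steers its light to a fixed
    horizontal exit angle determined by the unit's position."""
    index: int
    exit_angle_deg: float  # direction the unit's light leaves the display surface

def select_units(units, viewer_angle_deg, tolerance_deg=5.0):
    """Light emission control: keep lit only the units whose steered exit
    direction points toward the tracked viewer position; the rest would be
    dimmed or turned off (cf. the claim 1 / claim 2 mapping above)."""
    return [u.index for u in units
            if abs(u.exit_angle_deg - viewer_angle_deg) <= tolerance_deg]

# Five hypothetical units steered from -20 to +20 degrees; viewer tracked at +9.
units = [LightEmittingUnit(i, a) for i, a in enumerate([-20.0, -10.0, 0.0, 10.0, 20.0])]
print(select_units(units, 9.0))  # only the +10-degree unit is within tolerance -> [3]
```

The point of the sketch is only that emission control depends on both the per-unit exit direction and the viewer position, which is the combination the amended independent claims recite.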

Prosecution Timeline

Nov 01, 2024
Application Filed
Aug 09, 2025
Non-Final Rejection — §102, §103
Nov 12, 2025
Response Filed
Mar 13, 2026
Final Rejection — §102, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602134
ELECTRONIC DEVICE
2y 5m to grant Granted Apr 14, 2026
Patent 12603055
DISPLAY DEVICE INCLUDING A SWEEP DRIVER THAT PROVIDES A SWEEP SIGNAL, AND ELECTRONIC DEVICE INCLUDING THE DISPLAY DEVICE
2y 5m to grant Granted Apr 14, 2026
Patent 12594141
SYSTEMS, METHODS, AND MEDIA FOR PRESENTING BIOPHYSICAL SIMULATIONS IN AN INTERACTIVE MIXED REALITY ENVIRONMENT
2y 5m to grant Granted Apr 07, 2026
Patent 12591322
TOUCH INPUT SYSTEM INCLUDING PEN AND CONTROLLER
2y 5m to grant Granted Mar 31, 2026
Patent 12592207
GATE LINE DRIVING CIRCUIT WITH TOP GATE AND BOTTOM GATE
2y 5m to grant Granted Mar 31, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

3-4
Expected OA Rounds
71%
Grant Probability
88%
With Interview (+16.5%)
2y 6m
Median Time to Grant
Moderate
PTA Risk
Based on 717 resolved cases by this examiner. Grant probability derived from career allow rate.
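A note on the arithmetic behind these projections: assuming (the page does not confirm this) that the grant probability is simply the examiner's career allow rate and that the with-interview figure adds the interview lift to it, the headline numbers reproduce from the stated inputs:

```python
# Rederiving the dashboard's headline figures from its stated inputs.
granted, resolved = 509, 717                       # examiner's career record, from above
allow_rate_pct = round(granted / resolved * 100)   # 509/717 is about 70.99 -> 71
lift_pct = 16.5                                    # interview lift shown above
with_interview = allow_rate_pct + lift_pct         # 71 + 16.5 = 87.5, displayed as 88%
print(allow_rate_pct, with_interview)
```

The 88% figure is consistent with rounding 87.5 up; how the tool actually combines these statistics is an assumption here, not documented on the page.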
