Prosecution Insights
Last updated: April 19, 2026
Application No. 18/377,762

PORTABLE ELECTRONIC DEVICES AND METHODS OF USING THE SAME

Final Rejection — §102, §103
Filed: Oct 06, 2023
Examiner: GE, JIN
Art Unit: 2619
Tech Center: 2600 — Communications
Assignee: Nanjing Easthouse Electrical Co. Ltd.
OA Round: 2 (Final)
Grant Probability: 80% (Favorable)
Expected OA Rounds: 3-4
Median Time to Grant: 2y 9m
With Interview: 98%

Examiner Intelligence

Career Allow Rate: 80% (above average) • 416 granted / 520 resolved • +18.0% vs TC avg
Interview Lift: +18.0% (strong) • resolved cases with interview vs. without
Avg Prosecution: 2y 9m (typical timeline) • 38 currently pending
Total Applications: 558 (career history, across all art units)

Statute-Specific Performance

§101: 9.0% (-31.0% vs TC avg)
§103: 60.2% (+20.2% vs TC avg)
§102: 12.0% (-28.0% vs TC avg)
§112: 11.0% (-29.0% vs TC avg)
Black line = Tech Center average estimate • Based on career data from 520 resolved cases
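The per-statute deltas above imply the Tech Center baseline that the chart's black line represented. A minimal sketch of that arithmetic, assuming (as the labels suggest) that each delta is simply the examiner's rate minus the TC average; the variable names are illustrative, not from the source:

```python
# Hypothetical reconstruction of the "vs TC avg" deltas shown above.
# Assumption: delta = examiner_rate - tc_average (percentage points).
examiner_rates = {"101": 9.0, "103": 60.2, "102": 12.0, "112": 11.0}
deltas = {"101": -31.0, "103": 20.2, "102": -28.0, "112": -29.0}

# Solve for the implied TC average per statute.
tc_averages = {s: round(examiner_rates[s] - deltas[s], 1) for s in examiner_rates}
```

Notably, under this assumption all four statutes imply the same ~40% baseline, which is consistent with a single TC-wide average being used for every bar.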

Office Action

§102 §103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Amendment

This is in response to applicant's amendment/response filed on 09/15/2025, which has been entered and made of record. Claims 1 and 10 have been amended. Claims 1-18 are pending in the application.

Response to Arguments

Applicant's arguments filed on 09/15/2025 have been fully considered but they are not persuasive. Applicant submitted new amended claims. Accordingly, new grounds of rejection are set forth below. The new grounds of rejection have been necessitated by Applicant's amendments to the claims.

Applicant states that "These picture comparisons have shown the difference between the current application and the Ho reference. Amended independent claim 1 teaches: an image displayed on the display screen of the portable electronic device always maintains a landscape view, and a top edge of the landscape view of the display is always aligned with the user's face. Claims 2-9 depend from now-allowable amended claim 1. Thus, for at least the same reasons, claims 2-9 are also patentable under 35 U.S.C. § 102(a)(1) over Ho. Further, Applicant has amended independent claim 10 to recite limitations/features similar to one or more of the above distinguishing limitations/features of amended claim 1. Therefore, Applicant submits that, for at least the foregoing reasons, amended independent claim 10 and its corresponding dependent claims 11-18 are also patentable under 35 U.S.C. § 102(a)(1) over Ho. In view of the above, Ho does not disclose each and every limitation of amended independent claim 1, and, therefore, Ho does not anticipate amended independent claim 1. For at least the foregoing reasons, amended independent claim 1 is patentable under 35 U.S.C. § 102(a)(1) over Ho."
The examiner disagrees. Prior art Ho et al. teach wherein an image displayed on the display screen of the portable electronic device always maintains a landscape view (Figs 2-4, par 0026-0027, "application user interface 108A is displayed horizontally across the long side of display 108 for the user to view. In some embodiments, the landscape orientation may include a landscape left orientation (e.g., application user interface 108A is rotated left 90° from the normal portrait orientation), shown in FIG. 4, or a landscape right orientation (e.g., application user interface 108A is rotated right 90° from the normal portrait orientation), shown in FIG. 5" … display image 108A always maintains a landscape orientation based on the eye position of the user when the device changes orientation), and a top edge of the landscape view of the display is always aligned with the user's face (Figs 2-3, par 0026, top edge aligns with user's face (in vertical position); Figs 4-5, par 0027, top edge aligns with user's face (in horizontal position)). The rejection of the independent claims is therefore maintained, and the rejections of all dependent claims are maintained for the same reasons.

Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

Claims 1-3 and 10-12 are rejected under 35 U.S.C. 102(a)(1) as anticipated by U.S. PGPub 2020/0104033 to Ho et al.

Regarding claim 1, Ho et al.
teach a portable electronic device (Fig 1, par 0024, a "mobile device"), comprising: a plurality of display event detection sensors (Fig 1, par 0028-31, image sensors and inertial sensors), wherein the plurality of display event detection sensors is used for detecting one or more display events on the display screen of the portable electronic device and detecting an orientation of a user's face (par 0028-0031, "image sensor 103 is an IR image sensor and the image sensor is used to capture infrared images used for face detection, facial recognition authentication, and/or depth detection. Other embodiments of image sensor 103 (e.g., an RGB image sensor) may also be contemplated for use in face detection, facial recognition authentication, and/or depth detection as described herein", par 0045-0046, "wherein the plurality of display event detection sensors is used for detecting one or more display events on the display screen of the portable electronic device and detecting an orientation of a user's face"); and a display orientation control module (par 0045-0046, "data from images of a user captured and processed by ISP 110 and/or SEP 112 (e.g., images captured and processed in a facial recognition authentication process) is used to assist determining orientation of display 108 on device 100. For example, data from captured images of the user may be used to determine an orientation of display 108 (e.g., one of the orientations of display 108 shown in FIGS.
2-5) when data from inertial sensors 109 is ambiguous or inconclusive in determining the orientation of the display”), wherein the display orientation control module sets an orientation of a display screen of the portable electronic device to always maintain a landscape view of a display on the portable electronic device for the user (Figs 4-5, par 0027, “In the landscape orientation, application user interface 108A is displayed horizontally across the long side of display 108 for the user to view. In some embodiments, the landscape orientation may include a landscape left orientation (e.g., application user interface 108A is rotated left 90° from the normal portrait orientation), shown in FIG. 4, or a landscape right orientation (e.g., application user interface 108A is rotated right 90° from the normal portrait orientation), shown in FIG. 5”); wherein the display orientation control module comprises: a processor, and a non-volatile memory storing an operating system and a display control application (Fig 1, par 0024), wherein the display control application comprises: a display module, when executed by the processor, the display control application causes the processor to perform (par 0035, par 0045-0046, “software executed by processor 104 during use may control the other components of device 100 to realize the desired functionality of the device. The processors may also execute other software. These applications may provide user functionality, and may rely on the operating system for lower-level device control, scheduling, memory management, etc.”): detecting, by the plurality of display event detection sensors (Fig 1, par 0028-31, image sensors and inertial sensors), the orientation of the user's face (par 0005, par 0065-0067, “ face orientation data obtained from a face detection process is used to determine or update the orientation of an application user interface (e.g., text and/or content) being displayed on a display of a device. 
The face detection process may operate on images of the user captured during a facial recognition process or an attention detection process being operated by a facial recognition network (e.g., an image signal processor network)”); and setting, when at least one of a plurality of display event detection sensors detects at least one display event, by the display module the orientation of the display screen of the portable electronic device to the landscape view on the portable electronic device to ensure an upright landscape view is displayed according to the orientation of the user's face detected (Figs 4-5, par 0027, “In the landscape orientation, application user interface 108A is displayed horizontally across the long side of display 108 for the user to view. In some embodiments, the landscape orientation may include a landscape left orientation (e.g., application user interface 108A is rotated left 90° from the normal portrait orientation), shown in FIG. 4, or a landscape right orientation (e.g., application user interface 108A is rotated right 90° from the normal portrait orientation), shown in FIG. 5”), wherein an image displayed on the display screen of the portable electronic device always maintain a landscape view (Figs 2-4, par 0026-0027, “application user interface 108A is displayed horizontally across the long side of display 108 for the user to view. In some embodiments, the landscape orientation may include a landscape left orientation (e.g., application user interface 108A is rotated left 90° from the normal portrait orientation), shown in FIG. 4, or a landscape right orientation (e.g., application user interface 108A is rotated right 90° from the normal portrait orientation), shown in FIG. 5”….. 
display image 108A always maintains a landscape orientation based on the eye position of the user when the device changes orientation), and a top edge of the landscape view of the display is always aligned with the user's face (Figs 2-3, par 0026, top edge aligns with user's face (in vertical position); Figs 4-5, par 0027, top edge aligns with user's face (in horizontal position)).

Regarding claim 2, Ho et al. teach all the limitations of claim 1, and further teach wherein the portable electronic device comprises: a lightweight, electrically-powered electronic device functionally capable of communications, data processing and data utility (Ho et al.: par 0024).

Regarding claim 3, Ho et al. teach all the limitations of claim 2, and further teach wherein the portable electronic device comprises at least one of: a smartphone; a tablet computer; an electronic reader; a global positioning system (GPS) receiver; a medical device; a digital camera; a video camera; and a handheld computer game console (Ho et al.: par 0024, "Examples of mobile devices include mobile telephones or smart phones, and tablet computers").

Regarding claims 10-12, the method claims 10-12 are similar in scope to claims 1-3 and are rejected under the same rationale.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 4 and 13 are rejected under 35 U.S.C.
103 as being unpatentable over U.S. PGPub 2020/0104033 to Ho et al. in view of U.S. PGPub 2014/0347282 to Woo.

Regarding claim 4, Ho et al. teach all the limitations of claim 3, and further teach wherein the plurality of display event detection sensors comprises: one or more front facing cameras (Ho et al.: par 0024); one or more accelerometers (Ho et al.: par 0028); and one or more touch screen sensors (Ho et al.: par 0025), but do not explicitly teach wherein the plurality of display event detection sensors comprises: one or more magnetometers. In a related endeavor, Woo further teaches wherein the plurality of display event detection sensors comprises: one or more front facing cameras (par 0015, par 0035); one or more magnetometers (par 0030); one or more accelerometers (par 0030, par 0034); and one or more touch screen sensors (par 0033, par 0036). It would have been obvious to a person of ordinary skill in the art at the time before the effective filing date of the claimed invention to modify Ho et al. to include wherein the plurality of display event detection sensors comprises: one or more magnetometers, as taught by Woo, to detect orientation information of the portable terminal based on the sensors and change a view mode of a screen based on the detected orientation information.

Regarding claim 13, Ho et al. teach all the limitations of claim 12; the method claim 13 is similar in scope to claim 4 and is rejected under the same rationale.

Claims 5 and 14 are rejected under 35 U.S.C. 103 as being unpatentable over U.S. PGPub 2020/0104033 to Ho et al. in view of U.S. PGPub 2014/0347282 to Woo, further in view of U.S. PGPub 2015/0128075 to Kempinski.

Regarding claim 5, Ho et al.
as modified by Woo teach all the limitations of claim 4, but are silent regarding wherein the front facing camera comprises at least one of: an optical imaging device; a laser imaging device; and an infrared imaging device. In a related endeavor, Kempinski teaches wherein the front facing camera comprises at least one of: an optical imaging device; a laser imaging device; and an infrared imaging device (par 0031, par 0046, "Video camera 112 may be understood to include any device (e.g., operating to acquire images using visible, infrared, or other radiation), such as a camera, three-dimensional imager, Infra Red light emitting diode (IR LED), scanner, or other device that is capable of acquiring a series of frames that contain images, or other spatial information, regarding an imaged object, such as a user's eyes, face, head, or body"). It would have been obvious to a person of ordinary skill in the art at the time before the effective filing date of the claimed invention to modify Ho et al. as modified by Woo to include wherein the front facing camera comprises at least one of: an optical imaging device; a laser imaging device; and an infrared imaging device, as taught by Kempinski, to accurately track gaze direction and modify the display content to provide a comfortable view on the limited-size screen display of a mobile device.

Regarding claim 14, Ho et al. as modified by Woo teach all the limitations of claim 13; the method claim 14 is similar in scope to claim 5 and is rejected under the same rationale.

Claims 6-7 and 15-16 are rejected under 35 U.S.C. 103 as being unpatentable over U.S. PGPub 2020/0104033 to Ho et al. in view of U.S. PGPub 2014/0347282 to Woo, further in view of U.S. PGPub 2015/0128075 to Kempinski, further in view of China PGPub CN103376893 to Meng et al.

Regarding claim 6, Ho et al.
as modified by Woo and Kempinski teach all the limitations of claim 5, and Woo further teaches wherein the display event comprises: the portable electronic device is tilted at a predetermined angle (par 0038, par 0046-0047, par 0070); and the portable electronic device is turned around (par 0038, par 0046-0047, par 0051, par 0070, claim 7), but are silent regarding the front facing cameras of the portable electronic device being blocked for a predetermined duration of time. In a related endeavor, Meng et al. teach the front facing cameras of the portable electronic device being blocked for a predetermined duration of time (par 0069-0072, "by the state that is arranged on the gravity sensor judgement terminal on the terminal. When photographing module is closed, then can't detect by photographing module user's human eye direction. when terminal is in erectility, present display frame in perpendicular screen mode, when terminal is in edge-on state, present display frame in horizontal screen mode, finish current flow process" … i.e., generating events to adjust the display layout when the photographing module is closed (blocked)). It would have been obvious to a person of ordinary skill in the art at the time before the effective filing date of the claimed invention to modify Ho et al. as modified by Woo and Kempinski to include the front facing cameras of the portable electronic device being blocked for a predetermined duration of time, as taught by Meng et al., to provide an exact detection method using a gravity sensor to adjust the display content when the camera sensor is blocked/disabled, to meet the visual habits of the user and improve the user experience.

Regarding claim 7, Ho et al. as modified by Woo, Kempinski, and Meng et al.
teach all the limitations of claim 6, and Woo further teaches wherein tilting, rotating and turning of the portable electronic device are detected by (par 0038, par 0046-0047, par 0070, "when the locations of the eyes of the user are detected, the calculator 240 compares a connection line of the two eyes of the user with a reference line based on a horizontal line corresponding to the orientation of the portable terminal. When an angle (slope) of the connection line is less than a reference angle (e.g., 45°), the calculator 240 determines a current view mode as a reference view mode. The calculator 240 compares a connection line of the two eyes of the user with a reference line based on a horizontal line corresponding to the orientation of the portable terminal. When an angle (slope) of the connection line is equal to or greater than a reference angle (e.g., 45°), the calculator 240 may change a current view mode and determine the changed view mode as a reference view mode"): one or more front facing cameras (par 0015, par 0035); one or more magnetometers (par 0030); one or more accelerometers (par 0030, par 0034); and one or more touch screen sensors (par 0033, par 0036). This would be obvious for the same reason given in the rejection of claim 4.

Regarding claims 15-16, Ho et al. as modified by Woo and Kempinski teach all the limitations of claim 14; the method claims 15-16 are similar in scope to claims 6-7 and are rejected under the same rationale.

Claims 8 and 17 are rejected under 35 U.S.C. 103 as being unpatentable over U.S. PGPub 2020/0104033 to Ho et al. in view of U.S. Patent 9,262,999 to Froment et al., further in view of U.S. PGPub 2003/0223622 to Simon et al.

Regarding claim 8, Ho et al. teach all the limitations of claim 1, and Ho et al.
further teach wherein detecting, by the plurality of display event detection sensors, the orientation of the user's face comprises: detecting one or more facial characters of the user's face (par 0049), but are silent regarding detecting the user's hair line; detecting the user's eye line; detecting the user's eyebrow line; detecting the user's nose; detecting the user's neck; detecting the user's shoulder; and detecting a combination of these user's features. In a related endeavor, Froment et al. teach detecting the user's hair line; detecting the user's eye line; detecting the user's nose; detecting the user's shoulder; and detecting a combination of these user's features (col 2:13-27, col 4:31-49, col 12:4-10, "The feature data 820 may include data representative of the user's 102 features, for example, the user's 102 eyes, mouth, ears, nose, shoulders of the user, a hat worn by the user, facial hair of the user, jewelry worn by the user, or glasses worn by the user, and so forth. The feature data 820 may include first eye data representative of the user's 102 first eye, and second eye data representative of the user's 102 second eye"). In a related endeavor, Simon et al. teach detecting one or more facial characters of the user's face including detecting the user's hair line; detecting the user's eye line; detecting the user's eyebrow line; detecting the user's nose; detecting the user's neck; and detecting a combination of these user's features (Figs 4-5, par 0011, par 0050-0051, detecting the user's hair line, eye line, eyebrow line, nose, and neck). It would have been obvious to a person of ordinary skill in the art at the time before the effective filing date of the claimed invention to modify Ho et al.
to include detecting the user's hair line; detecting the user's eye line; detecting the user's nose; detecting the user's shoulder; and detecting a combination of these user's features, as taught by Froment et al., to accurately generate feature data including data representative of the user's eyes, mouth, nose, ears, and shoulders, a hat worn by the user, facial hair of the user, jewelry worn by the user, or glasses worn by the user, and so forth, based on image data acquired with a camera. It would also have been obvious to a person of ordinary skill in the art at the time before the effective filing date of the claimed invention to modify Ho et al. as modified by Froment et al. to include detecting one or more facial characters of the user's face including detecting the user's hair line; detecting the user's eye line; detecting the user's eyebrow line; detecting the user's nose; detecting the user's neck; and detecting a combination of these user's features, as taught by Simon et al., to detect feature data including the hair line, eye line, eyebrow line, nose, neck, and any combination thereof based on image data acquired with a camera, to develop a system that uses automated and semi-automated portrait image enhancement methods to enable the facile retouching of portraits.

Regarding claim 17, Ho et al. teach all the limitations of claim 10; the method claim 17 is similar in scope to claim 8 and is rejected under the same rationale.

Claims 9 and 18 are rejected under 35 U.S.C. 103 as being unpatentable over U.S. PGPub 2020/0104033 to Ho et al. in view of U.S. PGPub 2003/0223622 to Simon et al.

Regarding claim 9, Ho et al. teach all the limitations of claim 1, but are silent regarding wherein the portable electronic device comprises an image enhancement device. In a related endeavor, Simon et al.
teach wherein the portable electronic device comprises an image enhancement device (abstract, par 0011, providing an enhancement function for a photographed image for display). It would have been obvious to a person of ordinary skill in the art at the time before the effective filing date of the claimed invention to modify Ho et al. to include wherein the portable electronic device comprises an image enhancement device, as taught by Simon et al., to develop a system that uses automated and semi-automated portrait image enhancement methods to enable the facile retouching of portraits, producing an enhanced digital image from the digital image.

Regarding claim 18, Ho et al. teach all the limitations of claim 10; the method claim 18 is similar in scope to claim 9 and is rejected under the same rationale.

Conclusion

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to Jin Ge, whose telephone number is (571) 272-5556. The examiner can normally be reached 8:00 to 5:00.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Jason Chan, can be reached at (571) 272-3022. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/JIN GE/
Primary Examiner, Art Unit 2619

Prosecution Timeline

Oct 06, 2023
Application Filed
Jun 16, 2025
Non-Final Rejection — §102, §103
Sep 15, 2025
Response Filed
Oct 01, 2025
Final Rejection — §102, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12592024
QUANTIFICATION OF SENSOR COVERAGE USING SYNTHETIC MODELING AND USES OF THE QUANTIFICATION
2y 5m to grant • Granted Mar 31, 2026
Patent 12586296
METHODS AND PROCESSORS FOR RENDERING A 3D OBJECT USING MULTI-CAMERA IMAGE INPUTS
2y 5m to grant • Granted Mar 24, 2026
Patent 12579704
VIDEO GENERATION METHOD AND APPARATUS, DEVICE, AND STORAGE MEDIUM
2y 5m to grant • Granted Mar 17, 2026
Patent 12573164
DESIGN DEVICE, PRODUCTION METHOD, AND STORAGE MEDIUM STORING DESIGN PROGRAM
2y 5m to grant • Granted Mar 10, 2026
Patent 12573151
PERSONALIZED DEFORMABLE MESH BY FINETUNING ON PERSONALIZED TEXTURE
2y 5m to grant • Granted Mar 10, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.

AI Strategy Recommendation

Get an AI-powered prosecution strategy using examiner precedents, rejection analysis, and claim mapping.

Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 80%
With Interview: 98% (+18.0%)
Median Time to Grant: 2y 9m
PTA Risk: Moderate
Based on 520 resolved cases by this examiner. Grant probability derived from career allow rate.
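The footnote's derivation reduces to simple arithmetic. A minimal sketch, assuming (as stated) that the headline grant probability is just the examiner's career allow rate, and that the interview figure adds the +18.0-point lift reported above, capped at 100%:

```python
# Illustrative reconstruction of the projection figures; the cap at 100%
# is an assumption, not something the source states.
granted, resolved = 416, 520       # examiner's career totals
interview_lift = 18.0              # percentage points, from the examiner stats

grant_probability = round(100 * granted / resolved)          # career allow rate
with_interview = min(grant_probability + interview_lift, 100.0)
# 416/520 gives 80%, and adding the lift gives 98%
```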
