DETAILED ACTION
Notice of Pre-AIA or AIA Status
1. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Information Disclosure Statement
2. The information disclosure statement (IDS) submitted on November 15, 2024 has been considered by the examiner.
Claim Rejections - 35 USC § 102
3. In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
4. The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.
5. Claims 1-2, 4-5, 12-13, and 15-16 are rejected under 35 U.S.C. 102(a)(1) and (a)(2) as being anticipated by Hicks et al. (U.S. Patent Application Publication No. 2018/0075820 A1), hereinafter referred to as Hicks.
6. Regarding claim 1, Hicks teaches a method of adaptively controlling rendered contents in an electronic device, comprising: monitoring a current content status of an application (APP) on a real-time basis; and presenting contents of the APP with an adjustable graphic quality which is adaptively adjusted based on the current content status of the APP (Paragraph 25 teaches that rapid motions of the user’s head and eyes above a threshold can reduce the visual quality and frame rate. The content status of the application running on the head mounted device can include the user’s movements. Because the user can be considered a player in the application, the user’s movements are part of the content status of the application. Thus, the content status, i.e., the user’s movements, is monitored and the contents of the APP are adaptively adjusted).
7. Regarding claim 2, Hicks teaches the limitations of claim 1. Hicks further teaches the method, further comprising: monitoring the current content status of the APP for determining whether a player character in the contents of the APP is moving faster than a predetermined speed; presenting the contents of the APP with a first graphic quality when determining that the player character in the contents of the APP is not moving faster than the predetermined speed; and presenting the contents of the APP with a second graphic quality when determining that the player character in the contents of the APP is moving faster than the predetermined speed, wherein the first graphic quality is higher than the second graphic quality (Paragraph 25 teaches that rapid motions of the head and eyes above a threshold can reduce the resolution and level of detail. This teaches presenting the contents with the second graphic quality when the player character moves faster than the predetermined speed. A user wears a head mounted device (HMD), and the application running on the HMD monitors the user’s movements; because the user is monitored by the HMD, the user can be considered a player character in the contents of the application, whose movement above the threshold reduces the resolution, i.e., the graphic quality. The first graphic quality is the quality applied before the user exceeded the threshold; it must have had a higher level of detail and resolution, since the second graphic quality is produced by lowering the resolution from that previous state).
8. Regarding claim 4, Hicks teaches the limitations of claim 2. Hicks further teaches the method, further comprising: adopting a first rendering method for processing image data associated with the contents of the APP with a first frame rate when determining that the player character in the contents of the APP is not moving faster than the predetermined speed; and adopting a second rendering method for processing the image data associated with the contents of the APP with a second frame rate when determining that the player character in the contents of the APP is moving faster than the predetermined speed, wherein the first frame rate is higher than the second frame rate (Paragraph 25 teaches that rapid motions of the head and eyes above a threshold can reduce the visual quality and frame rate. This teaches the second rendering method, in which the frame rate is reduced when the user moves faster than a predetermined speed. A user wears a head mounted device (HMD), and the application running on the HMD monitors the user’s movements; because the user is monitored by the HMD, the user can be considered a player character in the contents of the application. The first rendering method is the method applied before the user exceeded the threshold; it must have a higher frame rate, since the second rendering method lowers the frame rate from that previous state).
9. Regarding claim 5, Hicks teaches the limitations of claim 2. Hicks further teaches the method, further comprising: adopting a first rendering method for processing image data associated with the contents of the APP with a first amount of computational resources when determining that the player character in the contents of the APP is not moving faster than the predetermined speed; and adopting a second rendering method for processing the image data associated with the contents of the APP with a second amount of computational resources when determining that the player character in the contents of the APP is moving faster than the predetermined speed, wherein the first amount of computational resources is larger than the second amount of computational resources (Paragraph 25 teaches that rapid motions of the head and eyes above a threshold can reduce the visual quality and frame rate; this is the second rendering method. Paragraph 48 teaches that reducing the frame rate reduces the computational resources used. A user wears a head mounted device (HMD), and the application running on the HMD monitors the user’s movements; because the user is monitored by the HMD, the user can be considered a player character in the contents of the application. The first rendering method is the method applied before the user exceeded the threshold and must have a higher frame rate, since the second rendering method lowers the frame rate from that previous state. Thus, the first rendering method uses more computational resources, as Paragraph 48 teaches that an increased frame rate has a high processing demand).
10. Regarding claim 12, Hicks teaches an electronic device with adaptive control of rendered contents, comprising: a memory device configured to store an application (APP) (Paragraph 74 and Figure 10 teach a memory 8, 9, and 12 which can store the application); a screen configured to present contents of the APP (Paragraphs 25-26 teach a scene, or application contents, rendered in front of the user’s eyes through a head mounted display; Paragraph 74 and Figure 10 teach a display 18); and a processor configured to: monitor a current content status of the APP on a real-time basis; provide an adjustable graphic quality of the contents of the APP which is adaptively adjusted based on the current content status of the APP; and instruct the screen to present the contents of the APP with the adjustable graphic quality (Paragraph 25 teaches that rapid motions of the user’s head and eyes above a threshold can reduce the visual quality and frame rate. The content status of the application running on the head mounted device can include the user’s movements. Because the user can be considered a player in the application, the user’s movements are part of the content status of the application. Thus, the content status, i.e., the user’s movements, is monitored and the contents of the APP are adaptively adjusted).
11. Regarding claim 13, Hicks teaches the limitations of claim 12. Hicks further teaches the electronic device wherein the processor is further configured to: monitor the current content status of the APP for determining whether a player character in the contents of the APP is moving faster than a predetermined speed; instruct the screen to present the contents of the APP with a first graphic quality when determining that the player character in the contents of the APP is not moving faster than the predetermined speed; and instruct the screen to present the contents of the APP with a second graphic quality when determining that the player character in the contents of the APP is moving faster than the predetermined speed, wherein the first graphic quality is higher than the second graphic quality (Paragraph 25 teaches that rapid motions of the head and eyes above a threshold can reduce the resolution and level of detail. This teaches presenting the contents with the second graphic quality when the player character moves faster than the predetermined speed. A user wears a head mounted device (HMD), and the application running on the HMD monitors the user’s movements; because the user is monitored by the HMD, the user can be considered a player character in the contents of the application, whose movement above the threshold reduces the resolution, i.e., the graphic quality. The first graphic quality is the quality applied before the user exceeded the threshold; it must have had a higher level of detail and resolution, since the second graphic quality is produced by lowering the resolution from that previous state).
12. Regarding claim 15, Hicks teaches the limitations of claim 13. Claim 15 is similar in scope to claim 4; therefore, the rationale applied in the rejection of claim 4 applies herein.
13. Regarding claim 16, Hicks teaches the limitations of claim 13. Claim 16 is similar in scope to claim 5; therefore, the rationale applied in the rejection of claim 5 applies herein.
Claim Rejections - 35 USC § 103
14. The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
15. Claims 3 and 14 are rejected under 35 U.S.C. 103 as being unpatentable over Hicks et al. (U.S. Patent Application Publication No. 2018/0075820 A1), hereinafter referred to as Hicks, as applied to claims 2 and 13 above, and further in view of Ogasawara (U.S. Patent Application Publication No. 2020/0051207 A1).
16. Regarding claim 3, Hicks teaches the limitations of claim 2. However, Hicks fails to teach the method, further comprising: adopting a first rendering method for processing image data associated with the contents of the APP with a first power state when determining that the player character in the contents of the APP is not moving faster than the predetermined speed; and adopting a second rendering method for processing the image data associated with the contents of the APP with a second power state when determining that the player character in the contents of the APP is moving faster than the predetermined speed, wherein the first power state is higher than the second power state.
Ogasawara teaches the method, further comprising: adopting a first rendering method for processing image data associated with the contents of the APP with a first power state when determining that the player character in the contents of the APP is not moving faster than the predetermined speed; and adopting a second rendering method for processing the image data associated with the contents of the APP with a second power state when determining that the player character in the contents of the APP is moving faster than the predetermined speed, wherein the first power state is higher than the second power state (Paragraph 40 teaches that power consumption is reduced when the resolution is reduced, and the resolution is reduced when the motion of the head is fast. The user is the player character in this application. When the player moves faster than the threshold, the resolution drops and the power consumption is reduced; this second power state is lower than the power state in effect while the player was below the threshold, which is the first power state. Thus, the first power state is higher than the second power state).
Hicks and Ogasawara are considered analogous to the claimed invention because both are in the same field of adaptively adjusting the graphic quality of the images displayed based on speed. Thus, it would have been obvious to a person having ordinary skill in the art before the effective filing date to modify the method of adaptively adjusting the graphic quality taught by Hicks with the power state changing depending on rendering method taught by Ogasawara in order to solve the problem of increased power consumption from high-definition and high-frame rate data being displayed, which would typically result in requiring larger batteries (Ogasawara Paragraphs 3-4).
17. Regarding claim 14, Hicks teaches the limitations of claim 13. Claim 14 is similar in scope to claim 3; therefore, the rationale applied in the rejection of claim 3 applies herein.
18. Claims 6-7, 9, 17-18, and 20 are rejected under 35 U.S.C. 103 as being unpatentable over Hicks et al. (U.S. Patent Application Publication No. 2018/0075820 A1), hereinafter referred to as Hicks, as applied to claims 1 and 12 above, and further in view of Hui et al. (U.S. Patent Application Publication No. 2016/0247310 A1), hereinafter referred to as Hui.
19. Regarding claim 6, Hicks teaches the limitations of claim 1. However, Hicks fails to teach the method further comprising: monitoring the current content of the APP for determining whether a user of the electronic device is issuing a touch command; presenting the contents of the APP with a first graphic quality when determining that the user of the electronic device is not issuing the touch command; and presenting the contents of the APP with a second graphic quality when determining that the user of the electronic device is issuing the touch command, wherein the first graphic quality is higher than the second graphic quality.
Hui teaches the method further comprising: monitoring the current content of the APP for determining whether a user of the electronic device is issuing a touch command (Paragraph 27 teaches detecting fast scrolling and rendering in a high quality (HQ) tile or low quality (LQ) tile. Paragraph 34 teaches the user input can come from a touch pad. Thus, the fast scrolling can come from the touch pad input and be considered a touch command); presenting the contents of the APP with a first graphic quality when determining that the user of the electronic device is not issuing the touch command; and presenting the contents of the APP with a second graphic quality when determining that the user of the electronic device is issuing the touch command, wherein the first graphic quality is higher than the second graphic quality (Paragraph 27 teaches that during fast scrolling, the resolution is lowered; this can be considered the second graphic quality when the fast scrolling touch command is issued. When the fast scrolling touch command is not issued, a first and higher resolution graphic quality is rendered. Thus, when the scrolling touch command is issued, the second graphic quality is lower than the first graphic quality).
Hicks and Hui are considered analogous to the claimed invention because both are in the same field of adaptively adjusting the graphics quality. Thus, it would have been obvious to a person having ordinary skill in the art before the effective filing date to modify the method of adaptively adjusting the graphic quality taught by Hicks with the touch affecting the graphics quality taught by Hui in order to reduce blank areas that may not render in time and improve user experience (Hui Paragraph 27).
20. Regarding claim 7, Hicks in view of Hui teaches the limitations of claim 6. However, Hicks fails to teach the method, further comprising: adopting a first rendering method for processing image data associated with the contents of the APP with a first power state when determining that the user of the electronic device is not issuing the touch command; and adopting a second rendering method for processing the image data associated with the contents of the APP with a second power state when determining that the user of the electronic device is issuing the touch command, wherein the first power state is higher than the second power state.
Hui teaches the method, further comprising: adopting a first rendering method for processing image data associated with the contents of the APP with a first power state when determining that the user of the electronic device is not issuing the touch command; and adopting a second rendering method for processing the image data associated with the contents of the APP with a second power state when determining that the user of the electronic device is issuing the touch command, wherein the first power state is higher than the second power state (Paragraph 27 teaches that during fast scrolling, the resolution is lowered into a lower quality (LQ) tile; this teaches the second rendering method. When the fast scrolling touch command is not issued, a first and higher resolution graphic quality, a high quality (HQ) tile, is rendered; this teaches the first rendering method. Paragraph 28 teaches that HQ tiles have high power and memory bandwidth while LQ tiles have lower power and memory bandwidth. Rendering the HQ tiles can be considered a first power state that is higher than rendering the LQ tiles, which is a lower power state. Paragraph 34 teaches that user input can come from a touch pad; thus, the fast scrolling can come from a touch input and be considered a touch command).
Hicks and Hui are considered analogous to the claimed invention because both are in the same field of adaptively adjusting the graphics quality. Thus, it would have been obvious to a person having ordinary skill in the art before the effective filing date to modify the method of adaptively adjusting the graphic quality taught by Hicks with the touch affecting the power state taught by Hui in order to reduce blank areas that may not render in time and improve user experience (Hui Paragraph 27).
21. Regarding claim 9, Hicks in view of Hui teaches the limitations of claim 6. However, Hicks fails to teach the method further comprising: adopting a first rendering method for processing image data associated with the contents of the APP with a first amount of computational resources when determining that the user of the electronic device is not issuing the touch command; and adopting a second rendering method for processing the image data associated with the contents of the APP with a second amount of computational resources when determining that the user of the electronic device is issuing the touch command, wherein the first amount of computational resources is larger than the second amount of computational resources.
Hui teaches the method further comprising: adopting a first rendering method for processing image data associated with the contents of the APP with a first amount of computational resources when determining that the user of the electronic device is not issuing the touch command; and adopting a second rendering method for processing the image data associated with the contents of the APP with a second amount of computational resources when determining that the user of the electronic device is issuing the touch command, wherein the first amount of computational resources is larger than the second amount of computational resources (Paragraph 27 teaches that during fast scrolling, the resolution is lowered into a lower quality (LQ) tile, and that when the fast scrolling touch command is not issued, a first and higher resolution graphic quality, a high quality (HQ) tile, is rendered. Paragraph 28 teaches that HQ tiles have high power and memory bandwidth while LQ tiles have lower power and memory bandwidth; the memory bandwidth and power usage teach computational resources. Rendering the HQ tiles can be considered a first amount of computational resources that is larger than the second amount of computational resources used to render the LQ tiles. Paragraph 34 teaches that user input can come from a touch pad; thus, the fast scrolling can come from a touch input and be considered a touch command).
Hicks and Hui are considered analogous to the claimed invention because both are in the same field of adaptively adjusting the graphics quality. Thus, it would have been obvious to a person having ordinary skill in the art before the effective filing date to modify the method of adaptively adjusting the graphic quality taught by Hicks with the touch affecting the amount of computational resources taught by Hui in order to reduce blank areas that may not render in time and improve user experience (Hui Paragraph 27).
22. Regarding claim 17, Hicks teaches the limitations of claim 12. Claim 17 is similar in scope to claim 6; therefore, the rationale applied in the rejection of claim 6 applies herein.
23. Regarding claim 18, Hicks in view of Hui teaches the limitations of claim 17. Claim 18 is similar in scope to claim 7; therefore, the rationale applied in the rejection of claim 7 applies herein.
24. Regarding claim 20, Hicks in view of Hui teaches the limitations of claim 17. Claim 20 is similar in scope to claim 9; therefore, the rationale applied in the rejection of claim 9 applies herein.
25. Claims 8 and 19 are rejected under 35 U.S.C. 103 as being unpatentable over Hicks et al. (U.S. Patent Application Publication No. 2018/0075820 A1), hereinafter referred to as Hicks, in view of Hui et al. (U.S. Patent Application Publication No. 2016/0247310 A1), hereinafter referred to as Hui, as applied to claims 6 and 17 above, and further in view of Yu et al. (“Sensing Human-Screen Interaction for Energy-Efficient Frame Rate Adaptation on Smartphones”), hereinafter referred to as Yu.
26. Regarding claim 8, Hicks in view of Hui teaches the limitations of claim 6. However, Hicks fails to teach the method further comprising: adopting a first rendering method for processing image data associated with the contents of the APP with a first frame rate when determining that the user of the electronic device is not issuing the touch command; and adopting a second rendering method for processing the image data associated with the contents of the APP with a second frame rate when determining that the user of the electronic device is issuing the touch command, wherein the first frame rate is higher than the second frame rate.
Yu teaches the method further comprising: adopting a first rendering method for processing image data associated with the contents of the APP with a first frame rate when determining that the user of the electronic device is not issuing the touch command; and adopting a second rendering method for processing the image data associated with the contents of the APP with a second frame rate when determining that the user of the electronic device is issuing the touch command, wherein the first frame rate is higher than the second frame rate (Section 4.2 Paragraph 5 teaches that when the user is scrolling faster than a certain level, the frame rate is fixed to a low value. The fast scrolling speed teaches the touch command issued by the user, and the frame rate reduced to a low value during fast scrolling teaches the second rendering method. The Applicant does not define the touch command, so the touch command can be interpreted as a specific touch command such as scrolling past a certain speed. When the user is not issuing the touch command of scrolling at a higher speed, the frame rate is not fixed at the low value and instead follows a logarithmic model that increases the frame rate depending on scroll speed, as explained in Section 4.2 Paragraph 3. Thus, the first frame rate is higher than the second frame rate).
Hicks, Hui, and Yu are considered analogous to the claimed invention because all three are in the same field of adaptively adjusting the content displayed. Thus, it would have been obvious to a person having ordinary skill in the art before the effective filing date to modify the method of adaptively adjusting the graphic quality taught by Hicks in view of Hui with the lowering of frame rate taught by Yu in order to account for the fact that users do not pay attention to contents displayed on screen when scrolling quickly (Yu Figure 11a) and to achieve more energy savings (Yu Section 4.2 Paragraph 2).
27. Regarding claim 19, Hicks in view of Hui teaches the limitations of claim 17. Claim 19 is similar in scope to claim 8; therefore, the rationale applied in the rejection of claim 8 applies herein.
28. Claims 10-11 are rejected under 35 U.S.C. 103 as being unpatentable over Hicks et al. (U.S. Patent Application Publication No. 2018/0075820 A1), hereinafter referred to as Hicks, as applied to claims 1 and 12 above, and further in view of Digital Foundry (“Tech Focus – Dynamic Resolution Scaling: A Great Fit For PC Gaming?”, https://www.youtube.com/watch?v=180nuQJccTA), hereinafter referred to as Digital Foundry.
29. Regarding claim 10, Hicks teaches the limitations of claim 1. However, Hicks fails to teach the method further comprising: requesting a user statement from the APP regarding a permission to adaptively controlling rendered contents; and presenting the contents of the APP with the adjustable graphic quality which is adaptively adjusted based on the current content of the APP when the user statement corresponds to the permission to adaptively controlling rendered contents.
Digital Foundry teaches the method further comprising: requesting a user statement from the APP regarding a permission to adaptively controlling rendered contents; and presenting the contents of the APP with the adjustable graphic quality which is adaptively adjusted based on the current content of the APP when the user statement corresponds to the permission to adaptively controlling rendered contents (the 1:25 screenshot teaches that the user can make a statement of ‘YES’ or ‘NO’ to enable or disable the dynamic resolution functionality in a game. This gives the APP permission on whether to adaptively control the rendered contents and adjust the resolution based on the current content or gameplay of the APP).
Hicks and Digital Foundry are considered analogous to the claimed invention because both are in the same field of adjustable graphic quality in an application. Thus, it would have been obvious to a person having ordinary skill in the art before the effective filing date to modify the method of adaptively adjusting the graphic quality taught by Hicks with the user statement corresponding to permissions taught by Digital Foundry in order to reduce rendering times and “save performance … with various levels of load” (Digital Foundry 2:00-2:15; see also the 2:00 and 2:08 screenshots).
30. Regarding claim 11, Hicks in view of Digital Foundry teaches the limitations of claim 10. However, Hicks fails to teach the method further comprising: presenting the contents of the APP with a fixed graphic quality when the user statement does not correspond to the permission to adaptively controlling rendered contents.
Digital Foundry teaches the method further comprising: presenting the contents of the APP with a fixed graphic quality when the user statement does not correspond to the permission to adaptively controlling rendered contents (the 1:25 screenshot teaches that the user can make a statement of ‘YES’ or ‘NO’ to enable or disable the dynamic resolution functionality in a game, giving the APP permission on whether to adaptively control the rendered contents and adjust the resolution based on the current content or gameplay of the APP. If the user sets it to ‘NO’, the graphic quality will not be changed dynamically and will be fixed; the 9:23 screenshot teaches that the resolution stays at 1080p when dynamic resolution is turned off for a game).
Hicks and Digital Foundry are considered analogous to the claimed invention because both are in the same field of adjustable graphic quality in an application. Thus, it would have been obvious to a person having ordinary skill in the art before the effective filing date to modify the method of adaptively adjusting the graphic quality taught by Hicks with the user statement corresponding to permissions taught by Digital Foundry in order to give users an option to reduce rendering times and “save performance … with various levels of load” (Digital Foundry 2:00-2:15; see also the 2:00 and 2:09 screenshots).
Conclusion
31. The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
Sakaniwa et al. (U.S. Patent Application Publication No. 2011/0032419 A1) teaches calculating a frame rate depending on a detected motion vector in a picture.
Honda et al. (U.S. Patent Application Publication No. 2002/0080881 A1) teaches reducing the frame rate when the image motion is fast.
Shan et al. (U.S. Patent Application Publication No. 2024/0221694 A1) teaches reducing the screen refresh rate when the touch operation is not in a target area.
32. Any inquiry concerning this communication or earlier communications from the examiner should be directed to CHRISTINE Y AHN whose telephone number is (571)272-0672. The examiner can normally be reached Monday-Friday, 8am-5pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Alicia Harrington, can be reached at (571)272-2330. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/CHRISTINE YERA AHN/Examiner, Art Unit 2615
/ALICIA M HARRINGTON/Supervisory Patent Examiner, Art Unit 2615