The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Detailed Action
Current Status of Claims
This action is in response to the communication of September 15, 2025. By the amendment of September 15, 2025, the Applicant amended claims 1, 3-4, 9, 16-17, and 20. Claims 2 and 5 were canceled. Therefore, claims 1, 3-4, and 6-20 are currently pending in the application.
Response to Arguments
Applicant’s arguments with respect to claims 1, 3-4, and 6-20 have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Claim Objections
Claims 16 and 17 were previously objected to. This objection is withdrawn in view of the amendment of September 15, 2025. Claims 7 and 15 are now objected to for using the term “artificial intelligent,” whereas claims 1 and 3-4 use the term “artificial intelligence.” The Applicant is requested to provide the reason for the difference in terms that seemingly address the same structure.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claim 6 is rejected under 35 U.S.C. 103 as being unpatentable over Reeves et al. (US Patent Application Publication 2020/0110566 A1) in view of Nakamura et al. (US Patent Application Publication 2020/0241604 A1).
Regarding claim 6, Reeves et al. disclose a portable interactive device, comprising: a first portion comprising a first screen; a second portion comprising a second screen (See at least Figures 1A-1C of Reeves et al., illustrating a portable interactive device (100) comprising a first screen (110) and a second screen (114) connected by a spine (128), as discussed in paragraphs [0080-0081] of Reeves et al.).
However, Reeves et al. do not specifically show a portable interactive device with a plurality of spines operatively coupled to the first and second portions, wherein the plurality of spines are configured to provide for movement of the first and second portions such that the first and second portions may be positioned with the first screen and the second screen facing each other in a closed configuration; and wherein the first and second screens are configured to display information.
In the same field of endeavor, Nakamura et al. disclose an interactive device (10) comprising two screens (14A, 14B) operatively coupled by a plurality of spines (16) and capable of being positioned with the first screen and the second screen facing each other in a closed configuration, as shown in Figures 1-2 and discussed in paragraphs [0029-0034] of Nakamura et al.
Therefore, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to use the plurality of spines taught by Nakamura et al. with the apparatus of Reeves et al. in order to avoid a large gap between the screens of the two displays in the open position.
Claims 7-9, 11-13, 15-17, and 19-20 are rejected under 35 U.S.C. 103 as being unpatentable over Reeves et al. (US Patent Application Publication 2020/0110566 A1) in view of Nakamura et al. (US Patent Application Publication 2020/0241604 A1) and further in view of Kothary (US Patent Application Publication 2020/0117187 A1).
Regarding claim 7, Reeves et al. and Nakamura et al. further disclose the portable interactive device of claim 6, further comprising: a central processing unit (CPU); a memory unit operatively coupled to the CPU; an interface module to provide communication between the portable interactive device and at least one external electronic device; a display control module to provide control of display of information on the first and second screens (See Figure 2 of Reeves et al., illustrating CPU (204), memory (208), interface (252), and display control (216a), as also discussed in paragraphs [0094, 0097-0098] of Reeves et al.).
However, the combination of Reeves et al. and Nakamura et al. does not specifically disclose an artificial intelligent (AI) module capable of performing at least one of: machine learning; deep learning; natural language processing; and computer vision; and an audio control module to provide control of an audio output signal.
In the same field of endeavor, Kothary discloses, as shown in Figures 7-8, an interactive device with a CPU (80), memory (84), an interface module (86), a display (90), and an AI module (60) for machine learning (70), as discussed in paragraphs [0071-0078] of Kothary.
Therefore, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to use the AI module taught by Kothary with the apparatus of Reeves et al. and Nakamura et al. in order to store and use input patterns, thereby providing more convenience for a user.
Regarding claim 8, Reeves et al., Nakamura et al., and Kothary further disclose the portable interactive device of claim 7, wherein the interface module comprises at least one of: a wireless interface configured to provide wireless communications between the portable interactive device and at least one external electronic device; a wi-fi interface configured to provide wireless communications between the portable interactive device and at least one external electronic device; or a Bluetooth interface configured to provide wireless communications between the portable interactive device and at least one external electronic device (See at least Figure 2 of Reeves et al., illustrating the apparatus (100) having a wireless communication module (232), such as Wi-Fi or Bluetooth, providing connection with an external electronic device, as discussed in paragraph [0097] of Reeves et al.).
Regarding claim 9, Reeves et al., Nakamura et al., and Kothary further disclose the portable interactive device of claim 7, wherein the AI module comprises at least one of: an AI processing unit capable of performing at least one of: machine learning; deep learning; natural language processing; and computer vision; a usage detection unit capable of monitoring activities of the apparatus and provide usage-type data to the AI processing unit; a data scraping unit capable of acquiring data relating to user inputs based on information; an autocorrelation unit capable of storing based on one or more correlation factors; and an output unit capable of providing an output from the AI processing unit in at least one of a visual format, an audio format, an audiovisual format, a text format, a graphic format, and a combination of two or more therein (See Figures 6-7 of Kothary, illustrating usage of the AI module (60) for machine learning (70), as discussed in paragraphs [0071-0078] of Kothary).
Regarding claim 11, Reeves et al., Nakamura et al., and Kothary further disclose the portable interactive device of claim 6, wherein the first screen is configured to perform a note-taking task based on input from a user and substantially simultaneously operate independently from the second screen (See at least paragraph [0029] of Nakamura et al., discussing usage of a notebook application).
Regarding claim 12, Reeves et al., Nakamura et al., and Kothary further disclose the portable interactive device of claim 11, wherein the second screen is capable of performing one or more tasks independent of the note-taking tasks (See at least paragraph [0029] of Nakamura et al., discussing usage of a notebook application).
Regarding claim 13, Reeves et al., Nakamura et al., and Kothary further disclose the portable interactive device of claim 6, wherein the plurality of spines comprise mechanical means to allow swiveling of the first and second portions about the spine (See at least Figure 1C of Reeves et al., illustrating the spine (128), which could be formed with the hinges or locking mechanisms discussed in paragraph [0090] of Reeves et al.).
Regarding claim 15, Reeves et al., Nakamura et al., and Kothary further disclose a portable interactive device, comprising: a first foldable portion comprising a first interactive display; a second foldable portion comprising a second interactive display; an artificial intelligent (AI) module capable of performing at least one of: machine learning; deep learning; natural language processing; and computer vision; a first spine and a second spine, wherein the first and second spines are operatively coupled to the first and second interactive displays, wherein the first and second spines are configured to provide for opening and closing movements of the first and second foldable portions such that the first and second portions may be positioned with the first screen and the second screen facing each other in a closed configuration; and wherein the first and second screens are configured to display information and interactive communications (See the rejections of claims 6-7 provided above).
Regarding claim 16, Reeves et al., Nakamura et al., and Kothary further disclose the portable interactive device of claim 15, further comprising: a central processing unit (CPU); a memory unit operatively coupled to the CPU; an interface module to provide communication between the portable interactive device and at least one external electronic device; a display control module to provide control of display of information on the first and second screens; and an audio control module to provide control of an audio output signal (See Figure 2 of Reeves et al., illustrating the apparatus (100) further comprising CPU (204), memory (208), interface module (248), a display control module (216a, 216b) controlling the first and second displays (110, 114), and an audio control module (244) to provide control of an audio output signal, as discussed in paragraph [0100] of Reeves et al.).
Regarding claim 17, Reeves et al., Nakamura et al., and Kothary further disclose the portable interactive device of claim 16, wherein the interface module comprises at least one of: a wireless interface configured to provide wireless communications between the portable interactive device and at least one external electronic device; a wi-fi interface configured to provide wireless communications between the portable interactive device and at least one external electronic device; or a Bluetooth interface configured to provide wireless communications between the portable interactive device and at least one external electronic device (See at least Figure 2 of Reeves et al., illustrating the apparatus (100) having a wireless communication module (232), such as Wi-Fi or Bluetooth, providing connection with an external electronic device, as discussed in paragraph [0097] of Reeves et al.).
Regarding claim 19, Reeves et al., Nakamura et al., and Kothary further disclose the portable interactive device of claim 16, wherein the portable interactive device is an electronic reader (eReader) (See at least paragraph [0002] of Reeves et al., discussing usage of the device in eReaders).
Regarding claim 20, Reeves et al., Nakamura et al., and Kothary further disclose the portable interactive device of claim 15, wherein the AI module comprises at least one of: an AI processing unit capable of performing at least one of: machine learning; deep learning; natural language processing; and computer vision; a usage detection unit capable of monitoring activities of the apparatus and provide usage-type data to the AI processing unit; a data scraping unit capable of acquiring data relating to user inputs based on information; an autocorrelation unit capable of storing based on one or more correlation factors; and an output unit capable of providing an output from the AI processing unit in at least one of a visual format, an audio format, an audiovisual format, a text format, a graphic format, and a combination of two or more therein (See at least Figures 6-7 of Kothary, illustrating the AI module (60) capable of autocorrection, as discussed in paragraphs [0071-0078] of Kothary).
Claims 10, 14, and 18 are rejected under 35 U.S.C. 103 as being unpatentable over Reeves et al. (US Patent Application Publication 2020/0110566 A1) in view of Nakamura et al. (US Patent Application Publication 2020/0241604 A1), further in view of Kothary (US Patent Application Publication 2020/0117187 A1), and further in view of Kim et al. (US Patent Application Publication 2015/0317120 A1).
Regarding claim 10, Reeves et al., Nakamura et al., and Kothary further disclose the portable interactive device of claim 7.
However, the combination of Reeves et al., Nakamura et al., and Kothary does not specifically show the apparatus, wherein the interface module comprises at least one of: a touch user interface configured to receive the user input, wherein the user input is a touch input from a user of the apparatus; and a haptics link interface configured to receive user input from a user of the apparatus based on at least one of: a touch input from the user; a vibration; or a motion of the user.
In the same field of endeavor, Kim et al. disclose an apparatus (100) with several display/touch screens which can provide a tactile reaction to a user (852) when they are touched (840A, 840E, 840I), as shown in Figures 1A(A) and 8 and discussed in paragraphs [0161, 0201] of Kim et al.
Therefore, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to use the haptic response taught by Kim et al. with the apparatus of Reeves et al., Nakamura et al., and Kothary in order to provide sensory feedback information from user inputs based on movements, pressure, or touch.
Regarding claim 14, Reeves et al., Nakamura et al., Kothary, and Kim et al. further disclose the portable interactive device of claim 6, wherein the spines comprise at least one electronic circuit adapted to perform an operation of the portable interactive device (See at least Figures 4A and 4B of Kim et al., illustrating circuits (445, 450) adapted to perform an operation of wave guiding, as discussed in paragraphs [0118-0119] of Kim et al.).
Regarding claim 18, Reeves et al., Nakamura et al., Kothary, and Kim et al. further disclose the portable interactive device of claim 17, wherein the interface module comprises at least one of: a touch user interface configured to receive a touch input from a user of the portable interactive device; and a haptics link interface configured to receive an input from a user of the portable interactive device based on at least one of: a touch input from the user; a vibration; or a motion of the user (See Kim et al., disclosing an apparatus (100) with several display/touch screens which can provide a tactile reaction to a user (852) when they are touched (840A, 840E, 840I), as shown in Figures 1A(A) and 8 and discussed in paragraphs [0161, 0201] of Kim et al.).
Allowable Subject Matter
Claims 1 and 3-4 are allowed.
Conclusion
Contact Information
Any inquiry concerning this communication or earlier communications from the examiner should be directed to Olga V. Merkoulova, whose telephone number is (571) 270-7796. The examiner can normally be reached Monday-Friday from 7:30 am to 5:00 pm.
If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, LunYi Lao, can be reached at (571) 272-7671. The fax phone number for the organization where this application or proceeding is assigned is 703-872-9306. Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free).
/OLGA V MERKOULOVA/Primary Examiner, Art Unit 2621