DETAILED ACTION
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
This is in reply to a Request for Continued Examination filed on January 30, 2026 regarding Application No. 18/438,589. Applicants amended claims 1, 4-7, 12, and 18. Claims 1-20 are pending.
Continued Examination Under 37 CFR 1.114
A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicants’ submission filed on January 30, 2026 has been entered.
Response to Arguments
Applicants’ amendments to claim 18 are acknowledged. However, the objection to claim 18 is maintained, as discussed below.
Applicants’ arguments* filed on January 30, 2026 have been fully considered but, without conceding them, are moot in view of the new grounds of rejection set forth in this Office action.
* The arguments regarding amendments to independent claims 1, 12, and 18 (Remarks – General, page 1, last paragraph to page 2, first paragraph) are not applicable to the January 30, 2026 RCE amendments; they appear to refer to the December 1, 2025 After Final amendments, which were not entered as indicated in the December 18, 2025 Advisory Action.
Information Disclosure Statement
The information disclosure statements (IDS) submitted on December 1, 2025 and February 2, 2026 (the same references are listed in both IDSes) are in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statements are being considered by the Office.
Claim Objections
Claims 1-11 and 18-20 are objected to for the reasons discussed below.
Claim 1: “preferred human sense identified by as a function of the dominant sensory profile” (4th-5th to last line) should be changed to “preferred human sense identified by [[as ]]a function of the dominant sensory profile” or “preferred human sense identified [[by ]]as a function of the dominant sensory profile”.
Claim 18: “the a user sensory preference reaction score” (5th to the last line) should be changed to “the [[a ]]user sensory preference reaction score” since the term was previously recited.
Claims 2-11 and 19-20: these claims depend from an objected-to base claim.
Appropriate correction is required.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102 of this title, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries set forth in Graham v. John Deere Co., 383 U.S. 1, 148 USPQ 459 (1966), that are applied for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or non-obviousness.
This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicants are advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.
Claims 1-2, 6-8, 12, and 18-19 are rejected under 35 U.S.C. 103 as being unpatentable over Kratz et al. in US 2024/0027394 A1 (hereinafter Kratz) in view of Yoshikawa et al. in US 2020/0184843 A1 (hereinafter Yoshikawa).
Regarding claim 1, Kratz teaches:
A method (as shown in FIG. 4) in an electronic device (100B in FIG. 3), the method comprising (Kratz: FIGs. 3-4 and “[0035] FIG. 4 is a flowchart 400 illustrating a method of sending and receiving a message including an olfactory sticker 204 between two smartphones 100 [100A and 100B].”):
receiving a user interface sensory element bundle (predetermined set of olfactory stickers each having a predetermined scent (a bundle of user sensory element configurations)) comprising at least a first user sensory element configuration (e.g., pineapple smell configuration) catering to a first human sense (smell) and a second user sensory element configuration (e.g., fresh pine smell configuration) catering to a second human sense (smell) (Kratz: FIGs. 1-4, “[0024] The smartphone 100 further includes a memory or storage system, for storing… data….”, “[0028] The smartphone 100 is configured to send messages to, and receive messages from, a remote smartphone device via the short range XCVRs 170 and the WWAN XCVRs 165….”, and “[0034]… [T]he user of the sender smartphone 100A may select which olfactory sticker 204 to send from a displayed set of predefined olfactory stickers 204 that correlate to predefined scents that are stored in the memory 110 of the sender smartphone 100A. The receiver smartphone 100B has access to an identical… set of predefined olfactory stickers 204 to easily preview what particular scent has been received without activating the olfactory sticker 204. For example, an image of a pine tree corresponds to a fresh pine scent for both the user and the sender of the smartphones. The sender smartphone 100A sends an olfactory sticker 204 representing a pine tree to the receiver smartphone 100B and the receiver smartphone 100B knows from the predefined olfactory stickers 204 that a fresh pine scent olfactory sticker 204 was sent without activating the olfactory sticker 204….” (emphasis added), see also [0030], [0032], and [0036]-[0038]);
selecting, by one or more processors (120 in FIG. 1), a user interface sensory element configuration (olfactory sticker smell configuration, e.g., pineapple smell or fresh pine smell configuration) from the user interface sensory element bundle having a human sense (smell) (Kratz: FIGs. 1-4, “[0018]… As illustrated [in FIG. 1], smartphone 100 includes a flash memory 110 that stores programming to be executed by a CPU 120 to perform all or a subset of the functions described herein….”, “[0032]… As illustrated [in FIG. 3], the sender has selected an object 310, a pineapple, within the view of the rear camera 160 to create a cloned image 312 (i.e., “segmented image”) of the object 310. The sender smartphone 100A includes a scent sensor or an image classification function to determine if information indicative of a scent can also be sent with the message. In the example of the pineapple image 312 being sent, information indicative of a scent of a pineapple is sent with the message in the form of the olfactory sticker 204. The sender smartphone 100A sends the cloned image 312 and the olfactory sticker 204 to the receiver smartphone 100B. The receiver smartphone 100B displays the cloned image 312 with the olfactory sticker 204 on the user interface 202B of the receiver smartphone 100B….”, and “[0034]… [T]he user of the sender smartphone 100A may select which olfactory sticker 204 to send from a displayed set of predefined olfactory stickers 204 that correlate to predefined scents that are stored in the memory 110 of the sender smartphone 100A. The receiver smartphone 100B has access to an identical… set of predefined olfactory stickers 204 to easily preview what particular scent has been received without activating the olfactory sticker 204. For example, an image of a pine tree corresponds to a fresh pine scent for both the user and the sender of the smartphones. The sender smartphone 100A sends an olfactory sticker 204 representing a pine tree to the receiver smartphone 100B and the receiver smartphone 100B knows from the predefined olfactory stickers 204 that a fresh pine scent olfactory sticker 204 was sent without activating the olfactory sticker 204….”, see also [0022]-[0023] and [0030]); and
presenting, by the one or more processors on a user interface (202B in FIG. 3) of the electronic device, the user interface sensory element configuration selected from the user interface sensory element bundle (Kratz: FIGs. 1-4, “[0018]… As illustrated [in FIG. 1], smartphone 100 includes a flash memory 110 that stores programming to be executed by a CPU 120 to perform all or a subset of the functions described herein….”, [0032] (pineapple smell configuration), [0034] (fresh pine smell configuration), “[0037] At block 404 [in FIG. 4], the receiver smartphone 100B receives the message forwarded from the sender smartphone 100A via respective short range XCVRs 170 or WWAN XCVRs 165. The message contains the olfactory sticker 204 with the encoded olfactory information. The receiver smartphone 100B displays the olfactory sticker 204 on the display 145B via the receiver smartphone 100B user interface 202B.”, and “[0038] At block 406, the user of the receiver smartphone 100B activates the olfactory sticker 204 by interacting with the displayed olfactory sticker 204. For example, the user taps or rubs the displayed olfactory sticker 204….”, see also [0022]-[0023] and [0030]).
However, it is noted that Kratz does not teach:
said receiving, by a communication device from a remote electronic device across a network, the user interface sensory element bundle,
but which would have been obvious to include, since it would have been within the level of ordinary skill in the art to select features on the basis of their suitability for the intended use of updating a user interface sensory element bundle (e.g., updating a predetermined set of olfactory stickers by adding new or different stickers, deleting stickers, and/or changing sticker olfactory formulations) by pushing updated software/features to remote devices across a network.
However, it is noted that Kratz as modified does not teach:
wherein the first human sense and the second human sense are different human senses;
identifying, by one or more sensors of the electronic device, a user using the electronic device;
determining, by one or more processors of the electronic device, a dominant sensory profile associated with the user of the electronic device;
said selecting, by the one or more processors, the user interface sensory element configuration from the user interface sensory element bundle having a preferred human sense identified by as a function of the dominant sensory profile associated with the user of the electronic device.
Yoshikawa teaches:
a first human sense (one of the visual, auditory, and tactile senses, e.g., the visual sense) and a second human sense (another of the visual, auditory, and tactile senses, e.g., the auditory sense) are different human senses (Yoshikawa: FIG. 2 and “[0035] Here, it is said that the way of recognizing the information differs for each user depending on the user's dominant sense. In the present disclosure, differences in information recognition depending on the user's dominant sense are represented by classifying them into sense types. Here, three sense types are set, for example, visual sense type in which visual sense works dominantly, auditory sense type in which auditory sense works dominantly, and tactile sense type in which tactual sense works dominantly. How to receive information, how to communicate it, how to store it, and so on are different depending on each sense type, and in one example, these differences appear in the body's movement.”, see also FIGs. 7, 12, and 14-16 and “[0004]… [A] human being senses external environments using five senses, namely, visual, auditory, tactile, gustatory, and olfactory senses. It is known that a human being has each sense superior to others among these senses….”);
identifying, by one or more sensors (111, 113, and 115 in FIG. 3) of an electronic device (e.g., tablet terminal), a user using the electronic device (Yoshikawa: identifying in sensing a user and/or in updating user model information; see FIGs. 3 and 7, “[0043]… Examples of the sensors 110 include a biometric sensor 111 for acquiring biometric information of the user, a speech acquisition device 113 for acquiring speech, and an imaging device 115 for capturing an image of the user.” (emphasis added), “[0050]… the model setting unit 120 evaluates the estimated user's reaction and sets the user model. The user model represents the user's dominant sense, and in one example, is represented as the user's sense type. The presentation of each information item based on the user model that is set for each user makes it possible to appropriately stimulate different dominant senses for each user, thereby allowing each user to easily accept information. The model setting unit 120 stores the set user model in the user model storage unit 140…. In addition, the model setting unit 120 can update the user model on the basis of the reaction to the stimulus given to the user.” (emphasis added), “[0054] The user model storage unit 140 is a storage unit that stores a user model representing the user's reaction to the stimulus. In the present embodiment, the user model storage unit 140 has user's sense type information 141 and a stimulus reaction history table 143. The sense type information 141 stores the sense type of each user estimated by the model setting unit 120 as a user model. In the sense type information 141, in one example, the proportion of each of visual, auditory, and tactile sense types is included for each user, as illustrated in FIG. 7. Some people have one sense that works dominantly…. Thus, as illustrated in FIG. 7, the representation of the extent to which each sense type dominantly works as the proportion makes it possible to set a stimulus more suitable for each user. The sense type information 141 is a sense type estimated on the basis of the information stored in the stimulus reaction history table 143.” (emphasis added), and “[0075]… [T]he content including the impression object is displayed on a tablet terminal….” (emphasis added), see also FIGs. 1-2, 9-10, 12, and 14-17, [0033]-[0036], [0038], [0079], [0084], [0088], and [0098]);
determining, by one or more processors (901 in FIG. 18) of the electronic device, a dominant sensory profile (visual, auditory, or tactile dominant sensory profile) associated with the user of the electronic device (Yoshikawa: FIGs. 3, 7, and 18, “[0050]… the model setting unit 120 evaluates the estimated user's reaction and sets the user model. The user model represents the user's dominant sense, and in one example, is represented as the user's sense type….” (emphasis added), “[0105]… FIG. 18 is a hardware configuration diagram illustrating the hardware configuration of a functional unit that constitutes the information presentation system 100…. FIG. 18 is described as an example of the hardware configuration of an information processing apparatus 900 including the respective functional units illustrated in FIG. 3….”, “[0106]… The information processing apparatus 900 includes a central processing unit (CPU) 901….” (emphasis added), and “[0107] The CPU 901 functions as an arithmetic processing unit and a control unit, and controls the overall operation in the information processing apparatus 900 in accordance with various programs….”, see also FIGs. 1-2, 9, 12, and 14-17, [0035]-[0036], [0038], [0054], [0079], and [0098]);
selecting, by the one or more processors, a user interface sensory element configuration (visual, auditory, or tactile configuration) from a user interface sensory element bundle (visual, auditory, and tactile sense bundle) having a preferred human sense (visual, auditory, or tactile preferred sense) identified by as a function of the dominant sensory profile associated with the user of the electronic device (Yoshikawa: FIGs. 3, 14, and 18, “[0089]… [I]n FIG. 14,… [w]hen the user instructs the terminal 3 for guidance to the annex, the presentation processing unit 150 of the information presentation system 100 refers to the user model storage unit 140 to acquire the sense type of the user. Then, the guide information is presented depending on the sense type of the user. In one example, a guide object 31, which indicates visually the position of the annex displayed on the display 30, is highlighted and displayed to the visual sense type user. In one example, the guide object 31 is displayed on the display 30 to be smaller than the case where the guide object 31 is presented to the visual sense type user, but the position of the annex is notified by sound to the auditory sense type user. Furthermore, in one example, the guide object 31 is displayed on the display 30 to be smaller as in the case of the auditory sense type user, but the position of the annex can be notified to the tactile sense type user by vibrating the terminal 3 in the case where the annex is displayed in the display 30.” (emphasis added), “[0105]… FIG. 18 is a hardware configuration diagram illustrating the hardware configuration of a functional unit that constitutes the information presentation system 100…. FIG. 18 is described as an example of the hardware configuration of an information processing apparatus 900 including the respective functional units illustrated in FIG. 3….”, “[0106]… The information processing apparatus 900 includes a central processing unit (CPU) 901….”, and “[0107] The CPU 901 functions as an arithmetic processing unit and a control unit, and controls the overall operation in the information processing apparatus 900 in accordance with various programs….”, see also FIGs. 2, 7, 10-13, and 15-16, [0038], [0054], and [0093]-[0094]); and
presenting, by the one or more processors on a user interface (of 3 in FIG. 14, or 906-907 in FIG. 18) of the electronic device, the user interface sensory element configuration selected from the user interface sensory element bundle (Yoshikawa: see FIGs. 3, 14, and 18, “[0075]… The presentation unit 170 includes… a display unit that displays information, a sound output unit that outputs sound, a vibration generation unit that vibrates equipment, and the like….”, “[0089]… [I]n FIG. 14,… [w]hen the user instructs the terminal 3 for guidance to the annex, the presentation processing unit 150 of the information presentation system 100 refers to the user model storage unit 140 to acquire the sense type of the user. Then, the guide information is presented depending on the sense type of the user. In one example, a guide object 31, which indicates visually the position of the annex displayed on the display 30, is highlighted and displayed to the visual sense type user. In one example, the guide object 31 is displayed on the display 30 to be smaller than the case where the guide object 31 is presented to the visual sense type user, but the position of the annex is notified by sound to the auditory sense type user. Furthermore, in one example, the guide object 31 is displayed on the display 30 to be smaller as in the case of the auditory sense type user, but the position of the annex can be notified to the tactile sense type user by vibrating the terminal 3 in the case where the annex is displayed in the display 30.” (emphasis added), “[0105]… FIG. 18 is a hardware configuration diagram illustrating the hardware configuration of a functional unit that constitutes the information presentation system 100…. FIG. 18 is described as an example of the hardware configuration of an information processing apparatus 900 including the respective functional units illustrated in FIG. 3….”, “[0106]… The information processing apparatus 900 includes a central processing unit (CPU) 901….”, “[0107] The CPU 901 functions as an arithmetic processing unit and a control unit, and controls the overall operation in the information processing apparatus 900 in accordance with various programs….”, and “[0109] [In FIG. 18,] [t]he input device 906 includes an input means that allows the user to input information…. The output device 907 includes… a display device… and an audio output device….” (emphasis added), see also FIGs. 2, 10-13, and 15-16, [0038], and [0093]-[0094]).
Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to include: the features taught by Yoshikawa, such that Kratz as modified teaches: receiving, by a communication device from a remote electronic device across a network, a user interface sensory element bundle comprising at least a first user sensory element configuration catering to a first human sense and a second user sensory element configuration catering to a second human sense, wherein the first human sense and the second human sense are different human senses (receiving, communication device, remote electronic device, network, user interface sensory element bundle, first and second user sensory element configurations, and first and second human senses of Kratz as modified combined with the first and second human senses of Yoshikawa); identifying, by one or more sensors of the electronic device, a user using the electronic device (electronic device of Kratz as modified combined with identifying, one or more sensors, electronic device, and user of Yoshikawa); determining, by one or more processors of the electronic device, a dominant sensory profile associated with the user of the electronic device (one or more processors and electronic device of Kratz as modified combined with determining, one or more processors, electronic device, dominant sensory profile, and user of Yoshikawa); selecting, by the one or more processors, a user interface sensory element configuration from the user interface sensory element bundle having a preferred human sense identified by as a function of the dominant sensory profile associated with the user of the electronic device (selecting, one or more processors, user interface sensory element configuration, user interface sensory bundle, human sense, and electronic device of Kratz as modified combined with selecting, one or more processors, user interface sensory element configuration, user interface sensory element bundle, preferred human sense, and function of the dominant sensory profile associated as claimed of Yoshikawa); and presenting, by the one or more processors on a user interface of the electronic device, the user interface sensory element configuration selected from the user interface sensory element bundle (presenting, one or more processors, user interface, electronic device, user interface sensory element configuration, and user interface sensory element bundle of Kratz as modified combined with presenting, one or more processors, user interface, electronic device, user interface sensory element configuration, and user interface sensory element bundle of Yoshikawa), to “present[] information in a presentation way that allows each user to easily accept information.” (Yoshikawa: [0005]).
Regarding claim 2, Kratz as modified by Yoshikawa teaches:
The method of claim 1, wherein the determining the dominant sensory profile associated with the user of the electronic device comprises retrieving, by the one or more processors, a user sensory preference reaction score (visual, auditory, and tactile sense type proportion user sensory preference reaction score in FIG. 7 of Yoshikawa) from a user profile stored in a memory (140 in FIG. 3 of Yoshikawa) of the electronic device (Yoshikawa: FIGs. 3, 7, 9, and 18, “[0054] The user model storage unit 140 is a storage unit that stores a user model representing the user's reaction to the stimulus…. [T]he user model storage unit 140 has user's sense type information 141 and a stimulus reaction history table 143. The sense type information 141 stores the sense type of each user estimated by the model setting unit 120 as a user model. In the sense type information 141, in one example, the proportion of each of visual, auditory, and tactile sense types is included for each user, as illustrated in FIG. 7. Some people have one sense that works dominantly…. Thus, as illustrated in FIG. 7, the representation of the extent to which each sense type dominantly works as the proportion makes it possible to set a stimulus more suitable for each user. The sense type information 141 is a sense type estimated on the basis of the information stored in the stimulus reaction history table 143.” (emphasis added), “[0073]… The presentation processing unit 150 sets a stimulus for the impression object included in the content stored in the content storage unit 160 on the basis of the information stored in the user model storage unit 140 and outputs it to the presentation unit 170. In one example, the presentation processing unit 150 can determine stimulus information to be set on the basis of the sense type information 141 in referring to the user model storage unit 140. In addition, the presentation processing unit 150 can, in one example, refer to the stimulus reaction history table 143, extract history information relating to the current stimulus from the previous stimulus content and the history of the reaction, and check the reaction to the stimulus and its effect on the basis of the extracted information. Then, the presentation processing unit 150 can determine stimulus information to be set currently….” (emphasis added), “[0105]… FIG. 18 is a hardware configuration diagram illustrating the hardware configuration of a functional unit that constitutes the information presentation system 100…. FIG. 18 is described as an example of the hardware configuration of an information processing apparatus 900 including the respective functional units illustrated in FIG. 3….”, “[0106]… The information processing apparatus 900 includes a central processing unit (CPU) 901….”, and “[0107] The CPU 901 functions as an arithmetic processing unit and a control unit, and controls the overall operation in the information processing apparatus 900 in accordance with various programs….”, see also [0055], [0078]-[0079], and [0085]).
Regarding claim 6, Kratz as modified by Yoshikawa teaches:
The method of claim 1, wherein the user interface sensory element configuration selected from the user interface sensory element bundle comprises one or more of:
user input controls (e.g., activation input controls in Kratz);
navigational elements; and/or
containers (Kratz: FIGs. 2-4 and “[0031] The olfactory sticker 204 may also include text instructions 208 indicating how to engage and activate the olfactory sticker 204 to release the scent via the olfactory transducer 200. In the example shown in FIG. 2, the text instruction 208 is “Rub Me!” to indicate to the user that the olfactory sticker 204 displayed on user interface 202 must be rubbed to release the scent represented by the olfactory sticker 204. Other text, such as, “Tap Me!” may be used to indicate to the user that the olfactory sticker 204 must be tapped to release the scent represented by the olfactory sticker 204. The user interface 202 is configured to detect the interaction between the user and the olfactory sticker 204. The intensity of tapping or rubbing the displayed olfactory sticker 204 regulates the amount of scent released by an olfactory transducer 200 so that the user is not overwhelmed with a received scent. For example, a singular tap of the displayed olfactory sticker 204 translates to a small, predetermined amount of scent being released from the olfactory transducer 200, whereas two taps translate to twice the amount of scent being released as compared to the singular tap. In the example of rubbing the olfactory sticker 204, rubbing for 2 seconds translates to a small, predetermined amount of scent to be released, whereas 5 seconds of rubbing translates to twice the amount of scent being released as compared to the 2 seconds of rubbing….”, see also [0038]).
Regarding claim 7, Kratz as modified by Yoshikawa teaches:
The method of claim 6, wherein:
the user interface sensory element bundle comprises a plurality of user interface sensory element configurations, each comprising informational components comprising text (Kratz: FIG. 2 and “[0030]… The olfactory sticker 204 [in FIG. 2] graphically represents the encoded olfactory information to the user of the smartphone 100. The olfactory sticker 204 comprises a visual preview 206 of the scent conveyed. In an example, the visual preview 206 can be an icon, symbol, cartoon, picture, emoji, Bitmoji®, text, or similar visual information. In the example shown in FIG. 2, the visual preview 206 is a cartoon pineapple used to represent a scent of a pineapple that has been sent to the smartphone 100….” (emphasis added); claims 1 and 6 above); and
the text of each user interface sensory element configuration has one or both of adjectives and/or adverbs in the text that differs from each other user interface sensory element configuration of the plurality of user interface sensory element configurations, with those one or both of the adjectives and/or the adverbs enhancing a characteristic associated with at least one user interface element and diminishing another characteristic associated with at least one other user interface element (Kratz: see FIGs. 2-3, [0030] (olfactory sticker with a particular scent; text visual preview), [0032] (pineapple scent sticker), [0034] (select an olfactory sticker with a predefined scent from a set of olfactory stickers each with predefined scents; fresh pine scent sticker); claims 1 and 6 above; it would have been obvious to include the claimed features since it would have been within the level of ordinary skill in the art to select features on the basis of their suitability for the intended use of providing information about a user interface sensory element configuration using descriptive text - e.g., “pineapple scent” or “fresh pine scent” text).
Regarding claim 8, Kratz as modified by Yoshikawa teaches:
The method of claim 7, wherein the one or both of the adjectives and/or the adverbs are encoded into metadata associated with each user interface sensory element configuration (Kratz: “[0030]… The user interface 202 is configured to send and receive the olfactory sticker 204 encoded with olfactory information corresponding to a particular scent. The encoded olfactory information is configured to activate the olfactory transducer 200 to release the corresponding scent when the user engages the olfactory sticker 204. The olfactory sticker 204 graphically represents the encoded olfactory information to the user of the smartphone 100….”; claims 1 and 6-7 above; it would have been obvious to include the claimed features since it would have been within the level of ordinary skill in the art to select features on the basis of their suitability for the intended use of providing descriptive text information about a user interface sensory element configuration in metadata).
Regarding claim 12, Kratz teaches:
An electronic device (100B in FIG. 3), comprising (Kratz: FIG. 3 and “[0032]… [A] receiver smartphone 100B….”, see also FIGs. 1 and 4):
a user interface (202B in FIG. 3) (Kratz: FIG. 3 and “[0032]… [A] receiver smartphone 100B having a display 145B including user interface 202B….”);
a communication device (165 in FIG. 1) (Kratz: FIG. 1 and “[0028] The smartphone 100 is configured to send messages to, and receive messages from, a remote smartphone device via… the WWAN XCVRs 165….”, see also [0037]); and
one or more processors (120) operable with the user interface and the communication device (Kratz: FIG. 1 and “[0018]… As illustrated [in FIG. 1], smartphone 100 includes a flash memory 110 that stores programming to be executed by a CPU 120 to perform all or a subset of the functions described herein…. ”, see also [0022]-[0023]);
wherein the one or more processors are configured to select a user interface sensory element configuration (smell configuration) having informational components catering to a human sense (smell) from a plurality of user interface sensory element configurations (smell configurations) each catering to a human sense (smell) contained in a user interface sensory element bundle (predetermined set of olfactory stickers each having a predetermined scent (a bundle of user interface sensory element configurations)) and, thereafter, cause the user interface to present the user interface sensory element configuration selected from the plurality of user interface sensory element configurations contained in the user interface sensory element bundle (Kratz: FIGs. 2-4, “[0018]… As illustrated [in FIG. 1], smartphone 100 includes a flash memory 110 that stores programming to be executed by a CPU 120 to perform all or a subset of the functions described herein…. ”, “[0024] The smartphone 100 further includes a memory or storage system, for storing… data….”, “[0028] The smartphone 100 is configured to send messages to, and receive messages from, a remote smartphone device via the short range XCVRs 170 and the WWAN XCVRs 165….”, “[0032]… As illustrated [in FIG. 3], the sender has selected an object 310, a pineapple, within the view of the rear camera 160 to create a cloned image 312 (i.e., “segmented image”) of the object 310. The sender smartphone 100A includes a scent sensor or an image classification function to determine if information indicative of a scent can also be sent with the message. In the example of the pineapple image 312 being sent, information indicative of a scent of a pineapple is sent with the message in the form of the olfactory sticker 204. The sender smartphone 100A sends the cloned image 312 and the olfactory sticker 204 to the receiver smartphone 100B. The receiver smartphone 100B displays the cloned image 312 with the olfactory sticker 204 on the user interface 202B of the receiver smartphone 100B….”, and “[0034]… [T]he user of the sender smartphone 100A may select which olfactory sticker 204 to send from a displayed set of predefined olfactory stickers 204 that correlate to predefined scents that are stored in the memory 110 of the sender smartphone 100A. The receiver smartphone 100B has access to an identical… set of predefined olfactory stickers 204 to easily preview what particular scent has been received without activating the olfactory sticker 204. For example, an image of a pine tree corresponds to a fresh pine scent for both the user and the sender of the smartphones. The sender smartphone 100A sends an olfactory sticker 204 representing a pine tree to the receiver smartphone 100B and the receiver smartphone 100B knows from the predefined olfactory stickers 204 that a fresh pine scent olfactory sticker 204 was sent without activating the olfactory sticker 204….” (emphasis added), see also [0022]-[0023], [0030], and [0036]-[0038]).
However, it is noted that Kratz does not teach:
the user interface sensory element bundle received by the communication device from a remote electronic device across a network,
but which would have been obvious to include, since it would have been within the level of ordinary skill in the art to select features on the basis of their suitability for the intended use of updating a user interface sensory element bundle (e.g., updating a predetermined set of olfactory stickers by adding new or different stickers, deleting stickers, and/or changing sticker olfactory formulations) by pushing updated software/features to remote devices across a network.
However, it is noted that Kratz as modified does not teach:
the informational components catering to a preferred human sense identified by a dominant sensory profile associated with an authorized user of the electronic device,
the plurality of user interface sensory element configurations each catering to a different human sense contained in the user interface sensory element bundle.
Yoshikawa teaches:
wherein one or more processors (901 in FIG. 18) are configured to select a user interface sensory element configuration (visual, auditory, or tactile configuration) having informational components catering to a preferred human sense (visual, auditory, or tactile preferred sense) identified by a dominant sensory profile (visual, auditory, or tactile dominant sensory profile) associated with an authorized user (the user using the electronic device being the authorized user) of an electronic device (3 in FIG. 14) from a plurality of user interface sensory element configurations (visual, auditory, and tactile configurations) each catering to a different human sense (visual, auditory, or tactile) contained in a user interface sensory element bundle (visual, auditory, and tactile sense bundle) (Yoshikawa: FIGs. 3, 14, and 18, “[0089]… [I]n FIG. 14,… [w]hen the user instructs the terminal 3 for guidance to the annex, the presentation processing unit 150 of the information presentation system 100 refers to the user model storage unit 140 to acquire the sense type of the user. Then, the guide information is presented depending on the sense type of the user. In one example, a guide object 31, which indicates visually the position of the annex displayed on the display 30, is highlighted and displayed to the visual sense type user. In one example, the guide object 31 is displayed on the display 30 to be smaller than the case where the guide object 31 is presented to the visual sense type user, but the position of the annex is notified by sound to the auditory sense type user. Furthermore, in one example, the guide object 31 is displayed on the display 30 to be smaller as in the case of the auditory sense type user, but the position of the annex can be notified to the tactile sense type user by vibrating the terminal 3 in the case where the annex is displayed in the display 30.” (emphasis added), “[0105]… FIG. 18 is a hardware configuration diagram illustrating the hardware configuration of a functional unit that constitutes the information presentation system 100…. FIG. 18 is described as an example of the hardware configuration of an information processing apparatus 900 including the respective functional units illustrated in FIG. 3….”, “[0106]… The information processing apparatus 900 includes a central processing unit (CPU) 901….”, and “[0107] The CPU 901 functions as an arithmetic processing unit and a control unit, and controls the overall operation in the information processing apparatus 900 in accordance with various programs….”, see also FIGs. 1-2, 7, 9-13, and 15-16, [0038], [0054], [0085], and [0093]-[0094]) and, thereafter, cause a user interface (of 3 in FIG. 14, or 906-907 in FIG. 18) to present the user interface sensory element configuration selected from the plurality of user interface sensory element configurations contained in the user interface sensory element bundle (Yoshikawa: FIGs. 3, 14, and 18, “[0075]… The presentation unit 170 includes… a display unit that displays information, a sound output unit that outputs sound, a vibration generation unit that vibrates equipment, and the like….”, “[0089]… [I]n FIG. 14,… [w]hen the user instructs the terminal 3 for guidance to the annex, the presentation processing unit 150 of the information presentation system 100 refers to the user model storage unit 140 to acquire the sense type of the user. Then, the guide information is presented depending on the sense type of the user. In one example, a guide object 31, which indicates visually the position of the annex displayed on the display 30, is highlighted and displayed to the visual sense type user. In one example, the guide object 31 is displayed on the display 30 to be smaller than the case where the guide object 31 is presented to the visual sense type user, but the position of the annex is notified by sound to the auditory sense type user. Furthermore, in one example, the guide object 31 is displayed on the display 30 to be smaller as in the case of the auditory sense type user, but the position of the annex can be notified to the tactile sense type user by vibrating the terminal 3 in the case where the annex is displayed in the display 30.” (emphasis added), “[0105]… FIG. 18 is a hardware configuration diagram illustrating the hardware configuration of a functional unit that constitutes the information presentation system 100…. FIG. 18 is described as an example of the hardware configuration of an information processing apparatus 900 including the respective functional units illustrated in FIG. 3….”, “[0106]… The information processing apparatus 900 includes a central processing unit (CPU) 901….”, “[0107] The CPU 901 functions as an arithmetic processing unit and a control unit, and controls the overall operation in the information processing apparatus 900 in accordance with various programs….”, and “[0109] [In FIG. 18,] [t]he input device 906 includes an input means that allows the user to input information…. The output device 907 includes… a display device… and an audio output device….” (emphasis added), see also FIGs. 2, 10-13, and 15-16, [0038], and [0093]-[0094]).
Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to include: the features taught by Yoshikawa, such that Kratz as modified teaches: wherein the one or more processors are configured to select a user interface sensory element configuration having informational components catering to a preferred human sense identified by a dominant sensory profile associated with an authorized user of the electronic device from a plurality of user interface sensory element configurations each catering to a different human sense contained in a user interface sensory element bundle received by the communication device from a remote electronic device across a network and, thereafter, cause the user interface to present the user interface sensory element configuration selected from the plurality of user interface sensory element configurations contained in the user interface sensory element bundle (one or more processors, select a user interface sensory element configuration, informational components, human sense, electronic device, plurality of user interface sensory element configurations, user interface sensory element bundle, communication device, remote electronic device, network, and user interface of Kratz as modified combined with one or more processors, select a user interface sensor element configuration, informational components, preferred human sense, dominant sensory profile, authorized user, electronic device, plurality of user interface sensory element configurations, different human sense, user interface sensory element bundle, and user interface of Yoshikawa), to “present[] information in a presentation way that allows each user to easily accept information.” (Yoshikawa: [0005]).
Regarding claim 18, Kratz teaches:
A method (as shown in FIG. 4) for an electronic device (100B in FIG. 3), the method comprising (Kratz: FIGs. 3-4 and “[0035] FIG. 4 is a flowchart 400 illustrating a method of sending and receiving a message including an olfactory sticker 204 between two smartphones 100 [100A and 100B].”):
receiving, a user interface sensory element bundle (predetermined set of olfactory stickers each having a predetermined scent (a bundle of user interface sensory element configurations)) comprising at least a first user interface sensory element configuration (e.g., pineapple smell configuration) enhancing a human sense (smell) and a second user interface sensory element configuration (e.g., fresh pine smell configuration) enhancing the human sense (Kratz: FIGs. 1-4, “[0024] The smartphone 100 further includes a memory or storage system, for storing… data….”, “[0028] The smartphone 100 is configured to send messages to, and receive messages from, a remote smartphone device via the short range XCVRs 170 and the WWAN XCVRs 165….”, and “[0034]… [T]he user of the sender smartphone 100A may select which olfactory sticker 204 to send from a displayed set of predefined olfactory stickers 204 that correlate to predefined scents that are stored in the memory 110 of the sender smartphone 100A. The receiver smartphone 100B has access to an identical… set of predefined olfactory stickers 204 to easily preview what particular scent has been received without activating the olfactory sticker 204. For example, an image of a pine tree corresponds to a fresh pine scent for both the user and the sender of the smartphones. The sender smartphone 100A sends an olfactory sticker 204 representing a pine tree to the receiver smartphone 100B and the receiver smartphone 100B knows from the predefined olfactory stickers 204 that a fresh pine scent olfactory sticker 204 was sent without activating the olfactory sticker 204….” (emphasis added), see also [0030], [0032], and [0036]-[0038]);
selecting, by one or more processors (120 in FIG. 1), a user interface sensory element configuration (olfactory sticker smell configuration) from the user interface sensory element bundle to enhance the human sense (Kratz: FIGs. 1-4, “[0018]… As illustrated [in FIG. 1], smartphone 100 includes a flash memory 110 that stores programming to be executed by a CPU 120 to perform all or a subset of the functions described herein…. ”, “[0032]… As illustrated [in FIG. 3], the sender has selected an object 310, a pineapple, within the view of the rear camera 160 to create a cloned image 312 (i.e., “segmented image”) of the object 310. The sender smartphone 100A includes a scent sensor or an image classification function to determine if information indicative of a scent can also be sent with the message. In the example of the pineapple image 312 being sent, information indicative of a scent of a pineapple is sent with the message in the form of the olfactory sticker 204. The sender smartphone 100A sends the cloned image 312 and the olfactory sticker 204 to the receiver smartphone 100B. The receiver smartphone 100B displays the cloned image 312 with the olfactory sticker 204 on the user interface 202B of the receiver smartphone 100B….”, and “[0034]… [T]he user of the sender smartphone 100A may select which olfactory sticker 204 to send from a displayed set of predefined olfactory stickers 204 that correlate to predefined scents that are stored in the memory 110 of the sender smartphone 100A. The receiver smartphone 100B has access to an identical, or a different, set of predefined olfactory stickers 204 to easily preview what particular scent has been received without activating the olfactory sticker 204. For example, an image of a pine tree corresponds to a fresh pine scent for both the user and the sender of the smartphones. The sender smartphone 100A sends an olfactory sticker 204 representing a pine tree to the receiver smartphone 100B and the receiver smartphone 100B knows from the predefined olfactory stickers 204 that a fresh pine scent olfactory sticker 204 was sent without activating the olfactory sticker 204….”, see also [0022]-[0023] and [0030]); and
presenting, by the one or more processors on a user interface (202B in FIG. 3), the user interface sensory element configuration selected from the user interface sensory element bundle (Kratz: FIGs. 3-4, “[0018]… As illustrated [in FIG. 1], smartphone 100 includes a flash memory 110 that stores programming to be executed by a CPU 120 to perform all or a subset of the functions described herein…. ”, [0032] (pineapple smell configuration; “… [A] receiver smartphone 100B having a display 145B including user interface 202B….”), [0034] (fresh pine smell configuration), and “[0037] At block 404 [in FIG. 4], the receiver smartphone 100B receives the message forwarded from the sender smartphone 100A via respective short range XCVRs 170 or WWAN XCVRs 165. The message contains the olfactory sticker 204 with the encoded olfactory information. The receiver smartphone 100B displays the olfactory sticker 204 on the display 145B via the receiver smartphone 100B user interface 202B.”, and “[0038] At block 406, the user of the receiver smartphone 100B activates the olfactory sticker 204 by interacting with the displayed olfactory sticker 204. For example, the user taps or rubs the displayed olfactory sticker 204….”, see also [0022]-[0023] and [0030]).
However, it is noted that Kratz does not teach:
thereafter, said receiving, by a communication device from a remote electronic device across a network, the user interface sensory element bundle,
but which would have been obvious to include, since it would have been within the level of ordinary skill in the art to select features on the basis of their suitability for the intended use of updating a user interface sensory element bundle (e.g., updating a predetermined set of olfactory stickers by adding new or different stickers, deleting stickers, and/or changing sticker olfactory formulations) by pushing updated software/features to remote devices across a network.
However, it is noted that Kratz as modified does not teach:
presenting, by one or more processors on a user interface, a plurality of user interface elements, wherein each user interface element of the plurality of user interface elements includes components catering to different sensory perceptions from other user interface elements of the plurality of user interface elements;
measuring, by one or more sensors, reactions of a user of the electronic device to the plurality of user interface elements;
determining, by the one or more processors from the reactions, a user sensory preference reaction score; and
said thereafter, receiving, by the communication device from the remote electronic device across the network, the user interface sensory element bundle comprising at least the first user interface sensory element configuration enhancing a first human sense and diminishing a second human sense that is different from the first human sense and the second user interface sensory element configuration enhancing the second human sense and diminishing the first human sense;
said selecting, by the one or more processors, the user interface sensory element configuration from the user interface sensory element bundle to enhance a preferred human sense selected from the first human sense or the second human sense and identified by the a user sensory preference reaction score.
Yoshikawa teaches:
presenting, by one or more processors (901 in FIG. 18) on a user interface (of 3 in FIG. 14, or 906-907 in FIG. 18), a plurality of user interface elements (visual, auditory, and tactile user interface elements), wherein each user interface element of the plurality of user interface elements includes components catering to different sensory perceptions (visual, auditory, and tactile sensory perceptions) from other user interface elements of the plurality of user interface elements (Yoshikawa: FIGs. 3, 14, and 18, “[0089]… [I]n FIG. 14,… [w]hen the user instructs the terminal 3 for guidance to the annex, the presentation processing unit 150 of the information presentation system 100 refers to the user model storage unit 140 to acquire the sense type of the user. Then, the guide information is presented depending on the sense type of the user. In one example, a guide object 31, which indicates visually the position of the annex displayed on the display 30, is highlighted and displayed to the visual sense type user. In one example, the guide object 31 is displayed on the display 30 to be smaller than the case where the guide object 31 is presented to the visual sense type user, but the position of the annex is notified by sound to the auditory sense type user. Furthermore, in one example, the guide object 31 is displayed on the display 30 to be smaller as in the case of the auditory sense type user, but the position of the annex can be notified to the tactile sense type user by vibrating the terminal 3 in the case where the annex is displayed in the display 30.” (emphasis added), “[0105]… FIG. 18 is a hardware configuration diagram illustrating the hardware configuration of a functional unit that constitutes the information presentation system 100…. FIG. 18 is described as an example of the hardware configuration of an information processing apparatus 900 including the respective functional units illustrated in FIG. 3….”, “[0106]… The information processing apparatus 900 includes a central processing unit (CPU) 901….”, “[0107] The CPU 901 functions as an arithmetic processing unit and a control unit, and controls the overall operation in the information processing apparatus 900 in accordance with various programs….”, and “[0109] [In FIG. 18,] [t]he input device 906 includes an input means that allows the user to input information…. The output device 907 includes… a display device… and an audio output device….” (emphasis added), see also FIGs. 1-2, 9-13, and 15-16);
measuring, by one or more sensors (111, 113, and 115 in FIG. 3), reactions of a user of an electronic device (e.g., tablet terminal) to the plurality of user interface elements (Yoshikawa: see FIG. 3, “[0043] The sensors 110 acquire user-related information that is used to analyze reaction of the user who receives presented information. Examples of the sensors 110 include a biometric sensor 111 for acquiring biometric information of the user, a speech acquisition device 113 for acquiring speech, and an imaging device 115 for capturing an image of the user.” (emphasis added), “[0047]… The detection values acquired by the sensors 110 are output to the model setting unit 120 as sensor information.”, and “[0075]… [T]he content including the impression object is displayed on a tablet terminal….” (emphasis added), see also FIGs. 9-10 and 14-17, [0038], [0079], [0084], [0088], and [0098]);
determining, by the one or more processors from the reactions, a user sensory preference reaction score (visual, auditory, and tactile sense type proportion user sensory preference reaction score in FIG. 7) (Yoshikawa: FIGs. 3-4, 7, 9, and 18, “[0047]… The detection values acquired by the sensors 110 are output to the model setting unit 120 as sensor information.”, “[0048] The model setting unit 120 sets a user model representing the user's reaction to the stimulus for each user on the basis of the sensor information. The model setting unit 120 first analyzes the sensor information and estimates the user's reaction to the stimulus presented to the user. In this event, the model setting unit 120 can estimate the user's reaction with reference to the reaction evaluation information storage unit 130….”, “[0049]… the model setting unit 120 can estimate the user's reaction from variation in the effector illustrated in FIG. 4 on the basis of the biometric information acquired by the biometric sensor 111…. In a case where there are a plurality of acquired sensor information items, the user's reaction can be estimated for each item of the sensor information. Then, the model setting unit 120 records contents of the stimulus presented to the user and the estimated user's reaction in the user model storage unit 140 as stimulus reaction history information.”, “[0054] The user model storage unit 140 is a storage unit that stores a user model representing the user's reaction to the stimulus…. [T]he user model storage unit 140 has user's sense type information 141 and a stimulus reaction history table 143. The sense type information 141 stores the sense type of each user estimated by the model setting unit 120 as a user model. In the sense type information 141, in one example, the proportion of each of visual, auditory, and tactile sense types is included for each user, as illustrated in FIG. 7. Some people have one sense that works dominantly…. Thus, as illustrated in FIG. 7, the representation of the extent to which each sense type dominantly works as the proportion makes it possible to set a stimulus more suitable for each user. The sense type information 141 is a sense type estimated on the basis of the information stored in the stimulus reaction history table 143.” (emphasis added), “[0105]… FIG. 18 is a hardware configuration diagram illustrating the hardware configuration of a functional unit that constitutes the information presentation system 100…. FIG. 18 is described as an example of the hardware configuration of an information processing apparatus 900 including the respective functional units illustrated in FIG. 3….”, “[0106]… The information processing apparatus 900 includes a central processing unit (CPU) 901….”, and “[0107] The CPU 901 functions as an arithmetic processing unit and a control unit, and controls the overall operation in the information processing apparatus 900 in accordance with various programs….”, see also [0078]-[0079], [0085], and [0098]); and
a user interface sensory element bundle (terminal 3 user interface visual, auditory, and tactile sensory element bundle) comprising at least a first user interface sensory element configuration (e.g., auditory configuration) enhancing a first human sense (e.g., auditory) and diminishing a second human sense (e.g., tactile) that is different from the first human sense and a second user interface sensory element configuration (e.g., tactile configuration) enhancing the second human sense and diminishing the first human sense (Yoshikawa: see FIGs. 3 and 14 and “[0089]… [I]n FIG. 14,… [w]hen the user instructs the terminal 3 for guidance to the annex, the presentation processing unit 150 of the information presentation system 100 refers to the user model storage unit 140 to acquire the sense type of the user. Then, the guide information is presented depending on the sense type of the user. In one example, a guide object 31, which indicates visually the position of the annex displayed on the display 30, is highlighted and displayed to the visual sense type user. In one example, the guide object 31 is displayed on the display 30 to be smaller than the case where the guide object 31 is presented to the visual sense type user, but the position of the annex is notified by sound to the auditory sense type user. Furthermore, in one example, the guide object 31 is displayed on the display 30 to be smaller as in the case of the auditory sense type user, but the position of the annex can be notified to the tactile sense type user by vibrating the terminal 3 in the case where the annex is displayed in the display 30.” (emphasis added), see also FIGs. 2, 7-8, 10-13, 15-16, and 18, [0038], [0054], [0093]-[0094], [0105], and [0112]);
selecting, by the one or more processors, a user interface sensory element configuration (terminal 3 user interface visual, auditory, or tactile sensory element configuration) from the user interface sensory element bundle to enhance a preferred human sense selected from the first human sense or the second human sense and identified by the a user sensory preference reaction score (Yoshikawa: FIGs. 3, 7, 14, and 18, “[0050]… the model setting unit 120 evaluates the estimated user's reaction and sets the user model. The user model represents the user's dominant sense, and in one example, is represented as the user's sense type…. The model setting unit 120 stores the set user model in the user model storage unit 140…. In addition, the model setting unit 120 can update the user model on the basis of the reaction to the stimulus given to the user.”, “[0054] The user model storage unit 140 is a storage unit that stores a user model representing the user's reaction to the stimulus…. [T]he user model storage unit 140 has user's sense type information 141 and a stimulus reaction history table 143. The sense type information 141 stores the sense type of each user estimated by the model setting unit 120 as a user model. In the sense type information 141, in one example, the proportion of each of visual, auditory, and tactile sense types is included for each user, as illustrated in FIG. 7. Some people have one sense that works dominantly…. Thus, as illustrated in FIG. 7, the representation of the extent to which each sense type dominantly works as the proportion makes it possible to set a stimulus more suitable for each user. The sense type information 141 is a sense type estimated on the basis of the information stored in the stimulus reaction history table 143.” (emphasis added), “[0073]… The presentation processing unit 150 sets a stimulus for the impression object included in the content stored in the content storage unit 160 on the basis of the information stored in the user model storage unit 140 and outputs it to the presentation unit 170. In one example, the presentation processing unit 150 can determine stimulus information to be set on the basis of the sense type information 141 in referring to the user model storage unit 140….”(emphasis added), “[0089]… [I]n FIG. 14,… [w]hen the user instructs the terminal 3 for guidance to the annex, the presentation processing unit 150 of the information presentation system 100 refers to the user model storage unit 140 to acquire the sense type of the user. Then, the guide information is presented depending on the sense type of the user. In one example, a guide object 31, which indicates visually the position of the annex displayed on the display 30, is highlighted and displayed to the visual sense type user. In one example, the guide object 31 is displayed on the display 30 to be smaller than the case where the guide object 31 is presented to the visual sense type user, but the position of the annex is notified by sound to the auditory sense type user. Furthermore, in one example, the guide object 31 is displayed on the display 30 to be smaller as in the case of the auditory sense type user, but the position of the annex can be notified to the tactile sense type user by vibrating the terminal 3 in the case where the annex is displayed in the display 30.” (emphasis added), “[0105]… FIG. 18 is a hardware configuration diagram illustrating the hardware configuration of a functional unit that constitutes the information presentation system 100…. FIG. 
18 is described as an example of the hardware configuration of an information processing apparatus 900 including the respective functional units illustrated in FIG. 3….”, “[0106]… The information processing apparatus 900 includes a central processing unit (CPU) 901….”, and “[0107] The CPU 901 functions as an arithmetic processing unit and a control unit, and controls the overall operation in the information processing apparatus 900 in accordance with various programs….”, see also FIGs. 2, 10-13, and 15-16, [0038], [0054], and [0093]-[0094]); and
presenting, by the one or more processors on the user interface, the user interface sensory element configuration selected from the user interface sensory element bundle (Yoshikawa: FIGs. 3, 14, and 18, “[0089]… [I]n FIG. 14,… [w]hen the user instructs the terminal 3 for guidance to the annex, the presentation processing unit 150 of the information presentation system 100 refers to the user model storage unit 140 to acquire the sense type of the user. Then, the guide information is presented depending on the sense type of the user. In one example, a guide object 31, which indicates visually the position of the annex displayed on the display 30, is highlighted and displayed to the visual sense type user. In one example, the guide object 31 is displayed on the display 30 to be smaller than the case where the guide object 31 is presented to the visual sense type user, but the position of the annex is notified by sound to the auditory sense type user. Furthermore, in one example, the guide object 31 is displayed on the display 30 to be smaller as in the case of the auditory sense type user, but the position of the annex can be notified to the tactile sense type user by vibrating the terminal 3 in the case where the annex is displayed in the display 30.” (emphasis added), “[0105]… FIG. 18 is a hardware configuration diagram illustrating the hardware configuration of a functional unit that constitutes the information presentation system 100…. FIG. 18 is described as an example of the hardware configuration of an information processing apparatus 900 including the respective functional units illustrated in FIG. 3….”, “[0106]… The information processing apparatus 900 includes a central processing unit (CPU) 901….”, and “[0107] The CPU 901 functions as an arithmetic processing unit and a control unit, and controls the overall operation in the information processing apparatus 900 in accordance with various programs….”, see also FIGs. 2, 10-13 and 15-16, [0038], and [0093]-[0094]).
Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to include: the features taught by Yoshikawa, such that Kratz as modified teaches: presenting, by one or more processors on a user interface, a plurality of user interface elements, wherein each user interface element of the plurality of user interface elements includes components catering to different sensory perceptions from other user interface elements of the plurality of user interface elements (one or more processors and user interface of Kratz as modified combined with presenting, one or more processors, user interface, plurality of user interface elements, each user interface element, components, different sensory perceptions, and other user interface elements of Yoshikawa); measuring, by one or more sensors, reactions of a user of the electronic device to the plurality of user interface elements (electronic device of Kratz as modified combined with measuring, one or more sensors, reactions, user, electronic device, and plurality of user interface elements of Yoshikawa); determining, by the one or more processors from the reactions, a user sensory preference reaction score (one or more processors of Kratz as modified combined with determining, one or more processors, reactions, and user sensory preference reaction score of Yoshikawa); and thereafter, receiving, by a communication device from a remote electronic device across a network, a user interface sensory element bundle comprising at least a first user interface sensory element configuration enhancing a first human sense and diminishing a second human sense that is different from the first human sense and a second user interface sensory element configuration enhancing the second human sense and diminishing the first human sense (receiving, communication device, remote electronic device, network, user interface sensory element bundle, first and second user interface sensory element configurations, and human sense of Kratz as modified combined with the user interface sensory element bundle, first and second user interface sensory element configurations, and first and second human senses of Yoshikawa); selecting, by the one or more processors, a user interface sensory element configuration from the user interface sensory element bundle to enhance a preferred human sense selected from the first human sense or the second human sense and identified by the a user sensory preference reaction score (selecting, one or more processors, user interface sensory element configuration, user interface sensory element bundle, and human sense of Kratz as modified combined with selecting, one or more processors, user interface sensory element configuration, user interface sensory element bundle, preferred human sense, first and second human senses, and user sensory preference reaction score of Yoshikawa); and presenting, by the one or more processors on the user interface, the user interface sensory element configuration selected from the user interface sensory element bundle (presenting, one or more processors, user interface, user interface sensory element configuration, and user interface sensory element bundle of Kratz as modified combined with presenting, one or more processors, user interface, user interface sensory element configuration, and user interface sensory element bundle of Yoshikawa), to “present[] information in a presentation way that allows each user to easily accept information.” (Yoshikawa: [0005]).
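For illustration of the mapped selection logic (Yoshikawa: FIGs. 7 and 14, [0054], [0073], and [0089]), a minimal, hypothetical Python sketch follows; the identifiers (sense_type_info, bundle, select_configuration) and the example proportions are assumptions for illustration, not Yoshikawa's implementation:

# Illustrative sketch only. Sense-type proportions (FIG. 7, [0054]) identify
# the preferred human sense; the matching configuration is selected from the
# user interface sensory element bundle (FIG. 14, [0073], [0089]).

# Per-user sense type information 141: proportions of the visual, auditory,
# and tactile sense types (hypothetical user and values).
sense_type_info = {
    "user_a": {"visual": 0.6, "auditory": 0.3, "tactile": 0.1},
}

# Bundle of configurations: each enhances one sense and diminishes the others
# (guide object 31 highlighted, notified by sound, or notified by vibration).
bundle = {
    "visual": {"guide_object": "highlighted", "sound": False, "vibration": False},
    "auditory": {"guide_object": "small", "sound": True, "vibration": False},
    "tactile": {"guide_object": "small", "sound": False, "vibration": True},
}

def select_configuration(user_id):
    """Select the configuration enhancing the user's dominant sense type."""
    proportions = sense_type_info[user_id]
    preferred_sense = max(proportions, key=proportions.get)
    return bundle[preferred_sense]

print(select_configuration("user_a"))  # the visual configuration for user_a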
Regarding claim 19, Kratz as modified by Yoshikawa teaches:
The method of claim 18, wherein the presenting the user interface sensory element configuration selected from the user interface sensory element bundle occurs only as long as the user is using the electronic device (Yoshikawa: e.g., where the user uses the electronic device only while content is played back and the user's sense type remains the same during playback, or where the user turns off the electronic device; see FIG. 9 and “[0078]… When the content presentation way is determined in step S120, the content is played back (S130), and the content is presented to the user….”, and “[0079]… The processing from step S120 to S160 is performed, in one example, until the playback of the content ends (S170).”).
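As a rough illustration of the cited FIG. 9 flow ([0078]-[0079]), a hypothetical sketch of presentation tied to continued device use follows; the in-use check and all names are assumptions:

# Illustrative sketch only: presentation per the selected configuration
# occurs only while the content is played back on the device in use
# (FIG. 9, steps S120-S170).
def play_back(content_frames, configuration, device_in_use):
    for frame in content_frames:  # S130: play back the content
        if not device_in_use():   # e.g., the user turns off the device
            break                 # presentation ceases (claim 19)
        print(f"presenting {frame} via the {configuration} configuration")

play_back(["scene 1", "scene 2"], "visual", lambda: True)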
Claims 13-16 are rejected under 35 U.S.C. 103 as being unpatentable over Kratz in view of Yoshikawa, and further in view of Hara et al., JP 2001-252265 A (hereinafter Hara; an original copy and a full machine translation thereof were provided with the May 9, 2025 Office action).
Regarding claim 13, Kratz as modified by Yoshikawa teaches:
The electronic device of claim 12, wherein the user interface sensory element configuration selected from the plurality of user interface sensory element configurations contained in the user interface sensory element bundle is enhanced as a function of a first combination of a visual appearance preferred by the authorized user of the electronic device, an olfactory appearance preferred by the authorized user of the electronic device, an aural appearance preferred by the authorized user of the electronic device, and a haptic appearance preferred by the authorized user of the electronic device and diminished as a second combination of the visual appearance preferred by the authorized user of the electronic device, the olfactory appearance preferred by the authorized user of the electronic device, the aural appearance preferred by the authorized user of the electronic device, and the haptic appearance preferred by the authorized user of the electronic device (Kratz: an olfactory appearance; FIGs. 1-4 and “[0030] FIG. 2 illustrates a user interface 202 forming the user input layer 140 on the display 145 of the smartphone 100 and displaying an olfactory sticker 204. The user interface 202 displays messages 210, such as chats, sent by a user of the smartphone 100 via the short range XCVRs 170 and the WWAN XCVRs 165 to a remote smartphone, and messages 212 received by the smartphone 100 and viewable by the user. The user interface 202 is configured to send and receive the olfactory sticker 204 encoded with olfactory information corresponding to a particular scent. The encoded olfactory information is configured to activate the olfactory transducer 200 to release the corresponding scent when the user engages the olfactory sticker 204. The olfactory sticker 204 graphically represents the encoded olfactory information to the user of the smartphone 100. The olfactory sticker 204 comprises a visual preview 206 of the scent conveyed. In an example, the visual preview 206 can be an icon, symbol, cartoon, picture, emoji, Bitmoji®, text, or similar visual information. In the example shown in FIG. 2, the visual preview 206 is a cartoon pineapple used to represent a scent of a pineapple that has been sent to the smartphone 100. A variety of images may be used as a visual preview 206 for the olfactory sticker 204. For example, a vanilla flower may be used to represent the scent of vanilla, a chocolate cake may be used to represent a scent of chocolate, and a strawberry may be used to represent a scent of a strawberry.”, see also [0032] and [0034]; Yoshikawa: e.g., a first combination of visual, aural, and haptic appearances corresponding to content creator selected arousal level 5 for impression object 23 preferred by the authorized user with an associated visual dominant sensory profile, and a second combination of visual, aural, and haptic appearances corresponding to content creator selected arousal level 4 for impression object 23 preferred by the authorized user with an associated visual dominant sensory profile; see FIGs. 10-12, “[0085]… In FIG. 12, the magnitude of movement of the impression object is set as the stimulus to the visual sense type user, the magnitude of the effect sound is set as a stimulus to the auditory sense type user, and the vibration time of the terminal 2 is set as a stimulus to the tactile sense type user. 
In this example, a user who uses the digital picture book has all characteristics of visual, auditory, and tactile sense types, but the visual sense type is more dominant than the other types, so the content is presented in such a manner that the movement of the impression object is emphasized more than other stimuli.”, and “[0086] The presentation processing unit 150 moves Wolf 23 of the arousal level 4 by 20% more than normal movement, outputs the sound effect with volume 10% higher than normal volume, vibrate the terminal 2 for three seconds, on the basis of the relationship between the stimulus information and the arousal level illustrated in FIG. 12…. In this manner, the content creator automatically sets the specific movement of the impression object only by setting of the impression object and the magnitude of the stimulus.”, see also “[0004]… [A] human being senses external environments using five senses, namely, visual, auditory, tactile, gustatory, and olfactory senses. It is known that a human being has each sense superior to others among these senses. In one example, in receiving information presentation, there is a case where information visually presented on a screen is easier to understand than that audibly presented by voice. Thus, how to present easy-to-understand information is different depending on the person. Such a difference is thought to be related to difference in the dominance of each person's five senses.”).
However, it is noted that Kratz as modified by Yoshikawa does not teach:
the first combination includes a gustatory appearance preferred by the authorized user of the electronic device, and the second combination includes the gustatory appearance preferred by the authorized user of the electronic device.
Hara teaches:
a gustatory appearance and the gustatory appearance (Hara: see FIG. 1 and “[[0061]] The stimulus generating means 3 transmits a stimulus corresponding to at least one of the five human senses, sight, hearing, smell, touch, and taste, to the user P by varying the stimulus in a time series manner…. Depending on the similarity of the fluctuation of the stimulus emitted from this stimulus generating means 3, the user P is induced to mental activity such as sleep induction, relaxation, normality, and activity.” (emphasis added)).
Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to include: the features taught by Hara, such that Kratz as modified teaches: wherein the user interface sensory element configuration selected from the plurality of user interface sensory element configurations contained in the user interface sensory element bundle is enhanced as a function of a first combination of a visual appearance preferred by the authorized user of the electronic device, an olfactory appearance preferred by the authorized user of the electronic device, an aural appearance preferred by the authorized user of the electronic device, a gustatory appearance preferred by the authorized user of the electronic device, and a haptic appearance preferred by the authorized user of the electronic device and diminished as a second combination of the visual appearance preferred by the authorized user of the electronic device, the olfactory appearance preferred by the authorized user of the electronic device, the aural appearance preferred by the authorized user of the electronic device, the gustatory appearance preferred by the authorized user of the electronic device, and the haptic appearance preferred by the authorized user of the electronic device (olfactory appearance of Kratz as modified combined with the user interface sensory element configuration and first and second combinations of Yoshikawa and gustatory appearances of Hara), to “chang[e] the mental activity state of the… [user].” (Hara: [[0009]]).
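The stimulus magnitudes quoted above from Yoshikawa [0086] (arousal level 4) can be restated as a short sketch; the baseline movement and volume values are assumed, while the 20%, 10%, and three-second figures come from the quoted paragraph:

# Illustrative sketch only: FIG. 12 stimulus settings for arousal level 4
# per [0086]. Baselines are arbitrary assumed units.
NORMAL_MOVEMENT = 100.0  # assumed baseline
NORMAL_VOLUME = 50.0     # assumed baseline

movement = NORMAL_MOVEMENT * 1.20  # 20% more than normal movement
volume = NORMAL_VOLUME * 1.10      # 10% higher than normal volume
vibration_seconds = 3.0            # vibrate terminal 2 for three seconds

print(movement, round(volume, 2), vibration_seconds)  # 120.0 55.0 3.0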
Regarding claim 14, Kratz as modified by Yoshikawa and Hara teaches:
The electronic device of claim 13, wherein the user interface sensory element configuration selected from the plurality of user interface sensory element configurations contained in the user interface sensory element bundle comprises text that is different from any other user interface sensory element configuration contained in the plurality of user interface sensory element configurations contained in the user interface sensory element bundle (Yoshikawa: highlighted “ANNEX” text in bottom left side of FIG. 14, and any other terminal 3 user interface sensory element configuration in the bottom middle and the bottom right side of FIG. 14; FIG. 14 and “[0089]… [I]n FIG. 14,… the guide information is presented depending on the sense type of the user. In one example, a guide object 31, which indicates visually the position of the annex displayed on the display 30, is highlighted and displayed to the visual sense type user. In one example, the guide object 31 is displayed on the display 30 to be smaller than the case where the guide object 31 is presented to the visual sense type user, but the position of the annex is notified by sound to the auditory sense type user. Furthermore, in one example, the guide object 31 is displayed on the display 30 to be smaller as in the case of the auditory sense type user, but the position of the annex can be notified to the tactile sense type user by vibrating the terminal 3 in the case where the annex is displayed in the display 30.”).
Regarding claim 15, Kratz as modified by Yoshikawa and Hara teaches:
The electronic device of claim 13, further comprising one or more sensors (111, 113, and 115 in FIG. 3 of Yoshikawa), wherein (Yoshikawa: see FIG. 3 and “[0043]… Examples of the sensors 110 include a biometric sensor 111 for acquiring biometric information of the user, a speech acquisition device 113 for acquiring speech, and an imaging device 115 for capturing an image of the user.” (emphasis added)):
the one or more sensors are configured to identify the authorized user of the electronic device when the authorized user is using the electronic device (Yoshikawa: identifying in sensing a user and/or in updating user model information; see FIGs. 3 and 7, “[0050]… the model setting unit 120 evaluates the estimated user's reaction and sets the user model. The user model represents the user's dominant sense, and in one example, is represented as the user's sense type. The presentation of each information item based on the user model that is set for each user makes it possible to appropriately stimulate different dominant senses for each user, thereby allowing each user to easily accept information. The model setting unit 120 stores the set user model in the user model storage unit 140…. In addition, the model setting unit 120 can update the user model on the basis of the reaction to the stimulus given to the user.” (emphasis added), and “[0054] The user model storage unit 140 is a storage unit that stores a user model representing the user's reaction to the stimulus. In the present embodiment, the user model storage unit 140 has user's sense type information 141 and a stimulus reaction history table 143. The sense type information 141 stores the sense type of each user estimated by the model setting unit 120 as a user model. In the sense type information 141, in one example, the proportion of each of visual, auditory, and tactile sense types is included for each user, as illustrated in FIG. 7. Some people have one sense that works dominantly…. Thus, as illustrated in FIG. 7, the representation of the extent to which each sense type dominantly works as the proportion makes it possible to set a stimulus more suitable for each user. The sense type information 141 is a sense type estimated on the basis of the information stored in the stimulus reaction history table 143.” (emphasis added), see also FIGs. 1, 9, and 17, [0033]-[0036], [0079], and [0098]); and
the dominant sensory profile associated with the authorized user is stored in a user profile of the authorized user of the electronic device (Yoshikawa: see FIG. 3, “[0050]… the model setting unit 120 evaluates the estimated user's reaction and sets the user model. The user model represents the user's dominant sense, and in one example, is represented as the user's sense type. The presentation of each information item based on the user model that is set for each user makes it possible to appropriately stimulate different dominant senses for each user, thereby allowing each user to easily accept information. The model setting unit 120 stores the set user model in the user model storage unit 140…. In addition, the model setting unit 120 can update the user model on the basis of the reaction to the stimulus given to the user.” (emphasis added), and “[0054] The user model storage unit 140 is a storage unit that stores a user model representing the user's reaction to the stimulus. In the present embodiment, the user model storage unit 140 has user's sense type information 141 and a stimulus reaction history table 143. The sense type information 141 stores the sense type of each user estimated by the model setting unit 120 as a user model. In the sense type information 141, in one example, the proportion of each of visual, auditory, and tactile sense types is included for each user, as illustrated in FIG. 7. Some people have one sense that works dominantly…. Thus, as illustrated in FIG. 7, the representation of the extent to which each sense type dominantly works as the proportion makes it possible to set a stimulus more suitable for each user. The sense type information 141 is a sense type estimated on the basis of the information stored in the stimulus reaction history table 143.” (emphasis added), see also FIGs. 9 and 17, [0079], and [0097]).
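The cited storage structure (user model storage unit 140 holding sense type information 141 and a stimulus reaction history table 143; Yoshikawa [0050] and [0054]) may be sketched hypothetically as follows; class and field names are assumptions:

# Illustrative sketch only: per-user model storage. The sense type (141)
# is estimated from the stimulus reaction history table (143) and can be
# updated as new reactions are recorded ([0049]-[0050], [0054]).
from dataclasses import dataclass, field

@dataclass
class UserModel:
    sense_type_info: dict = field(default_factory=dict)  # 141 (FIG. 7 proportions)
    stimulus_reaction_history: list = field(default_factory=list)  # 143

user_model_storage = {}  # user model storage unit 140, keyed by user

def record_reaction(user_id, stimulus, reaction):
    """Record a stimulus/reaction pair in the history table (143)."""
    model = user_model_storage.setdefault(user_id, UserModel())
    model.stimulus_reaction_history.append((stimulus, reaction))

record_reaction("user_a", "guide object 31 highlighted", "gaze fixated")
print(user_model_storage["user_a"].stimulus_reaction_history)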
Regarding claim 16, Kratz as modified by Yoshikawa and Hara teaches:
The electronic device of claim 15, wherein the one or more processors are configured to cease presenting the user interface sensory element configuration selected from the plurality of user interface sensory element configurations contained in the user interface sensory element bundle and select another user interface sensory element configuration from the plurality of user interface sensory element configurations contained in the user interface sensory element bundle for presentation on the user interface when the one or more sensors detect a user other than the authorized user using the electronic device (Yoshikawa: e.g., when another user associated with a dominant sensory profile different from that associated with the authorized user is detected by the one or more sensors, where the authorized user hands terminal 3 to the another user standing next to the authorized user; see FIGs. 3, 14, and 18, “[0048] The model setting unit 120 sets a user model representing the user's reaction to the stimulus for each user on the basis of the sensor information….”, “[0050]… the model setting unit 120 evaluates the estimated user's reaction and sets the user model. The user model represents the user's dominant sense, and in one example, is represented as the user's sense type. The presentation of each information item based on the user model that is set for each user makes it possible to appropriately stimulate different dominant senses for each user, thereby allowing each user to easily accept information. The model setting unit 120 stores the set user model in the user model storage unit 140…. In addition, the model setting unit 120 can update the user model on the basis of the reaction to the stimulus given to the user.”, “[0089]… [I]n FIG. 14,… [w]hen the user instructs the terminal 3 for guidance to the annex, the presentation processing unit 150 of the information presentation system 100 refers to the user model storage unit 140 to acquire the sense type of the user. Then, the guide information is presented depending on the sense type of the user. In one example, a guide object 31, which indicates visually the position of the annex displayed on the display 30, is highlighted and displayed to the visual sense type user. In one example, the guide object 31 is displayed on the display 30 to be smaller than the case where the guide object 31 is presented to the visual sense type user, but the position of the annex is notified by sound to the auditory sense type user. Furthermore, in one example, the guide object 31 is displayed on the display 30 to be smaller as in the case of the auditory sense type user, but the position of the annex can be notified to the tactile sense type user by vibrating the terminal 3 in the case where the annex is displayed in the display 30.”, “[0105]… FIG. 18 is a hardware configuration diagram illustrating the hardware configuration of a functional unit that constitutes the information presentation system 100…. FIG. 18 is described as an example of the hardware configuration of an information processing apparatus 900 including the respective functional units illustrated in FIG. 3….”, “[0106]… The information processing apparatus 900 includes a central processing unit (CPU) 901….”, and “[0107] The CPU 901 functions as an arithmetic processing unit and a control unit, and controls the overall operation in the information processing apparatus 900 in accordance with various programs….”, see also FIGs. 
1-2, 10-13, and 15-16, [0033]-[0036], [0038], [0081], and [0085]-[0086]).
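The hand-off scenario mapped above for claim 16 may likewise be sketched hypothetically; the detection input and helper names are assumptions:

# Illustrative sketch only: when the sensors detect a user other than the
# authorized user (e.g., terminal 3 is handed to a bystander), cease the
# current configuration and select one matching the detected user's
# dominant sensory profile.
def on_user_detected(detected_user, authorized_user, profiles, bundle):
    if detected_user != authorized_user:
        proportions = profiles[detected_user]
        dominant = max(proportions, key=proportions.get)
        return bundle[dominant]  # switch to the detected user's configuration
    return None                  # same user: keep the current configuration

profiles = {"user_b": {"visual": 0.2, "auditory": 0.7, "tactile": 0.1}}
bundle = {"visual": "highlight", "auditory": "sound", "tactile": "vibrate"}
print(on_user_detected("user_b", "user_a", profiles, bundle))  # sound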
Allowable Subject Matter
Claims 3-5, 9-11, 17, and 20 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to K. Kiyabu whose telephone number is (571) 270-7836. The examiner can normally be reached Monday to Thursday 9:00 A.M. - 5:00 P.M. EST.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Temesghen Ghebretinsae, can be reached at (571) 272-3017. The fax number for the organization where this application or proceeding is assigned is (571) 273-8300.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, Applicants are encouraged to use the USPTO Automated Interview Request (AIR) at https://www.uspto.gov/patents/uspto-automated-interview-request-air-form.
Information regarding the status of an application may be obtained from Patent Center. Status information for unpublished applications is available through Patent Center for authorized users only. Should you have questions about access to Patent Center, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free).
/K. K./
Examiner, Art Unit 2626
/TEMESGHEN GHEBRETINSAE/
Supervisory Patent Examiner, Art Unit 2626
2/9/26