Prosecution Insights
Last updated: April 19, 2026
Application No. 18/438,596

Electronic Devices and Corresponding Methods for Utilizing User Sensory Preference Reaction Scores to Enhance User Interface Interactions

Non-Final OA: §102, §103, §112

Filed: Feb 12, 2024
Examiner: HUYNH, LINDA TANG
Art Unit: 2172
Tech Center: 2100 — Computer Architecture & Software
Assignee: Motorola Mobility LLC
OA Round: 1 (Non-Final)

Grant Probability: 36% (At Risk)
Expected OA Rounds: 1-2
Time to Grant: 3y 8m
With Interview: 68%

Examiner Intelligence

Career Allow Rate: 36% (100 granted / 274 resolved; -18.5% vs TC avg)
Interview Lift: +31.9% (strong), based on resolved cases with interview
Avg Prosecution: 3y 8m typical timeline; 30 currently pending
Total Applications: 304 career history, across all art units
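As a back-of-the-envelope sketch of how the headline numbers above relate (an assumption on my part — the page does not state its exact formulas), the allow rate is simply granted over resolved, and the interview lift reads as the percentage-point gap between the with-interview allowance rate and the baseline:

```python
# Rough recomputation of the examiner metrics shown above.
# Assumption: "interview lift" = allowance rate with an interview
# minus the baseline rate, expressed in percentage points.
granted, resolved = 100, 274          # counts from this page
allow_rate = granted / resolved       # ~0.365, displayed as 36%
with_interview = 0.68                 # displayed as 68%
lift = with_interview - allow_rate    # ~+0.315

print(f"Career allow rate: {allow_rate:.1%}")
print(f"Approx. interview lift: {lift:+.1%}")
```

The page's +31.9% figure does not fall out exactly from these rounded headline numbers; it is presumably computed from the subset of resolved cases that actually had interviews, so this sketch only approximates it.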

Statute-Specific Performance

§101: 9.8% (-30.2% vs TC avg)
§103: 53.4% (+13.4% vs TC avg)
§102: 13.4% (-26.6% vs TC avg)
§112: 18.6% (-21.4% vs TC avg)

Tech Center averages are estimates. Based on career data from 274 resolved cases.
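The per-statute deltas above are internally consistent: subtracting each delta from the examiner's rate recovers the same Tech Center baseline for every statute. A few lines verify this (a sanity-check sketch, not the site's actual code):

```python
# Recover the implied Tech Center average from each statute row above:
# examiner_rate - delta_vs_tc = TC average.
rows = {
    "§101": (9.8, -30.2),
    "§103": (53.4, +13.4),
    "§102": (13.4, -26.6),
    "§112": (18.6, -21.4),
}
for statute, (rate, delta) in rows.items():
    tc_avg = rate - delta
    print(f"{statute}: examiner {rate}% vs TC avg {tc_avg:.1f}%")
# every row implies the same ~40.0% Tech Center baseline
```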

Office Action

Rejections: §102, §103, §112
DETAILED ACTION

This Office Action is sent in response to Applicant's Response filed 09/30/2025 for 18/438,596. Claims 1-11 are pending.

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

Information Disclosure Statement

The information disclosure statements (IDS) submitted on 09/05/2024 and 08/05/2025 were filed before the mailing date of a first action. The submissions are in compliance with the provisions of 37 CFR 1.97. Accordingly, the IDSs are being considered by the examiner.

Election/Restrictions

Applicant's election with traverse of Group I (claims 1-11) in the reply filed on 09/30/2025 is acknowledged. The traversal is on the ground(s) that amendments to claims 12-13, 15-17, 18, and 20 are directed to the same classification as Group I and are no longer a search burden. This is not found persuasive because claims 12-13, 15-17, 18, and 20 are currently withdrawn in view of Applicant's election of Group I (claims 1-11), as initially stated in the requirement for restriction mailed 09/22/2025, and as currently amended do not recite limitations of user identification. The requirement is still deemed proper and is therefore made FINAL.

Claims 12-20 are withdrawn from further consideration pursuant to 37 CFR 1.142(b), as being drawn to a nonelected invention, there being no allowable generic or linking claim. Applicant timely traversed the restriction (election) requirement in the reply filed on 09/30/2025.

In order to retain the right to rejoinder, applicant is advised that the claims to the nonelected invention(s) should be amended during prosecution to require the limitations of the elected invention. The propriety of a restriction requirement should be reconsidered when all the claims directed to the elected invention are in condition for allowance, and the nonelected invention(s) should be considered for rejoinder [see MPEP 821.04].

Response to Amendment

While Applicant's amendments to claims 12-20 are acknowledged, for any amendment filed in response to a restriction or election of species requirement, and any subsequent amendment, any claims which are non-elected must have the status identifier (withdrawn). Any non-elected claims which are being amended must have either the status identifier (withdrawn) or (withdrawn – currently amended), and the text of the non-elected claims must be presented with markings to indicate the changes [see MPEP 714].

Claim Objections

Claims 6 and 7 are objected to because of the following informalities: claims 6 and 7 recite the term "and/or," which is selective claim language; the term "or" is therefore selected to more clearly delineate the claim scope for purposes of examination. Appropriate correction is required.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

Claims 3-11 are rejected under 35 U.S.C. 112(b) as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor regards as the invention.
Claim 3 recites the limitation "the presenting." It is unclear whether "the presenting" refers to the "presenting… the user sensory preference reaction score" as recited in the instant claim or to the "presenting… the one or more modified user interface elements" as recited in parent claim 1; the limitation has been interpreted as "user input in response to the presenting --the one or more modified user interface elements--". Claims 4-11 are rejected as being indefinite for failing to remedy the deficiencies of parent claim 3.

Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless –

(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale or otherwise available to the public before the effective filing date of the claimed invention.

(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.

Claims 1 and 2 are rejected under 35 U.S.C. 102(a)(1)/(a)(2) as being anticipated by Yoshikawa et al. (US 20200184843 A1).

As to claim 1, Yoshikawa discloses a method in an electronic device [Fig. 3, para 0077, system], the method comprising: identifying, by one or more sensors of the electronic device, a user using the electronic device [para 0049, 0054, 0078-0079, 0097, determine user using content presented by system by acquiring sensor information]; determining, by one or more processors of the electronic device, a dominant sensory profile associated with the user of the electronic device [para 0050, 0078-0079, 0097, processing unit sets sense type representing dominant sense of user model (read: dominant sensory profile) for user]; modifying, by the one or more processors, one or more user interface elements configured for presentation on a user interface of the electronic device as a function of the dominant sensory profile associated with the user of the electronic device to create one or more modified user interface elements [para 0050, 0075, 0078-0079, 0097-0099, processing unit sets content with changed presentation way (read: function) referring to updated sense type of user model for user and presents changed content (read: modified user interface element) through presentation unit included in system sensors and presentation unit (read: user interface)]; and presenting, by the one or more processors on the user interface of the electronic device, the one or more modified user interface elements [para 0075, 0078-0079, 0097-0099, processing unit presents changed content via system display unit].

As to claim 2, Yoshikawa discloses the method of claim 1, wherein the determining the dominant sensory profile associated with the user of the electronic device comprises retrieving, by the one or more processors, a user sensory preference reaction score from a user profile stored in a memory of the electronic device [para 0054-0055, 0078-0079, 0085, system sets sense type of user model for user by referring to user model (read: user profile) for user including effect of user reaction to content stimulus (read: user sensory preference reaction score) as stored in storage unit of system].

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries set forth in Graham v. John Deere Co., 383 U.S. 1, 148 USPQ 459 (1966), that are applied for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:

1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claims 3-11 are rejected under 35 U.S.C. 103 as being unpatentable over Yoshikawa as applied to claim 2 above, and further in view of Johnson et al. (US 20190220777 A1).
As to claim 3, Yoshikawa discloses the method of claim 2, further comprising: … receiving, by the user interface, user input in response to the presenting [para 0049-0051, 0078-0079, 0098, sensors acquire user information (read: user input) reacting to presentation of changed content]; and adjusting one or more sensory preference elements of the user sensory preference reaction score as a function of the user input [para 0049-0051, 0054-0055, 0072, 0079, 0097, update sense type (read: sensory preference element) of user reaction effect based on user information acquired by sensors]. Yoshikawa teaches the method of claim 2, further comprising: presenting, by the one or more processors, the user interface [para 0043-0044, 0073, system with processing unit includes sensors and presentation unit], but does not explicitly teach presenting, by the one or more processors, the user sensory preference reaction score from the user profile on the user interface. However, Yoshikawa teaches storing the user sensory preference reaction score from the user profile [para 0054-0055, 0079, 0085, 0098, processing unit sets user reaction effect for user model], and Johnson teaches presenting a user sensory preference reaction score from a user profile on a user interface [Figs. 3-4, para 0024, 0028-0029, dashboard (read: user interface) displays stored client sentiment metric]. Yoshikawa and Johnson are analogous art to the claimed invention, being from a similar field of endeavor of information presentation systems.

Thus, it would have been obvious to one skilled in the art before the effective filing date of the claimed invention to apply the teachings of Yoshikawa, storing a user sensory preference reaction score from a user profile, to the teachings of Johnson, presenting user information on a user interface, with a reasonable expectation of success, to result in presenting, by the one or more processors, the user sensory preference reaction score from the user profile on the user interface [see MPEP 2143]. One of ordinary skill in the art would be motivated to apply this teaching to Yoshikawa to improve communications with clients [Johnson, para 0019].

As to claim 4, Yoshikawa discloses the method of claim 3, wherein the one or more sensory preference elements comprise a plurality of user sensory preference elements [Fig. 7, para 0054, sense type includes sense types for user].

As to claim 5, Yoshikawa discloses the method of claim 4, wherein the plurality of user sensory preference elements comprises: an eye-minded dominance score; … an ear-minded dominance score; and … a motor-minded dominance score [Figs. 7, 12, para 0051, 0054, 0085, sense types include excitement levels (read: dominance scores) for each of visual sense type (read: eye-minded), auditory sense type (read: ear-minded), and tactile sense type (read: motor-minded)]. Yoshikawa teaches a dominance score for a plurality of user sensory preference elements [Figs. 7, 12, para 0051, 0054, 0085, set excitement levels for sense types] but not explicitly a smell-minded dominance score and a taste-minded dominance score. However, Yoshikawa teaches a dominance score for a user sensory preference element [Figs. 7, 12, para 0051, 0054, 0085, set excitement level for sense type of user sense] and that human senses include smell and taste [para 0004, human senses including olfactory (read: smell) and gustatory (read: taste) senses].
Yoshikawa is analogous art to the claimed invention, being from a similar field of endeavor of information presentation systems. Thus, it would have been obvious to one skilled in the art before the effective filing date of the claimed invention to apply the teachings of Yoshikawa, providing dominance scores for user sensory preference elements, to sensory preferences including smell and taste, with a reasonable expectation of success, to result in a smell-minded dominance score and a taste-minded dominance score [see MPEP 2143]. One of ordinary skill in the art would be motivated to apply this teaching to Yoshikawa to present easy-to-understand information related to dominance of a person's sense [Yoshikawa, para 0004].

As to claim 6, Yoshikawa discloses the method of claim 5, wherein the one or more user interface elements configured for presentation on the user interface of the electronic device comprise one or more of: user input controls [para 0089-0090, system presentation unit presents content including guide information based on user behavior; note strikethrough indicates non-selected alternatives].

As to claim 7, Yoshikawa discloses the method of claim 5, wherein: the one or more user interface elements configured for presentation on the user interface of the electronic device comprise informational components comprising text [Fig. 14, para 0088-0090, system presentation unit presents content including guide information; note Figure 14 shows guide object including text]; and the modifying the one or more user interface elements to create the one or more modified user interface elements comprises changing the text to enhance a characteristic associated with at least one user sensory preference element and diminish another characteristic associated with at least one other user sensory preference element [Fig. 14, para 0088-0090, changing content presentation includes presenting guide object with highlighted text; note the limitation "to enhance a characteristic associated with at least one user sensory preference element and diminish another characteristic associated with at least one other user sensory preference element" is not given patentable weight because the term "to" merely suggests an intended result of the "changing the text" as recited in the claim and does not require the step to be performed (see MPEP 2111.04); nevertheless, note increasing guide text highlights (read: enhance) visual sense type while no (read: diminish) sound or vibration of auditory or tactile sense types are presented].

As to claim 8, Yoshikawa teaches the method of claim 5, wherein each of the eye-minded dominance score, … the ear-minded dominance score, … and the motor-minded dominance score is normalized to have a value [Figs. 7, 12, para 0051, 0054, 0085, set excitement levels (read: dominance scores) for each of visual sense type (read: eye-minded), auditory sense type (read: ear-minded), and tactile sense type (read: motor-minded) as numerical value; note setting levels as numerical values storable in system memory falls under the broadest reasonable interpretation of normalized, including conforming to a standard of computer-readable instructions] but not explicitly a smell-minded dominance score and a taste-minded dominance score. However, Yoshikawa teaches a dominance score for a user sensory preference element [Figs. 7, 12, para 0051, 0054, 0085, set excitement level for sense type of user sense] and that human senses include smell and taste [para 0004, human senses including olfactory (read: smell) and gustatory (read: taste) senses]. Yoshikawa is analogous art to the claimed invention, being from a similar field of endeavor of information presentation systems.
Thus, it would have been obvious to one skilled in the art before the effective filing date of the claimed invention to apply the teachings of Yoshikawa, providing dominance scores for user sensory preference elements, to sensory preferences including smell and taste, with a reasonable expectation of success, to result in the smell-minded dominance score and the taste-minded dominance score [see MPEP 2143]. One of ordinary skill in the art would be motivated to apply this teaching to Yoshikawa to present easy-to-understand information related to dominance of a person's sense [Yoshikawa, para 0004].

However, Yoshikawa does not specifically disclose wherein a score is normalized to have a value between one and negative one, inclusive. Johnson discloses wherein a score is normalized to have a value between one and negative one, inclusive [para 0020, normalize sentiment score on scale between 0 and 1, where one of ordinary skill would recognize that a normalized score range between 0 and 1 would overlap a normalized score range between 1 and -1, inclusive (see MPEP 2144.05)]. Yoshikawa and Johnson are analogous art to the claimed invention, being from a similar field of endeavor of information presentation systems. Thus, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify the score value as disclosed by Yoshikawa with a normalized score between one and negative one as disclosed by Johnson, with a reasonable expectation of success. One of ordinary skill in the art would be motivated to modify Yoshikawa as described above to provide an ability to view and analyze a change in emotion over time [Johnson, para 0021].

As to claim 9, Yoshikawa discloses the method of claim 8, further comprising, prior to the determining: presenting, by the one or more processors on the user interface, a plurality of user interface elements [Fig. 10, para 0084-0085, processing unit presents content including multiple objects (read: plurality of user interface elements) through presentation unit]; wherein each user interface element of the plurality of user interface elements includes components catering to different sensory perceptions from other user interface elements of the plurality of user interface elements [Figs. 10, 13, para 0084-0087, set stimulus information (read: components) of visual, auditory, and tactile sense types (read: different sensory perceptions) for each object of presented content objects].

As to claim 10, Yoshikawa discloses the method of claim 9, further comprising: measuring, by one or more sensors, reactions of the user of the electronic device to the plurality of user interface elements [para 0048-0049, 0097-0098, sensors acquire user reactions to content including objects presented through presentation unit]; and determining, by the one or more processors from the reactions, the user sensory preference reaction score [para 0054-0055, 0079, 0085, 0098, processing unit sets user reaction effect based on user reactions acquired by sensors].
As to claim 11, Yoshikawa discloses the method of claim 10, further comprising, when the one or more sensors detect another user using the electronic device [para 0049, 0054, 0078-0079, 0097, determine user of users; note storing user information for multiple users]: repeating the presenting the plurality of user interface elements [para 0050, 0054, 0084-0085, processing unit presents each content including multiple objects through presentation unit for each user]; measuring other reactions of the another user of the electronic device to the plurality of user interface elements [para 0048-0050, 0097-0098, sensors acquire reactions of each user of users to content including objects presented through presentation unit]; and determining, by the one or more processors from the other reactions, another user sensory preference reaction score [para 0054-0055, 0079, 0085, 0098, processing unit sets user reaction effect of each user of users based on user reactions acquired by sensors].

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Kwatra et al. (US 20220129285 A1) generally discloses modifying user interface elements based on user focus types.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to LINDA HUYNH, whose telephone number is (571) 272-5240 and email is linda.huynh@uspto.gov. The examiner can normally be reached M-F between 9am-5pm. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Adam Queler, can be reached at (571) 272-4140. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/LINDA HUYNH/
Primary Examiner, Art Unit 2172

Prosecution Timeline

Feb 12, 2024
Application Filed
Oct 31, 2025
Non-Final Rejection — §102, §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12578837: USER INTERFACES FOR MANAGING SHARING OF CONTENT IN THREE-DIMENSIONAL ENVIRONMENTS
2y 5m to grant · Granted Mar 17, 2026

Patent 12547310: INFORMATION PROCESSING DEVICE
2y 5m to grant · Granted Feb 10, 2026

Patent 12541287: INTEGRATED ENERGY DATA SCIENCE PLATFORM
2y 5m to grant · Granted Feb 03, 2026

Patent 12524136: EVENT TRANSCRIPT PRESENTATION
2y 5m to grant · Granted Jan 13, 2026

Patent 12524124: RECORDING FOLLOWING BEHAVIORS BETWEEN VIRTUAL OBJECTS AND USER AVATARS IN AR EXPERIENCES
2y 5m to grant · Granted Jan 13, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 36%
With Interview: 68% (+31.9%)
Median Time to Grant: 3y 8m
PTA Risk: Low

Based on 274 resolved cases by this examiner. Grant probability derived from career allow rate.
