Prosecution Insights
Last updated: April 19, 2026
Application No. 18/704,805

DISPLAYING INDICATIONS CORRESPONDING TO SETTINGS OF HARDWARE VEHICLE CONTROLS

Final Rejection: §101, §102, §103
Filed: Apr 25, 2024
Examiner: ALKIRSH, AHMED
Art Unit: 3668
Tech Center: 3600 (Transportation & Electronic Commerce)
Assignee: Atieva, Inc.
OA Round: 2 (Final)
Grant Probability: 54% (Moderate)
Predicted OA Rounds: 3-4
Predicted Time to Grant: 3y 0m
Grant Probability with Interview: 99%

Examiner Intelligence

Career Allow Rate: 54% (grants 54% of resolved cases; 23 granted / 43 resolved; +1.5% vs TC avg)
Interview Lift: +53.7% in resolved cases with interview (strong)
Avg Prosecution (typical timeline): 3y 0m; 63 applications currently pending
Total Applications (career history): 106, across all art units

Statute-Specific Performance

§101: 20.2% (-19.8% vs TC avg)
§103: 54.5% (+14.5% vs TC avg)
§102: 22.5% (-17.5% vs TC avg)
§112: 2.8% (-37.2% vs TC avg)
Tech Center averages are estimates. Based on career data from 43 resolved cases.
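The statute-specific figures above are internally consistent: subtracting each reported delta from the examiner's rate recovers the same Tech Center baseline for every statute. A minimal sketch of that arithmetic, using only the numbers shown on this page (the variable names and the recovery formula are ours):

```python
# Per-statute figures from the table above: (examiner rate %, delta vs TC avg)
rates = {"101": (20.2, -19.8), "103": (54.5, +14.5),
         "102": (22.5, -17.5), "112": (2.8, -37.2)}

# Since delta = examiner_rate - tc_avg, the implied TC average is rate - delta
tc_avgs = {k: round(rate - delta, 1) for k, (rate, delta) in rates.items()}

# Career allow rate: 23 granted of 43 resolved cases
allow_rate = 100 * 23 / 43  # ≈ 53.5%, displayed above as 54%
```

Every statute's implied baseline comes out to 40.0%, suggesting the "vs TC avg" deltas are computed against a single Tech Center figure rather than per-statute averages.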

Office Action

§101 §102 §103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first-inventor-to-file provisions of the AIA.

Status of Claims

Claims 1-13, 15, 20-23, 25-26, 28-29, and 30-35 of U.S. Application No. 18/704,805 were examined. The Examiner filed a non-final Office action on 09/08/2025. Applicant filed remarks on 12/08/2025. Claims 1-13, 15, 20-23, 25-26, 28-29, and 30-35 are presently pending examination.

Response to Arguments

Regarding the claim rejections under 35 USC 101: Applicant's arguments filed 12/08/2025 have been fully considered but they are not persuasive. Regarding Claims 1, 20-22, and 25-29, Applicant argues that the claims are not directed to a mental process because they require operations in a computer system of a vehicle and presenting on a display device, which cannot practically be performed in the human mind. Applicant further contends that the claims represent an improvement to computer functionality under Enfish, LLC v. Microsoft Corp., 822 F.3d 1327 (Fed. Cir. 2016), by allowing the user to see the “present setting” of the hardware control, improving over Sakamaki. Additionally, Applicant asserts that examination should focus on §§ 102, 103, and 112 per Ex parte Desjardins, No. 2024-000567 (Appeals Review Panel, P.T.A.B. Sep. 26, 2025), and cites the USPTO memorandum of August 4, 2025, p. 2, stating that claims do not recite a mental process if their limitations cannot practically be performed in the human mind. However, this argument is not persuasive. The claims are directed to the abstract idea of receiving input from user interactions with controls and presenting corresponding indications, which falls under the grouping of mental processes (observation, evaluation, and judgment) as identified in the 2019 Revised Patent Subject Matter Eligibility Guidance.
Specifically, the steps of “receiving… an input generated by user interaction with any of a plurality of hardware controls” and “presenting… a container… configured for displaying indications that correspond to and reflect present settings” can be performed as mental steps or with pen and paper, such as by noting control interactions and sketching corresponding status indicators. The recitation of a “computer system of a vehicle” and “display device of the vehicle” amounts to generic computer implementation and does not integrate the abstract idea into a practical application. Regarding Enfish, the claims here do not improve computer functionality itself (e.g., no novel data structure or processing efficiency as in a self-referential table); instead, they apply the abstract idea to a vehicular context using routine display technology, without technological improvement. The alleged improvement over Sakamaki (displaying “present settings”) is functional rather than technical and does not transform the claims. As to the USPTO memorandum (Aug. 4, 2025, p. 2), while a human mind may not directly “receive” electronic inputs, the core idea remains performable mentally when abstracted from the generic hardware; the additional elements are insignificant extra-solution activity. Finally, while §§ 102 and 103 are addressed separately, the § 101 analysis is independent and warranted here, as Desjardins does not preclude § 101 rejections for abstract ideas even when novelty is argued. The additional elements, individually or in combination, do not amount to significantly more than the abstract idea, as they involve well-understood, routine, and conventional activities (e.g., displaying control status on a screen).

Regarding the claim rejections under 35 USC 102 and 103: Applicant's arguments filed 12/08/2025 with respect to Sakamaki (US20200339174A1) have been fully considered but they are not persuasive.
Regarding Claims 1-3 and 34, Applicant argues that Sakamaki does not anticipate claim 1 because it does not disclose or suggest displaying indications that reflect the “present settings” of hardware controls. Applicant asserts that Sakamaki’s function presentation units 54 and 55 only display labels like “Temp” or “Map,” which indicate the assigned function but not the actual present setting (e.g., the current temperature value or map scale level). Applicant cites Sakamaki ¶¶ 0039 and 0078, noting that the Office Action takes undue liberties in interpreting “map scaling setting,” and points to FIGS. 17 and 19-21 as showing no display of present settings. Applicant further contends that claim 1 requires the indications to reflect “present settings” as understood by a person of ordinary skill, and that Sakamaki’s approach of alternatively displaying function labels does not teach a “container” configured for such indications corresponding to a plurality of hardware controls assigned to respective vehicle systems. However, the Examiner respectfully disagrees; this argument is not persuasive. Sakamaki anticipates the claimed subject matter, as the function presentation units 54 and 55 serve as the “container” on a display device, presenting indications (e.g., “Temp” or “Map”) that correspond to and reflect the present settings of the hardware controls (operation switches 52 and 53). The claim term “present settings” is broadly interpreted in light of the specification to encompass the current assignment or state of the control’s function, which Sakamaki discloses by displaying marks that indicate the actively assigned setting upon user interaction.
Sakamaki [0078]: “the first operation switch 52 alternatively displays a mark (letter “Temp”) indicating a temperature adjustment of an air conditioning function and a mark (letter “Map”) indicating a map scaling setting of a navigational function in the function presenting unit 54 according to the received function presentation signal.” Sakamaki [0081]: “When the air conditioning function is assigned to the operation switches 14 and 15, the main control unit 12 assigns the temperature adjustment to the first operation switch 14… causes a mark indicating the temperature adjustment to be displayed on the function presentation unit 54.” This reflects the “present setting,” as the control is currently set to adjust temperature (or map scaling), triggered by user input, meeting the “receiving… an input” and “presenting… a container” limitations. Sakamaki’s FIGS. 20 and 21 illustrate this display in response to assignment, contrary to Applicant’s assertion. The claim does not require displaying numerical values (e.g., exact temperature); the indications need only “reflect” settings, which the function marks do by showing the current operational state. Regarding the plurality of controls corresponding to vehicle systems, Sakamaki [0035] states: “The multimedia system functions include an audio function, an air conditioning function, a navigation function, or the like.” This teaches controls (switches 14-17) assigned to respective systems.

For claim 2, Applicant argues Sakamaki does not display even one container, let alone multiple. However, the Examiner respectfully disagrees; this argument is not persuasive. The rejection is maintained, as Sakamaki’s units 54 and 55 function as multiple containers or sub-elements within the switch group display.
Sakamaki [0078]: “The second operation switch 52 has a function presentation unit 55… alternatively displays a mark indicating an air volume adjustment of an air conditioning function (figure of a windmill) and a mark “Dest” indicating a destination setting of the navigation function in the function presentation unit 55.”

For claim 3, Applicant argues Sakamaki [0040] does not describe units 54 and 55 as the container. However, the Examiner respectfully disagrees; this argument is not persuasive. The rejection is maintained, as [0040] ties the switching to the presentation units. Sakamaki [0079]: “The switching controller 12 d switches a display image displayed on an in-vehicle display 9, functions assigned to the operation switches 14 and 15, and the presentation of the function presentation units 54 and 55 in association with each other.”

For claim 34, Applicant argues Sakamaki [0063] does not describe units 54 and 55. However, the Examiner respectfully disagrees; this argument is not persuasive. The rejection is maintained, as [0063] describes the switch groups, including operation switches for multimedia functions, integrated with the presentation units. Sakamaki [0035]: “The right switch group 7 disposed on the right spoke portion 2 c is a switch group to which a multimedia system functions are assigned, and includes four switches 14 to 17 of, a first operation switch 14, a second operation switch 15, a first function selection switch 16, and a second function selection switch 17.”

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows: Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 1-13, 15, 20-23, 25-26, 28-29, and 30-35 are rejected under 35 U.S.C.
101 because the claimed invention is directed to a judicial exception (i.e., a law of nature, a natural phenomenon, or an abstract idea) without significantly more. The claimed invention is directed to the concept of displaying indications that correspond to settings of hardware vehicle controls. This judicial exception is not integrated into a practical application. The claims do not include additional elements that are sufficient to amount to significantly more than the judicial exception and do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea. The Examiner will further explain in view of the Revised Patent Subject Matter Eligibility Guidance.

Claim 1 is directed to a computer system of a vehicle (i.e., an apparatus). Therefore, claim 1 is within at least one of the four statutory categories.

101 Analysis – Step 2A, Prong I

Regarding Prong I of the Step 2A analysis, the claims are to be analyzed to determine whether they recite subject matter that falls within one of the following groups of abstract ideas: a) mathematical concepts, b) certain methods of organizing human activity, and/or c) mental processes. Independent claims 1, 20-22, and 25-29 include limitations that recite an abstract idea (emphasized below) and will be used as representative claims for the remainder of the 101 rejection. Claims 1, 20-22, and 25-29 recite: A computer-based method comprising: receiving, in a computer system of a vehicle, an input generated by user interaction with any of a plurality of hardware controls in the vehicle, the plurality of hardware controls corresponding to respective vehicle systems; and presenting, in response to the input, a container on a display device of the vehicle, the container configured for displaying indications that correspond to and reflect present settings of each of the plurality of hardware controls.
101 Analysis – Step 2A, Prong II

Regarding Prong II of the Step 2A analysis, the claims are to be analyzed to determine whether the claim, as a whole, integrates the abstract idea into a practical application. As noted, it must be determined whether any additional elements in the claim beyond the abstract idea integrate the exception into a practical application in a manner that imposes a meaningful limit on the judicial exception. The courts have indicated that additional elements merely using a computer to implement an abstract idea, adding insignificant extra-solution activity, or generally linking use of a judicial exception to a particular technological environment or field of use do not integrate a judicial exception into a “practical application.” In the present case, the additional limitations beyond the above-noted abstract idea are as follows (where the underlined portions are the “additional limitations” while the bolded portions continue to represent the “abstract idea”): A computer-based method comprising: receiving, in a computer system of a vehicle, an input generated by user interaction with any of a plurality of hardware controls in the vehicle, the plurality of hardware controls corresponding to respective vehicle systems; and presenting, in response to the input, a container on a display device of the vehicle, the container configured for displaying indications that correspond to and reflect present settings of each of the plurality of hardware controls. For the following reason(s), the Examiner submits that the above-identified additional limitations do not integrate the above-noted abstract idea into a practical application. Regarding the additional limitations of the “computer,” the Examiner submits that these limitations are an attempt to generally link additional elements to a technological environment.
In particular, the receiving and presenting by a computer are recited at a high level of generality and merely automate the recited steps, the computer therefore acting as a generic computer to perform the abstract idea. The computer is claimed generically and is operating in its ordinary capacity and does not use the judicial exception in a manner that imposes a meaningful limit on the judicial exception, such that the claim is more than a drafting effort designed to monopolize the exception. The additional limitation is no more than mere instructions to apply the exception using a computer processor. Thus, taken alone, the additional elements do not integrate the abstract idea into a practical application. Further, looking at the additional limitation(s) as an ordered combination or as a whole, the limitation(s) add nothing that is not already present when looking at the elements taken individually. For instance, there is no indication that the additional elements, when considered as a whole, reflect an improvement in the functioning of a computer or an improvement to another technology or technical field, apply or use the above-noted judicial exception to effect a particular treatment or prophylaxis for a disease or medical condition, implement/use the above-noted judicial exception with a particular machine or manufacture that is integral to the claim, effect a transformation or reduction of a particular article to a different state or thing, or apply or use the judicial exception in some other meaningful way beyond generally linking the use of the judicial exception to a particular technological environment, such that the claim as a whole is not more than a drafting effort designed to monopolize the exception (MPEP § 2106.05). Accordingly, the additional limitation(s) do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea.
101 Analysis – Step 2B

Regarding Step 2B of the Revised Guidance, representative independent claim 1 does not include additional elements (considered both individually and as an ordered combination) that are sufficient to amount to significantly more than the judicial exception, for the same reasons as those discussed above with respect to determining that the claim does not integrate the abstract idea into a practical application. As discussed above with respect to integration of the abstract idea into a practical application, the additional element of a “computer” amounts to nothing more than mere instructions to apply the exception using a generic computer component. Mere instructions to apply an exception using a generic computer component cannot provide an inventive concept. Hence, the claim is not patent eligible. Dependent claims 2-15, 30-35, and 23 do not recite any further limitations that cause the claim(s) to be patent eligible. Rather, the limitations of the dependent claims are directed toward additional aspects of the judicial exception and/or well-understood, routine, and conventional additional elements that do not integrate the judicial exception into a practical application. Therefore, dependent claims 2-15, 30-35, and 23 are not patent eligible under the same rationale as provided for in the rejection of claims 1, 20-22, and 25-29. Therefore, claims 2-15, 30-35, and 23 are ineligible under 35 USC §101.

Claim Rejections - 35 USC § 102

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status. The following is a quotation of the appropriate paragraphs of 35 U.S.C.
102 that form the basis for the rejections under this section made in this Office action: A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

Claims 1-3 and 34 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Sakamaki (US20200339174A1).

Regarding claim 1, Sakamaki discloses A computer-based method comprising: receiving, in a computer system of a vehicle, an input generated by user interaction with any of a plurality of hardware controls in the vehicle, the plurality of hardware controls corresponding to respective vehicle systems (“when the user operates the switch for the audio function, the operation unit 10 outputs a function selection detection signal indicating that the user has selected the audio function to the main control unit 12. The same applies to a case in which switches for the air conditioning function and the navigation function are disposed around the center display 3.” [0039]); and presenting, in response to the input, a container on a display device of the vehicle, the container configured for displaying indications that correspond to and reflect present settings of each of the plurality of hardware controls (“when receiving a function presentation signal from a main control unit 12, the first operation switch 52 alternatively displays a mark (letter “Temp”) indicating a temperature adjustment of an air conditioning function and a mark (letter “Map”) indicating a map scaling setting of a navigational function in the function presenting unit 54 according to the received function presentation signal.” [0078]).
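Claim 1's limitations describe a simple event-driven pattern: an input from any of several hardware controls triggers presentation of a container whose indications reflect each control's present setting. The pattern can be sketched as follows (the class, function names, and dict-based "container" are our illustrative assumptions, not the application's actual implementation):

```python
from dataclasses import dataclass

@dataclass
class HardwareControl:
    name: str       # e.g. a volume roller or temperature knob (hypothetical)
    system: str     # the vehicle system the control corresponds to
    setting: float  # the control's present setting

def on_user_input(controls, touched, new_setting):
    """Receive an input from one control, then 'present' a container of
    indications reflecting the present settings of each control."""
    for c in controls:
        if c.name == touched:
            c.setting = new_setting  # the user interaction updates the setting
    # The 'container': one indication per hardware control
    return {c.name: c.setting for c in controls}
```

For example, with a volume control and a temperature control, an input on the volume control yields a container showing both controls' present settings, which is the breadth the rejection reads onto Sakamaki's function presentation units.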
Regarding claim 2, Sakamaki discloses The computer-based method of claim 1, wherein the vehicle has multiple containers for presentation on the display device, the method further comprising selecting the container from among the multiple containers for presentation, the container selected based on the received input (“The image data output device 11 includes an audio device 23, an air conditioning device 24, and a navigation device 25, and outputs image data to the display control unit 13, and when receiving a control signal from the main control unit 12, the image data output device 11 executes processing according to the input control signal. In other words, the audio device 23 outputs the image data of the audio display image to the display control unit 13, and when receiving the control signal from the main control unit 12, the audio device 23 performs the volume adjustment and channel selection/music selection setting or the like according to the input control signal.” [0040]). Regarding claim 3, Sakamaki discloses The computer-based method of claim 2, wherein the container selected from among the multiple containers is a volume control container (“The image data output device 11 includes an audio device 23, an air conditioning device 24, and a navigation device 25, and outputs image data to the display control unit 13, and when receiving a control signal from the main control unit 12, the image data output device 11 executes processing according to the input control signal. In other words, the audio device 23 outputs the image data of the audio display image to the display control unit 13, and when receiving the control signal from the main control unit 12, the audio device 23 performs the volume adjustment and channel selection/music selection setting or the like according to the input control signal.” [0040]). 
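Claim 2's further limitation reduces to a dispatch: the container is selected from among multiple containers based on which control generated the input. A minimal sketch under the same caveat (the mapping and names below are hypothetical illustrations, not the application's design):

```python
# Hypothetical control-to-container mapping: the container is "selected
# based on the received input" from among multiple containers.
CONTAINER_FOR_CONTROL = {
    "volume_roller": "volume_container",
    "temp_knob": "climate_container",
}

def select_container(input_control):
    # Returns None for controls with no associated container
    return CONTAINER_FOR_CONTROL.get(input_control)
```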
Regarding claim 34, Sakamaki discloses The computer-based method of claim 1, further comprising ceasing to present the container in response to a predefined time elapsing after a most recent user interaction with any of the plurality of hardware controls (“Further, even in a case where the user does not select the navigation function, when a predetermined time period (e.g., 10 seconds) has elapsed from a time, as a starting point, at which the user selects the audio function without operating the operation switches 14 and 15, the main control unit 12 causes the center display 3 to display the navigation display image A1 again, as shown in FIG. 8.” [0063]).

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status. The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 4-6 are rejected under 35 U.S.C. 103 as being unpatentable over Sakamaki in view of Nagata (US8626387B1), hereinafter referred to as Sakamaki and Nagata respectively.
Regarding claim 4, Sakamaki discloses The computer-based method of claim 3. Sakamaki does not explicitly teach wherein the plurality of hardware controls includes multiple hardware volume controls, wherein each of the multiple hardware volume controls is configured for increasing and decreasing volume of audio in the vehicle, and wherein the volume control container is presented in response to the user interaction occurring with any of the multiple hardware volume controls. However, Nagata does teach wherein the plurality of hardware controls includes multiple hardware volume controls, wherein each of the multiple hardware volume controls is configured for increasing and decreasing volume of audio in the vehicle, and wherein the volume control container is presented in response to the user interaction occurring with any of the multiple hardware volume controls (“Referring to FIG. 2, the vehicle 100 and dash console 102 are shown. The dash console 102 includes a variety of physical controls 104, such as rotatable knobs 106 a, 106 b and pressable button columns 108 a, 108 b.” [Col. 3 ln. 56-68]; “These physical controls 104 can be used to control a variety of vehicle systems or vehicle functions, such as air conditioning, radio station presets, and navigation controls. For example, the rotatable knobs 106 a, 106 b can be used to adjust climate control settings and radio volume, while the pressable button columns 108 a, 108 b may be used to select radio station presets and to control a navigation unit.” [Col. 3 ln. 61-67]; “For example, the steering wheel 140 may have climate control buttons 142 a or volume buttons 142 b on the steering wheel 140.” [Col. 6 ln. 4-6]; “The processor 112 can instruct a display device 116 to display the information of interest 114 and/or vehicle system control options.
This allows the information of interest 114 to be displayed on the display device 116 before or soon after the occupant 120 has made physical contact with the physical control 104.” [Col. 2 ln. 59-64]). Both Sakamaki and Nagata teach methods for vehicle controls based on user input. However, Nagata explicitly teaches a plurality of hardware controls including multiple hardware volume controls, wherein each of the multiple hardware volume controls is configured for increasing and decreasing volume of audio in the vehicle. It would have been obvious to one of ordinary skill in the art prior to the effective filing date of the claimed invention to modify the vehicle control method of Sakamaki to also include a plurality of hardware controls including multiple hardware volume controls, wherein each of the multiple hardware volume controls is configured for increasing and decreasing volume of audio in the vehicle, as taught by Nagata, with a reasonable expectation of success. Doing so improves the vehicle control method (with regard to this reasoning, see at least Nagata, Col. 3 ln. 56-68, Col. 3 ln. 61-67, Col. 6 ln. 4-6, and Col. 2 ln. 59-64).

Regarding claim 5, Sakamaki discloses The computer-based method of claim 4. Sakamaki does not explicitly teach wherein at least one of the multiple hardware volume controls is a roller control, and wherein at least another one of the multiple hardware volume controls is a toggle control. However, Nagata does teach wherein at least one of the multiple hardware volume controls is a roller control, and wherein at least another one of the multiple hardware volume controls is a toggle control (“Referring to FIG. 2, the vehicle 100 and dash console 102 are shown.
The dash console 102 includes a variety of physical controls 104, such as rotatable knobs 106 a, 106 b and pressable button columns 108 a, 108 b.” [Col. 3 ln. 56-68]; “These physical controls 104 can be used to control a variety of vehicle systems or vehicle functions, such as air conditioning, radio station presets, and navigation controls. For example, the rotatable knobs 106 a, 106 b can be used to adjust climate control settings and radio volume, while the pressable button columns 108 a, 108 b may be used to select radio station presets and to control a navigation unit.” [Col. 3 ln. 61-67]; “For example, the steering wheel 140 may have climate control buttons 142 a or volume buttons 142 b on the steering wheel 140.” [Col. 6 ln. 4-6]). Both Sakamaki and Nagata teach methods for vehicle controls based on user input. However, Nagata explicitly teaches that at least one of the multiple hardware volume controls is a roller control, and that at least another one of the multiple hardware volume controls is a toggle control. It would have been obvious to one of ordinary skill in the art prior to the effective filing date of the claimed invention to modify the vehicle control method of Sakamaki to also include at least one of the multiple hardware volume controls being a roller control, and at least another one of the multiple hardware volume controls being a toggle control, as taught by Nagata, with a reasonable expectation of success. Doing so improves the vehicle control method (with regard to this reasoning, see at least Nagata, Col. 3 ln. 56-68, Col. 3 ln. 61-67, Col. 6 ln. 4-6, and Col. 2 ln. 59-64).
Regarding claim 6, Sakamaki discloses The computer-based method of claim 4. Sakamaki does not explicitly teach wherein at least one of the multiple hardware volume controls is positioned on a steering wheel of the vehicle, and wherein at least another one of the multiple hardware volume controls is positioned on an instrument panel of the vehicle. However, Nagata does teach wherein at least one of the multiple hardware volume controls is positioned on a steering wheel of the vehicle, and wherein at least another one of the multiple hardware volume controls is positioned on an instrument panel of the vehicle (“Referring to FIG. 2, the vehicle 100 and dash console 102 are shown. The dash console 102 includes a variety of physical controls 104, such as rotatable knobs 106 a, 106 b and pressable button columns 108 a, 108 b.” [Col. 3 ln. 56-68]; “These physical controls 104 can be used to control a variety of vehicle systems or vehicle functions, such as air conditioning, radio station presets, and navigation controls. For example, the rotatable knobs 106 a, 106 b can be used to adjust climate control settings and radio volume, while the pressable button columns 108 a, 108 b may be used to select radio station presets and to control a navigation unit.” [Col. 3 ln. 61-67]; “For example, the steering wheel 140 may have climate control buttons 142 a or volume buttons 142 b on the steering wheel 140.” [Col. 6 ln. 4-6]). Both Sakamaki and Nagata teach methods for vehicle controls based on user input. However, Nagata explicitly teaches that at least one of the multiple hardware volume controls is positioned on a steering wheel of the vehicle, and that at least another one of the multiple hardware volume controls is positioned on an instrument panel of the vehicle.
It would have been obvious to one of ordinary skill in the art prior to the effective filing date of the claimed invention to modify the vehicle control method of Sakamaki to also include at least one of the multiple hardware volume controls being positioned on a steering wheel of the vehicle, and at least another one being positioned on an instrument panel of the vehicle, as taught by Nagata, with a reasonable expectation of success. Doing so improves the vehicle control method (with regard to this reasoning, see at least Nagata, Col. 3 ln. 56-68, Col. 3 ln. 61-67, Col. 6 ln. 4-6, and Col. 2 ln. 59-64).

Claims 7-9 are rejected under 35 U.S.C. 103 as being unpatentable over Sakamaki in view of Feit et al. (US20210011618A1), hereinafter referred to as Sakamaki and Feit respectively.

Regarding claim 7, Sakamaki discloses The computer-based method of claim 3. Sakamaki does not explicitly teach wherein the volume control container is presented in response to user interaction with a hardware volume control, and wherein the hardware volume control is configured for controlling whichever one of multiple sound sources in the vehicle that is presently in focus. However, Feit does teach wherein the volume control container is presented in response to user interaction with a hardware volume control, and wherein the hardware volume control is configured for controlling whichever one of multiple sound sources in the vehicle that is presently in focus (“In some embodiments, infotainment system controller 205 receives input from external controls that provides instruction to infotainment system controller 205.
For example, infotainment system controller 205 may be connected to a physical volume controller, such as a volume knob that allows a user to change the volume of one or more media outputs 225 by rotating the knob.” [0063] and “In some embodiments, when a volume control (not shown) is modified through infotainment system 200 (shown in FIG. 2), a current volume indicator is shown over audio information 124 for a predetermined period of time (i.e., 3 seconds). As shown in FIG. 9C, audio information 124 displays a speaker icon, a numerical value for the current volume, and a relative bar that illustrates where the current volume is relative to maximum and minimum volume. In some embodiments, the speaker icon changes based on the current audio source.” [0118]). Both Sakamaki and Feit teach methods for vehicle controls based on user input. However, Feit explicitly teaches wherein the volume control container is presented in response to user interaction with a hardware volume control, and wherein the hardware volume control is configured for controlling whichever one of multiple sound sources in the vehicle that is presently in focus. It would have been obvious to one of ordinary skill in the art prior to the effective filing date of the claimed invention to modify the vehicle control method of Sakamaki to also include wherein the volume control container is presented in response to user interaction with a hardware volume control, and wherein the hardware volume control is configured for controlling whichever one of multiple sound sources in the vehicle that is presently in focus, as taught by Feit, with a reasonable expectation of success. Doing so improves the vehicle control method (With regard to this reasoning, see at least [Feit, 0118]). 
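Claim 7's focus behavior can be sketched as state plus dispatch: a single hardware volume input adjusts whichever sound source currently has focus, then presents the volume container for that source. The class, source names, and return shape below are our illustrative assumptions, not the application's or Feit's implementation:

```python
class AudioState:
    """Hypothetical focus-aware volume control: the hardware volume
    control acts on whichever sound source is presently in focus."""
    def __init__(self):
        self.volumes = {"media": 10, "phone": 5}  # per-source volume levels
        self.focus = "media"                      # source presently in focus

    def on_volume_input(self, delta):
        # Adjust only the focused source, then 'present' the volume container
        self.volumes[self.focus] += delta
        return {"container": "volume",
                "source": self.focus,
                "level": self.volumes[self.focus]}
```

Switching `focus` to another source redirects the same hardware input to that source's volume, which is the behavior the rejection maps onto Feit's volume knob and current-volume indicator.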
Regarding claim 8, Sakamaki discloses the computer-based method of claim 7. Sakamaki does not explicitly teach wherein the volume control container is presented with a respective icon corresponding to a corresponding one of the multiple sound sources. However, Feit does teach wherein the volume control container is presented with a respective icon corresponding to a corresponding one of the multiple sound sources (“As shown in FIG. 9C, audio information 124 displays a speaker icon, a numerical value for the current volume, and a relative bar that illustrates where the current volume is relative to maximum and minimum volume. In some embodiments, the speaker icon changes based on the current audio source. For example, if the phone is the current audio source, then the speaker icon would be replaced with the phone app icon. Other examples may include, but are not limited to, the auxiliary source icon, text-to-speech and voice recognition icon, and in-car public announcement system icon.” [0118]). Both Sakamaki and Feit teach methods for vehicle controls based on user input. However, Feit explicitly teaches wherein the volume control container is presented with a respective icon corresponding to a corresponding one of the multiple sound sources. It would have been obvious to one of ordinary skill in the art prior to the effective filing date of the claimed invention to modify the vehicle control method of Sakamaki to also include wherein the volume control container is presented with a respective icon corresponding to a corresponding one of the multiple sound sources, as taught by Feit, with a reasonable expectation of success. Doing so improves the vehicle control method (With regard to this reasoning, see at least [Feit, 0118]). 
Regarding claim 9, Sakamaki discloses the computer-based method of claim 7. Sakamaki does not explicitly teach wherein the volume control container has a different number of volume control stops for at least some of the multiple sound sources. However, Feit does teach wherein the volume control container has a different number of volume control stops for at least some of the multiple sound sources (“In the illustrated example, FIG. 9C illustrates that the current volume is at 20, where the maximum volume is 40. The numerical value of the volume and the maximum value of the volume are based on the current audio source.” [0119]). Both Sakamaki and Feit teach methods for vehicle controls based on user input. However, Feit explicitly teaches wherein the volume control container has a different number of volume control stops for at least some of the multiple sound sources. It would have been obvious to one of ordinary skill in the art prior to the effective filing date of the claimed invention to modify the vehicle control method of Sakamaki to also include wherein the volume control container has a different number of volume control stops for at least some of the multiple sound sources, as taught by Feit, with a reasonable expectation of success. Doing so improves the vehicle control method (With regard to this reasoning, see at least [Feit, 0118]). Claims 10-12 are rejected under 35 U.S.C. 103 as being unpatentable over Sakamaki in view of Feit and in further view of Furge et al. (US20050018860A1), hereinafter referred to as Sakamaki, Feit and Furge respectively. 
Regarding claim 10, Sakamaki in view of Feit discloses the computer-based method of claim 7. Sakamaki in view of Feit does not explicitly teach wherein the multiple sound sources include at least master audio, navigation audio, and short-range wireless audio. However, Furge does teach wherein the multiple sound sources include at least master audio, navigation audio, and short-range wireless audio (“Other input signals such as fade, balance, and global volume from the head unit 204, the navigation unit 246, the cellular phone 248, or a combination may also be used.” [0040] and “The gain of the volume gain may be determined manually or by vehicle input signals from the input signal block 217 that are indicative of vehicle operation parameters, as previously discussed.” [0053]). Both Sakamaki in view of Feit and Furge teach methods for vehicle controls based on user input. However, Furge explicitly teaches wherein the multiple sound sources include at least master audio, navigation audio, and short-range wireless audio. It would have been obvious to one of ordinary skill in the art prior to the effective filing date of the claimed invention to modify the vehicle control method of Sakamaki in view of Feit to also include wherein the multiple sound sources include at least master audio, navigation audio, and short-range wireless audio, as taught by Furge, with a reasonable expectation of success. Doing so improves the vehicle control method (With regard to this reasoning, see at least [Furge, 0040]). 
Regarding claim 11, Sakamaki in view of Feit discloses the computer-based method of claim 10. Sakamaki in view of Feit does not explicitly teach wherein the master audio comprises an overall sound of the vehicle. However, Furge does teach wherein the master audio comprises an overall sound of the vehicle (“Other input signals such as fade, balance, and global volume from the head unit 204, the navigation unit 246, the cellular phone 248, or a combination may also be used.” [0040] and “The gain of the volume gain may be determined manually or by vehicle input signals from the input signal block 217 that are indicative of vehicle operation parameters, as previously discussed.” [0053]). Both Sakamaki in view of Feit and Furge teach methods for vehicle controls based on user input. However, Furge explicitly teaches wherein the master audio comprises an overall sound of the vehicle. It would have been obvious to one of ordinary skill in the art prior to the effective filing date of the claimed invention to modify the vehicle control method of Sakamaki in view of Feit to also include wherein the master audio comprises an overall sound of the vehicle, as taught by Furge, with a reasonable expectation of success. Doing so improves the vehicle control method (With regard to this reasoning, see at least [Furge, 0040]). Regarding claim 12, Sakamaki in view of Feit discloses the computer-based method of claim 10. Sakamaki in view of Feit does not explicitly teach wherein the navigation audio comprises sound that a navigation component is outputting to a sound mixer of the vehicle. However, Furge does teach wherein the navigation audio comprises sound that a navigation component is outputting to a sound mixer of the vehicle (“Similarly, an optional secondary source 244 provides source signals from a navigation unit 246 and a cellular phone 248 to analog to digital converters (ADC) 252 and 254, respectively. 
These digital source signals are input into the mixing block 210 or pre-filter 216.” [0044]). Both Sakamaki in view of Feit and Furge teach methods for vehicle controls based on user input. However, Furge explicitly teaches wherein the navigation audio comprises sound that a navigation component is outputting to a sound mixer of the vehicle. It would have been obvious to one of ordinary skill in the art prior to the effective filing date of the claimed invention to modify the vehicle control method of Sakamaki in view of Feit to also include wherein the navigation audio comprises sound that a navigation component is outputting to a sound mixer of the vehicle, as taught by Furge, with a reasonable expectation of success. Doing so improves the vehicle control method (With regard to this reasoning, see at least [Furge, 0040]). Claim 13 is rejected under 35 U.S.C. 103 as being unpatentable over Sakamaki in view of Feit, in further view of Furge, and in further view of Yae (US20170195474A1), hereinafter referred to as Sakamaki, Feit, Furge and Yae respectively. Regarding claim 13, Sakamaki in view of Feit and in further view of Furge discloses the computer-based method of claim 10. Sakamaki in view of Feit and in further view of Furge does not explicitly teach wherein the short-range wireless audio comprises sound that a portable electronic device is outputting to a sound mixer of the vehicle by short-range wireless communication. However, Yae does teach wherein the short-range wireless audio comprises sound that a portable electronic device is outputting to a sound mixer of the vehicle by short-range wireless communication (“Referring to FIG. 1, an AVN system 200 according to the present embodiment may include a Bluetooth (BT) module 210 paired with a smart device capable of executing a navigation function to thereby constitute a streaming (A2DP) channel,” [0027]). 
Both Sakamaki in view of Feit and in further view of Furge, and Yae, teach methods for vehicle controls based on user input. However, Yae explicitly teaches wherein the short-range wireless audio comprises sound that a portable electronic device is outputting to a sound mixer of the vehicle by short-range wireless communication. It would have been obvious to one of ordinary skill in the art prior to the effective filing date of the claimed invention to modify the vehicle control method of Sakamaki in view of Feit and in further view of Furge to also include wherein the short-range wireless audio comprises sound that a portable electronic device is outputting to a sound mixer of the vehicle by short-range wireless communication, as taught by Yae, with a reasonable expectation of success. Doing so improves the vehicle control method (With regard to this reasoning, see at least [Yae, 0024]). Claim 15 is rejected under 35 U.S.C. 103 as being unpatentable over Sakamaki in view of Millington et al. (US20170223472A1), hereinafter referred to as Sakamaki and Millington respectively. Regarding claim 15, Sakamaki discloses the computer-based method of claim 3. Sakamaki does not explicitly teach wherein volume is muted in the vehicle before the input is received, wherein the input corresponds to an increase volume command, and wherein the volume control container when presented indicates that the volume is unmuted. However, Millington does teach wherein volume is muted in the vehicle before the input is received, wherein the input corresponds to an increase volume command, and wherein the volume control container when presented indicates that the volume is unmuted (“The example audio information source 604 determines (e.g., via the user input interface 802 of FIG. 8) whether an unmute/volume up command has been received from a user input device (block 1004). 
If an unmute/volume up command has been received (block 1004), the example audio information source 604 (e.g., via the playback device interface 804) sends a source message to an audio playback device 602 (block 1006).” [0129-0130]). Both Sakamaki and Millington teach methods for vehicle controls based on user input. However, Millington explicitly teaches wherein volume is muted in the vehicle before the input is received, wherein the input corresponds to an increase volume command, and wherein the volume control container when presented indicates that the volume is unmuted. It would have been obvious to one of ordinary skill in the art prior to the effective filing date of the claimed invention to modify the vehicle control method of Sakamaki to also include wherein volume is muted in the vehicle before the input is received, wherein the input corresponds to an increase volume command, and wherein the volume control container when presented indicates that the volume is unmuted, as taught by Millington, with a reasonable expectation of success. Doing so improves the vehicle control method (With regard to this reasoning, see at least [Millington, 0129-0130]). Claims 20-23 are rejected under 35 U.S.C. 103 as being unpatentable over Sakamaki in view of Yanatsubo et al. (US20170113513A1), hereinafter referred to as Sakamaki and Yanatsubo respectively. 
Regarding claim 20, Sakamaki discloses the computer-based method of claim 2. Sakamaki does not explicitly teach wherein the container selected from among the multiple containers is a heating, ventilation, and air conditioning (HVAC) control container, wherein the plurality of hardware controls includes multiple hardware HVAC controls, wherein the indications in the HVAC control container include multiple HVAC values corresponding to present settings of the multiple hardware HVAC controls, and wherein the HVAC control container is presented in response to the user interaction occurring with any of the multiple hardware HVAC controls, and wherein a first control of the multiple hardware HVAC controls is dedicated to a left-side passenger of the vehicle, wherein a second control of the multiple hardware HVAC controls is dedicated to a right-side passenger of the vehicle, and wherein the first and second controls are independent of each other. However, Yanatsubo does teach wherein the container selected from among the multiple containers is a heating, ventilation, and air conditioning (HVAC) control container, wherein the plurality of hardware controls includes multiple hardware HVAC controls, wherein the indications in the HVAC control container include multiple HVAC values corresponding to present settings of the multiple hardware HVAC controls, and wherein the HVAC control container is presented in response to the user interaction occurring with any of the multiple hardware HVAC controls, and wherein a first control of the multiple hardware HVAC controls is dedicated to a left-side 
passenger of the vehicle, wherein a second control of the multiple hardware HVAC controls is dedicated to a right-side passenger of the vehicle, and wherein the first and second controls are independent of each other (“The temperature setting switch 2011 is operation means for setting the temperature inside the cabin (set temperature Tset) within a prescribed temperature range. The temperature setting switch 2011 includes two driver seat temperature setting switch 2011 a and front passenger seat temperature setting switch 2011 b as shown in FIG. 2 in order to perform independent right and left temperature adjustment.” [0035]). Both Sakamaki and Yanatsubo teach methods for vehicle controls based on user input. However, Yanatsubo explicitly teaches wherein the container selected from among the multiple containers is a heating, ventilation, and air conditioning (HVAC) control container and wherein a first control of the multiple hardware HVAC controls is dedicated to a left-side passenger of the vehicle, wherein a second control of the multiple hardware HVAC controls is dedicated to a right-side passenger of the vehicle, and wherein the first and second controls are independent of each other. It would have been obvious to one of ordinary skill in the art prior to the effective filing date of the claimed invention to modify the vehicle control method of Sakamaki to also include wherein the container selected from among the multiple containers is a heating, ventilation, and air conditioning (HVAC) control container and wherein a first control of the multiple hardware HVAC controls is dedicated to a left-side passenger of the vehicle, wherein a second control of the multiple hardware HVAC controls is dedicated to a right-side passenger of the vehicle, and wherein the first and second controls are independent of each other, as taught by Yanatsubo, with a reasonable expectation of success. 
Doing so improves the vehicle control method (With regard to this reasoning, see at least [Yanatsubo, 0035]). Regarding claim 21, Sakamaki discloses the computer-based method of claim 2. Sakamaki does not explicitly teach wherein the container selected from among the multiple containers is a heating, ventilation, and air conditioning (HVAC) control container, wherein the plurality of hardware controls includes multiple hardware HVAC controls, wherein the indications in the HVAC control container include multiple HVAC values corresponding to present settings of the multiple hardware HVAC controls, and wherein the HVAC control container is presented in response to the user interaction occurring with any of the multiple hardware HVAC controls, further comprising highlighting in the HVAC control container one of the multiple HVAC values corresponding to the one of the multiple hardware HVAC controls with which the user interaction occurs. However, Yanatsubo does teach wherein the container selected from among the multiple containers is a heating, ventilation, and air conditioning (HVAC) control container, wherein the plurality of hardware controls includes multiple hardware HVAC controls, wherein the indications in the HVAC control container include multiple HVAC values corresponding to present settings of the multiple hardware HVAC controls, and wherein the HVAC control container is presented in response to the user interaction occurring with any of the multiple hardware HVAC controls, further comprising highlighting in the HVAC control container one of the multiple HVAC values corresponding to the one of the multiple hardware HVAC controls with which the user interaction occurs (“The temperature setting notification unit 2021 notifies an occupant of the setting of the temperature (set temperature Tset). As shown in FIG. 
2, the temperature setting notification unit 2021, for example, includes a driver seat temperature setting notification unit 2021 a and a front passenger seat temperature setting notification unit 2021 b that are display units at both right and left ends of a liquid crystal panel (within the dashed-line box in the drawing). The driver seat temperature setting notification unit 2021 a is provided at the right end of the liquid crystal panel, and displays a driver seat set temperature. The front passenger seat temperature setting notification unit 2021 b is provided at the left end of the liquid crystal panel, and displays a front passenger seat set temperature.” [0051]). Both Sakamaki and Yanatsubo teach methods for vehicle controls based on user input. However, Yanatsubo explicitly teaches highlighting in the HVAC control container one of the multiple HVAC values corresponding to the one of the multiple hardware HVAC controls with which the user interaction occurs. It would have been obvious to one of ordinary skill in the art prior to the effective filing date of the claimed invention to modify the vehicle control method of Sakamaki to also include highlighting in the HVAC control container one of the multiple HVAC values corresponding to the one of the multiple hardware HVAC controls with which the user interaction occurs, as taught by Yanatsubo, with a reasonable expectation of success. Doing so improves the vehicle control method (With regard to this reasoning, see at least [Yanatsubo, 0035]). 
Regarding claim 22, Sakamaki discloses the computer-based method of claim 2. Sakamaki does not explicitly teach wherein the container selected from among the multiple containers is a heating, ventilation, and air conditioning (HVAC) control container, wherein the plurality of hardware controls includes multiple hardware HVAC controls, wherein the indications in the HVAC control container include multiple HVAC values corresponding to present settings of the multiple hardware HVAC controls, and wherein the HVAC control container is presented in response to the user interaction occurring with any of the multiple hardware HVAC controls, and wherein the HVAC control container supports simultaneous user interaction using more than one of the multiple hardware HVAC controls. However, Yanatsubo does teach wherein the container selected from among the multiple containers is a heating, ventilation, and air conditioning (HVAC) control container, wherein the plurality of hardware controls includes multiple hardware HVAC controls, wherein the indications in the HVAC control container include multiple HVAC values corresponding to present settings of the multiple hardware HVAC controls, and wherein the HVAC control container is presented in response to the user interaction occurring with any of the multiple hardware HVAC controls, and wherein the HVAC control container supports simultaneous user interaction using more than one of the multiple hardware HVAC controls (“The temperature setting switch 2011 is operation means for setting the temperature inside the cabin (set temperature Tset) within a prescribed temperature range. The temperature setting switch 2011 includes two driver seat temperature setting switch 2011 a and front passenger seat temperature setting switch 2011 b as shown in FIG. 2 in order to perform independent right and left temperature adjustment.” [0035]). Both Sakamaki and Yanatsubo teach methods for vehicle controls based on user input. 
However, Yanatsubo explicitly teaches wherein the HVAC control container supports simultaneous user interaction using more than one of the multiple hardware HVAC controls. It would have been obvious to one of ordinary skill in the art prior to the effective filing date of the claimed invention to modify the vehicle control method of Sakamaki to also include wherein the HVAC control container supports simultaneous user interaction using more than one of the multiple hardware HVAC controls, as taught by Yanatsubo, with a reasonable expectation of success. Doing so improves the vehicle control method (With regard to this reasoning, see at least [Yanatsubo, 0035]). Regarding claim 23, Sakamaki discloses the computer-based method of claim 22. Sakamaki does not explicitly teach further comprising highlighting those of the multiple HVAC values corresponding to the more than one of the multiple hardware HVAC controls. However, Yanatsubo does teach further comprising highlighting those of the multiple HVAC values corresponding to the more than one of the multiple hardware HVAC controls (“The temperature setting notification unit 2021 notifies an occupant of the setting of the temperature (set temperature Tset). As shown in FIG. 2, the temperature setting notification unit 2021, for example, includes a driver seat temperature setting notification unit 2021 a and a front passenger seat temperature setting notification unit 2021 b that are display units at both right and left ends of a liquid crystal panel (within the dashed-line box in the drawing). The driver seat temperature setting notification unit 2021 a is provided at the right end of the liquid crystal panel, and displays a driver seat set temperature. The front passenger seat temperature setting notification unit 2021 b is provided at the left end of the liquid crystal panel, and displays a front passenger seat set temperature.” [0051]). 
Both Sakamaki and Yanatsubo teach methods for vehicle controls based on user input. However, Yanatsubo explicitly teaches highlighting those of the multiple HVAC values corresponding to the more than one of the multiple hardware HVAC controls. It would have been obvious to one of ordinary skill in the art prior to the effective filing date of the claimed invention to modify the vehicle control method of Sakamaki to also include highlighting those of the multiple HVAC values corresponding to the more than one of the multiple hardware HVAC controls, as taught by Yanatsubo, with a reasonable expectation of success. Doing so improves the vehicle control method (With regard to this reasoning, see at least [Yanatsubo, 0035]). Claim 25 is rejected under 35 U.S.C. 103 as being unpatentable over Sakamaki in view of Ricci et al. (US20170075701A1), hereinafter referred to as Sakamaki and Ricci respectively. Regarding claim 25, Sakamaki discloses the computer-based method of claim 2. Sakamaki does not explicitly teach wherein while the container selected based on the received input is presented, the method further comprises receiving another input generated by another user interaction with another of the plurality of hardware controls, and in response to the other input, replacing content of the container to correspond to the other of the plurality of hardware controls, wherein the content is replaced without ceasing to present the container. However, Ricci does teach wherein while the container selected based on the received input is presented, the method further comprises receiving another input generated by another user interaction with another of the plurality of hardware controls, and in response to the other input, replacing content of the container to correspond to the other of the plurality of hardware controls, wherein the content is replaced without ceasing to present the container (“In another example, the media controller subsystem 348 applies screen magnification 
automatically to the visual content to assist the driver of the vehicle; that is, the user is the driver and larger font is easier to see than smaller font. The screen magnifier is software that interfaces with a computer's graphical output to present enlarged screen content. The simplest form of magnification presents an enlarged portion of the original screen content, the focus, so that it covers some or all of the full screen. This enlarged portion should include the content of interest to the user and the pointer or cursor, also suitably enlarged. As the user moves the pointer or cursor the screen magnifier should track with it and show the new enlarged portion.” [0758]). Both Sakamaki and Ricci teach methods for vehicle controls based on user input. However, Ricci explicitly teaches in response to the other input, replacing content of the container to correspond to the other of the plurality of hardware controls, wherein the content is replaced without ceasing to present the container. It would have been obvious to one of ordinary skill in the art prior to the effective filing date of the claimed invention to modify the vehicle control method of Sakamaki to also include in response to the other input, replacing content of the container to correspond to the other of the plurality of hardware controls, wherein the content is replaced without ceasing to present the container, as taught by Ricci, with a reasonable expectation of success. Doing so improves the vehicle control method (With regard to this reasoning, see at least [Ricci, 0758]). Claim 26 is rejected under 35 U.S.C. 103 as being unpatentable over Sakamaki in view of Zhu et al. (US20200182646A1), hereinafter referred to as Sakamaki and Zhu respectively. 
Regarding claim 26, Sakamaki discloses the computer-based method of claim 2. Sakamaki does not explicitly teach wherein while the container selected based on the received input is presented, the method further comprises receiving another input generated by another user interaction with another of the plurality of hardware controls, and in response to the other input, replacing content of the container to correspond to the other of the plurality of hardware controls, wherein replacing the content comprises ceasing to present the container, and thereafter again presenting the container to include the replaced content. However, Zhu does teach wherein while the container selected based on the received input is presented, the method further comprises receiving another input generated by another user interaction with another of the plurality of hardware controls, and in response to the other input, replacing content of the container to correspond to the other of the plurality of hardware controls, wherein replacing the content comprises ceasing to present the container, and thereafter again presenting the container to include the replaced content (“Based on the updated information, processor 106 may update and refresh the display of map portion 222 and map elements 220.” [0030]). Both Sakamaki and Zhu teach methods for vehicle controls based on user input. However, Zhu explicitly teaches wherein replacing the content comprises ceasing to present the container, and thereafter again presenting the container to include the replaced content. It would have been obvious to one of ordinary skill in the art prior to the effective filing date of the claimed invention to modify the vehicle control method of Sakamaki to also include wherein replacing the content comprises ceasing to present the container, and thereafter again presenting the container to include the replaced content, as taught by Zhu, with a reasonable expectation of success. 
Doing so improves the vehicle control method (With regard to this reasoning, see at least [Zhu, 0030]). Claim 28 is rejected under 35 U.S.C. 103 as being unpatentable over Sakamaki in view of Schubert et al. (US20180337870A1), hereinafter referred to as Sakamaki and Schubert respectively. Regarding claim 28, Sakamaki discloses the computer-based method of claim 1. Sakamaki does not explicitly teach wherein the display device comprises a touchscreen input control, and wherein the touchscreen input control is configured for controlling a setting of at least one of the vehicle systems, and wherein the container is not presented upon controlling the setting of the vehicle system using the touchscreen input control. However, Schubert does teach wherein the display device comprises a touchscreen input control, and wherein the touchscreen input control is configured for controlling a setting of at least one of the vehicle systems, and wherein the container is not presented upon controlling the setting of the vehicle system using the touchscreen input control (“In some examples, the feedback screen allows the user to change settings related to the operational safety mode (e.g., whether the detected trigger should continue to be active or inactive, selection of other triggers to be active or inactive, whitelisting certain contacts for notification output criteria during operational safety mode).” [0218-0219]). Both Sakamaki and Schubert teach methods for vehicle controls based on user input. However, Schubert explicitly teaches wherein the display device comprises a touchscreen input control, and wherein the touchscreen input control is configured for controlling a setting of at least one of the vehicle systems. 
It would have been obvious to one of ordinary skill in the art prior to the effective filing date of the claimed invention to modify the vehicle control method of Sakamaki to also include wherein the display device comprises a touchscreen input control, and wherein the touchscreen input control is configured for controlling a setting of at least one of the vehicle systems, as taught by Schubert, with a reasonable expectation of success. Doing so improves the vehicle control method (With regard to this reasoning, see at least [Schubert, 0218-0219]). Claim 29 is rejected under 35 U.S.C. 103 as being unpatentable over Sakamaki in view of Skaff et al. (US20100265050A1), hereinafter referred to as Sakamaki and Skaff respectively. Regarding claim 29, Sakamaki discloses the computer-based method of claim 1. Sakamaki does not explicitly teach wherein the display device comprises a touchscreen input control, further comprising ceasing to present the container in response to a gesture detected by the touchscreen input control. However, Skaff does teach wherein the display device comprises a touchscreen input control, further comprising ceasing to present the container in response to a gesture detected by the touchscreen input control (“Alternatively or additionally, the operator may cause the pop-up text block 106 to disappear by touching anywhere on the touch screen 68, by selecting a button adjacent the information display 66, or the like.” [0041]). Both Sakamaki and Skaff teach methods for vehicle controls based on user input. However, Skaff explicitly teaches ceasing to present the container in response to a gesture detected by the touchscreen input control. 
It would have been obvious to one of ordinary skill in the art prior to the effective filing date of the claimed invention to modify the vehicle control method of Sakamaki to also include ceasing to present the container in response to a gesture detected by the touchscreen input control, as taught by Skaff, with a reasonable expectation of success. Doing so improves the vehicle control method (With regard to this reasoning, see at least [Skaff, 0041]).

Claims 30-33 and 35 are rejected under 35 U.S.C. 103 as being unpatentable over Sakamaki in view of Jahns et al. (US20210237576A1), hereinafter referred to as Sakamaki and Jahns respectively.

Regarding claim 30, Sakamaki discloses the computer-based method of claim 1. Sakamaki does not explicitly teach wherein the computer system presents content on the display device, and wherein the container is presented on top of at least some of the content. However, Jahns does teach wherein the computer system presents content on the display device, and wherein the container is presented on top of at least some of the content (“According to another aspect, an on-screen gauge that changes from a normal state to a warning state may not require use of the dynamic container 236 to be displayed in the warning state.” [0076] and “The popup notification 500 as shown in FIG. 5A is illustrative of a popup notification associated with an out-of-parameter state gauge, the popup notification 500 b shown in FIG. 5B is illustrative of a popup notification associated with a warning state gauge, the popup notification 500 c shown in FIG. 5C is illustrative of a popup notification associated with a warning state gauge, and the popup notification 500 d shown in FIG. 5D is illustrative of a popup notification associated with a warning state gauge that may be selected for display over another warning state gauge based on a priority level/message severity classification determined based on safety relevance, operational relevance, and time.” [0079]).
Both Sakamaki and Jahns teach methods for vehicle controls based on user input. However, Jahns explicitly teaches wherein the computer system presents content on the display device, and wherein the container is presented on top of at least some of the content. It would have been obvious to one of ordinary skill in the art prior to the effective filing date of the claimed invention to modify the vehicle control method of Sakamaki to also include wherein the computer system presents content on the display device, and wherein the container is presented on top of at least some of the content, as taught by Jahns, with a reasonable expectation of success. Doing so improves the vehicle control method (With regard to this reasoning, see at least [Jahns, 0076]).

Regarding claim 31, Sakamaki discloses the computer-based method of claim 30. Sakamaki does not explicitly teach wherein the content includes modal content, and wherein the container is presented below the modal content. However, Jahns does teach wherein the content includes modal content, and wherein the container is presented below the modal content (“The popup notification 500 as shown in FIG. 5A is illustrative of a popup notification associated with an out-of-parameter state gauge, the popup notification 500 b shown in FIG. 5B is illustrative of a popup notification associated with a warning state gauge, the popup notification 500 c shown in FIG. 5C is illustrative of a popup notification associated with a warning state gauge, and the popup notification 500 d shown in FIG. 5D is illustrative of a popup notification associated with a warning state gauge that may be selected for display over another warning state gauge based on a priority level/message severity classification determined based on safety relevance, operational relevance, and time.” [0079]). Both Sakamaki and Jahns teach methods for vehicle controls based on user input.
However, Jahns explicitly teaches wherein the content includes modal content, and wherein the container is presented below the modal content. It would have been obvious to one of ordinary skill in the art prior to the effective filing date of the claimed invention to modify the vehicle control method of Sakamaki to also include wherein the content includes modal content, and wherein the container is presented below the modal content, as taught by Jahns, with a reasonable expectation of success. Doing so improves the vehicle control method (With regard to this reasoning, see at least [Jahns, 0076]).

Regarding claim 32, Sakamaki discloses the computer-based method of claim 1. Sakamaki does not explicitly teach wherein presenting the container comprises presenting an animation wherein the container performs a move out onto the display device from an edge of the display device, and wherein the move takes a predefined time. However, Jahns does teach wherein presenting the container comprises presenting an animation wherein the container performs a move out onto the display device from an edge of the display device, and wherein the move takes a predefined time (“In some examples, in order to maintain spatial locations for the gauges already on the screen, the other containers in a third gauge zone 204 a,b may collapse into a smaller area (e.g., transitioned into a compact or smaller version) and allow for an additional gauge (e.g., out-of-parameter or warning state gauge) to appear below them in the dynamic container 218 a,b. When the dynamic container 218 disappears, compacted gauges may transition back to their normal (longer) version.” and “For example, when a hidden gauge, such as gauge 314 f, is brought onto the screen 128 when it goes into an out-of-parameter state, a dynamic animation may be used to call attention to it.” [0069]). Both Sakamaki and Jahns teach methods for vehicle controls based on user input.
However, Jahns explicitly teaches wherein presenting the container comprises presenting an animation wherein the container performs a move out onto the display device from an edge of the display device, and wherein the move takes a predefined time. It would have been obvious to one of ordinary skill in the art prior to the effective filing date of the claimed invention to modify the vehicle control method of Sakamaki to also include wherein presenting the container comprises presenting an animation wherein the container performs a move out onto the display device from an edge of the display device, and wherein the move takes a predefined time, as taught by Jahns, with a reasonable expectation of success. Doing so improves the vehicle control method (With regard to this reasoning, see at least [Jahns, 0076]).

Regarding claim 33, Sakamaki discloses the computer-based method of claim 32. Sakamaki does not explicitly teach wherein a signal from the one of the plurality of hardware controls with which the user interaction occurs is active during the move. However, Jahns does teach wherein a signal from the one of the plurality of hardware controls with which the user interaction occurs is active during the move (“According to an aspect, the instrument cluster UI engine 104 is illustrative of a software module, system, or device that is operative or configured to receive various signal inputs from a plurality of data sources and provide the flexible and variability-accommodating instrument cluster 106 for display on the display screen 128 included in the vehicle 102.” [0041] and “The plurality of data sources may include any suitable data source, unit, or sensor operative to provide various data or signaling information that may be used by the instrument cluster UI engine 104 to provide vehicle status-related information via the instrument cluster 106.
The plurality of data sources can include, but are not limited to, a vehicle mode data source 108, a gearbox data source 110, an engine state data source 112, a warning and notification manager 114, a speed control function data source 116, a vehicle information data source 118, a navigation data source 120, and steering wheel switch (SWS) infotainment and display actuation data sources 122, 124 (e.g., via a scroll wheel, dial, or other actuator (referred to herein as a cluster control 122)).” [0044]). Both Sakamaki and Jahns teach methods for vehicle controls based on user input. However, Jahns explicitly teaches wherein a signal from the one of the plurality of hardware controls with which the user interaction occurs is active during the move. It would have been obvious to one of ordinary skill in the art prior to the effective filing date of the claimed invention to modify the vehicle control method of Sakamaki to also include wherein a signal from the one of the plurality of hardware controls with which the user interaction occurs is active during the move, as taught by Jahns, with a reasonable expectation of success. Doing so improves the vehicle control method (With regard to this reasoning, see at least [Jahns, 0076]).

Regarding claim 35, Sakamaki discloses the computer-based method of claim 1. Sakamaki does not explicitly teach changing a reading layout of the container. However, Jahns does teach further comprising changing a reading layout of the container (“The dynamic content zone 234 may include specific content unique to the card 228, which may include gauges, custom setup options, ADAS features, TPMS, menu options, and/or trip information.” [0057]). Both Sakamaki and Jahns teach methods for vehicle controls based on user input. However, Jahns explicitly teaches changing a reading layout of the container.
It would have been obvious to one of ordinary skill in the art prior to the effective filing date of the claimed invention to modify the vehicle control method of Sakamaki to also include changing a reading layout of the container, as taught by Jahns, with a reasonable expectation of success. Doing so improves the vehicle control method (With regard to this reasoning, see at least [Jahns, 0076]).

Conclusion

THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any extension fee pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to AHMED ALKIRSH, whose telephone number is (703) 756-4503. The examiner can normally be reached M-F 9:00 am-5:00 pm EST. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, FADEY JABR, can be reached at (571) 272-1516. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
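The reply-period rules above (three-month shortened statutory period, six-month absolute cutoff, and the advisory-action exception for a first reply filed within two months) can be sketched as a small date calculation. This is a minimal, simplified illustration only, not USPTO tooling: the function names are invented for this sketch, and the arithmetic uses plain calendar months without accounting for weekends, federal holidays, or other docketing rules.

```python
from datetime import date
from typing import Optional, Tuple
import calendar

def add_months(d: date, months: int) -> date:
    """Add calendar months, clamping to the last day of the target month."""
    month_index = d.month - 1 + months
    year = d.year + month_index // 12
    month = month_index % 12 + 1
    day = min(d.day, calendar.monthrange(year, month)[1])
    return date(year, month, day)

def reply_window(mailed: date,
                 first_reply: Optional[date] = None,
                 advisory_mailed: Optional[date] = None) -> Tuple[date, date]:
    """Simplified sketch of the final-action reply periods described above:
    - shortened statutory period: THREE MONTHS from the mailing date;
    - absolute statutory cutoff: SIX MONTHS from the mailing date;
    - if a first reply was filed within TWO MONTHS and the advisory action
      is mailed after the three-month date, extension fees under
      37 CFR 1.136(a) run from the advisory action's mailing date.
    """
    ssp = add_months(mailed, 3)
    cutoff = add_months(mailed, 6)
    if (first_reply is not None and advisory_mailed is not None
            and first_reply <= add_months(mailed, 2)
            and advisory_mailed > ssp):
        ssp = advisory_mailed
    return ssp, cutoff

# The final action in this case was mailed Dec 19, 2025.
ssp, cutoff = reply_window(date(2025, 12, 19))
```

With no early reply, this yields a three-month date of March 19, 2026 and a six-month cutoff of June 19, 2026; passing a hypothetical early `first_reply` and a later `advisory_mailed` date shifts the fee-calculation date to the advisory mailing date, per the rule quoted above.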
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/AA/
Examiner, Art Unit 3668

/Fadey S. Jabr/
Supervisory Patent Examiner, Art Unit 3668

Prosecution Timeline

Apr 25, 2024
Application Filed
Sep 05, 2025
Non-Final Rejection — §101, §102, §103
Dec 08, 2025
Response Filed
Dec 19, 2025
Final Rejection — §101, §102, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12578724
Detection of Anomalous Trailer Behavior
2y 5m to grant Granted Mar 17, 2026
Patent 12410589
METHODS AND SYSTEMS FOR IMPLEMENTING A LOCK-OUT COMMAND ON LEVER MACHINES
2y 5m to grant Granted Sep 09, 2025
Patent 12403908
NON-SELFISH TRAFFIC LIGHTS PASSING ADVISORY SYSTEMS
2y 5m to grant Granted Sep 02, 2025
Patent 12370903
METHOD FOR TORQUE CONTROL OF ELECTRIC VEHICLE ON SLIPPERY ROAD SURFACE, AND TERMINAL DEVICE
2y 5m to grant Granted Jul 29, 2025
Patent 12325450
SYSTEMS AND METHODS FOR GENERATING MULTILEVEL OCCUPANCY AND OCCLUSION GRIDS FOR CONTROLLING NAVIGATION OF VEHICLES
2y 5m to grant Granted Jun 10, 2025
Based on this examiner's 5 most recent grants.


Prosecution Projections

3-4
Expected OA Rounds
54%
Grant Probability
99%
With Interview (+53.7%)
3y 0m
Median Time to Grant
Moderate
PTA Risk
Based on 43 resolved cases by this examiner. Grant probability derived from career allow rate.
