Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Information Disclosure Statement
The information disclosure statements (IDS) submitted on 12/27/2024 and 11/10/2025 were filed before the first action on the merits of the application. The submissions are in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statements are being considered by the examiner.
Specification
The title of the invention is not descriptive. A new title is required that is clearly indicative of the invention to which the claims are directed.
The following title is suggested: Vehicle Interior Control Method, Device, and Non-Transitory Computer-Readable Medium for Enhanced Immersive Entertainment.
The current title, while encompassing the invention, is so generalized that it is not clearly indicative of the claimed subject matter. Modern vehicles have many disparate computerized systems; as such, the current title is applicable to nearly any component or control of a vehicle, e.g., a suspension system, headlight controls, engine timing, etc. Therefore, while encompassing the invention, the title is not clearly indicative of the claimed invention.
The suggested title makes clearer that the claimed invention is specifically directed to controlling the interior of the vehicle to enhance entertainment immersion.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claim 17 is rejected under 35 U.S.C. 101 because the claimed invention is directed to non-statutory subject matter. The claim(s) does/do not fall within at least one of the four categories of patent eligible subject matter because:
Claim 17 is directed to a “computer program product” (i.e., software). Pure software does not fall within the statutory categories of patent-eligible subject matter; as such, claim 17 is rejected as being directed to software per se.
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claim(s) 1-3, 7-10,12, 15-18, and 20 is/are rejected under 35 U.S.C. 102(a)(1) as being anticipated by DE102018208774A1, “Method For Controlling At Least One Actuator In At Least Two Motor Vehicles, Transmitting And Control Device, And Motor Vehicle”, Unger et al. (Cited in IDS, machine translation of spec with paragraph numbers provided as NPL with this office action)
Regarding Claim 1, Unger et al teaches “A method for controlling a vehicle, comprising: obtaining target scene information of a target video and time range information corresponding to the target scene information;” ([0009] “The invention is based on the idea of providing a transmitting and control device that is operated externally to the motor vehicle and synchronizes several motor vehicles with media content, i.e., output content, that is output externally to the motor vehicle.
In other words, the transmitting and control device not only synchronizes the output of the media content, for example a video or a sequence of images from a film, but also synchronizes a large number of motor vehicles.
Crucial here is the coordinated transmission of the "control tracks", i.e. the digital information for one or more actuator actions, to the actuators in the vehicles.” Here Unger teaches obtaining control tracks (target scene information), and the synchronization with the movies teaches the time range information); “and controlling a target device on the vehicle to run in a running state matching with the target scene information within a time range corresponding to the target scene information during a process of playing the target video” (); “and wherein the target device comprises at least one of: an air-conditioner, a seat heating device, an air suspension system, a seat vibrating device, or an electronic fragrance diffuser.” ([0012] “An actuator is understood to be a device or device component that is designed and configured to convert an electrical signal, for example a command originating from the transmitting and control device, into a mechanical movement and/or into another physical quantity, for example pressure and/or temperature.
Examples of actuators include a vehicle ventilation system, a motorized vehicle seat, and a motorized system for generating a rotational movement of the vehicle around an axis of a vehicle-fixed coordinate system.” The actuators include a ventilation system (air-conditioner), a motorized seat (seat vibration), and suspension (air suspension system) control. From [0015] it is known that the ventilation control = AC, which includes both temperature and strength modulation, and from [0054] the seat is known to include haptic/vibration feedback/control.)
Regarding Claim 2, Unger et al teaches “The method according to claim 1, wherein the target scene information comprises high-temperature scene information, and the target device comprises at least one of the air-conditioner or the seat heating device; and controlling the target device on the vehicle to run in the running state matching with the target scene information within the time range corresponding to the target scene information comprises at least one of: controlling the air-conditioner to heat an in-vehicle environment within a time range corresponding to the high-temperature scene information, or controlling the seat heating device to be turned on within the time range corresponding to the high-temperature scene information.” ([0015] “The transmitting and control device provides at least one time signal that describes the temporal dependency of the action to be performed on the output of the output content.
In other words, the time signal describes a temporal dependency of the action to be performed on a progression of the output content. If, for example, the sample film file describes a film scene in a desert, then the action to be performed could be, for example, switching on the ventilation and setting the air temperature to 28 degrees Celsius, with the time signal specifying that these functions should be activated at the beginning of the desert scene.”)
Regarding Claim 3, Unger et al teaches “The method according to claim 2, wherein controlling the air-conditioner to heat the in-vehicle environment comprises: controlling the air-conditioner to heat the in-vehicle environment so as to increase a temperature of the in-vehicle environment by a first preset ratio” ([0015] “…In other words, the time signal describes a temporal dependency of the action to be performed on a progression of the output content. If, for example, the sample film file describes a film scene in a desert, then the action to be performed could be, for example, switching on the ventilation and setting the air temperature to 28 degrees Celsius, with the time signal specifying that these functions should be activated at the beginning of the desert scene.” Here Unger teaches heating the vehicle to reach a goal temperature, i.e., by a preset ratio based on the video content) “without exceeding a first set temperature” ([0050] “If a performance limit is reached due to a maximum temperature, this can, for example, enable temperature protection for the individual control units of the motor vehicle and/or a receiving and control unit 14.” Here Unger teaches that a maximum temperature limit is set (and not exceeded) to protect the actuators from overheating.)
Regarding Claim 7, Unger et al teaches “The method according to claim 1, wherein the target scene information comprises wind scene information, and the target device comprises the air-conditioner; and controlling the target device on the vehicle to run in the running state matching with the target scene information within the time range corresponding to the target scene information comprises: controlling the air-conditioner to run in a higher air blowing gear within a time range corresponding to the wind scene information.” ([0035] “One of the exemplary actuators 20 of the respective motor vehicle 10 can be, for example, a ventilation system or a blower, i.e., an air conditioning system with a ventilation system designed to set a climatic parameter, for example, a temperature and/or an airflow strength and/or a humidity level. Such an actuator can, for example, be activated simultaneously with a film scene of an explosion or a scene in a desert, for example by simultaneously blowing hot air into the vehicle 10 at a high setting. This allows a wind or pressure wave from an example explosion - or a desert wind - from the example film to be simulated, which the user can perceive.”)
Regarding Claim 8, Unger et al teaches “The method according to claim 1, wherein the target scene information comprises vibration scene information, and the target device comprises at least one of the air suspension system or the seat vibrating device; and controlling the target device on the vehicle to run in the running state matching with the target scene information within the time range corresponding to the target scene information comprises at least one of: controlling the air suspension system to enable a vehicle body to perform at least one of swinging or vibrating within a time range corresponding to the vibration scene information, or controlling the seat vibrating device to be turned on within the time range corresponding to the vibration scene information.”([0037] “An additional or alternative actuator 20 may be provided, which may, for example, be an electromechanical system of a motor vehicle seat 26, wherein such an actuator 20 may, for example, perform a shaking movement of the seat “ + [0070] “…The generated control tracks can then be provided to the actuators 20, for example an active suspension, a ventilation system, a seat vibration system and/or an interior lighting system.” Here teaches that the control tracks (target scene information) can instruct the suspension and/or seat vibrations to match the video/movie being played)
Regarding Claim 9, Unger et al teaches “The method according to claim 8, wherein controlling the seat vibrating device to be turned on comprises: controlling a seat vibrating device of a seat where a user is seated to be turned on.” ([0070] “The generated control tracks can then be provided to the actuators 20, for example an active suspension, a ventilation system, a seat vibration system and/or an interior lighting system.” The control track includes seat vibrations (turning on a seat vibration system))
Regarding Claim 10, Unger et al teaches “The method according to claim 1, wherein the target device further comprises a first ambient light; and controlling the target device on the vehicle to run in the running state matching with the target scene information within the time range corresponding to the target scene information comprises: controlling the first ambient light to run in a display mode matching with the target scene information within the time range corresponding to the target scene information.” ([0038] “…Furthermore, the motor vehicle 10 may optionally have an actuator 20, which may be designed as a lighting device, i.e. as a device or group of devices for illuminating the interior of the motor vehicle and/or an external environment of the motor vehicle 10.” The control of actuators (based on the control track) includes ambient lighting control of both the exterior and/or interior lighting (first and second ambient lights).)
Regarding Claim 12, Unger et al teaches “The method according to claim 1, further comprising: determining video type information of the target video; and controlling a second ambient light to run in a display mode matching with the video type information during the process of playing the target video.” ([0038] “…Furthermore, the motor vehicle 10 may optionally have an actuator 20, which may be designed as a lighting device, i.e. as a device or group of devices for illuminating the interior of the motor vehicle and/or an external environment of the motor vehicle 10.” The control of actuators (based on the control track) includes ambient lighting control of both the exterior and/or interior lighting (first and second ambient lights).)
Regarding Claim 15, it is a vehicle equivalent to the method claim 1. It has the same grounds of rejection as claim 1.
Regarding Claim 16, it is a non-transitory computer-readable medium which, when executed by a processor, performs the method of claim 1; it has the same grounds of rejection as claim 1.
Regarding Claim 17, it is a computer program equivalent to claim 1’s method. It has the same grounds of rejection.
Regarding Claim 18, it is a vehicle equivalent to the method claim 2, it has the same grounds of rejection.
Regarding Claim 20, it is a vehicle equivalent to method claim 7, it has the same grounds of rejection.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claim(s) 4 and 13 is/are rejected under 35 U.S.C. 103 as being unpatentable over Unger et al as applied to claims 2 and 1 above, and further in view of US 20200283005 A1, “EMBEDDED IN-VEHICLE PLATFORM FOR PROVIDING IMMERSIVE USER EXPERIENCES”, Shah et al.
Regarding Claim 4, Unger et al does not teach a seat-based heater and control thereof as part of the 4D experience. However, it does generally teach that actuators include temperature modulation ([0012] “…An actuator is understood to be a device or device component that is designed and configured to convert an electrical signal, for example a command originating from the transmitting and control device, into a mechanical movement and/or into another physical quantity, for example pressure and/or temperature”)
Shah et al teaches a similar piece of prior art, in which the interior of the cabin of a vehicle is controlled to provide an enhanced viewing experience (of ads) to the occupants (“Systems and methods are disclosed for an embedded in-vehicle platform that provides immersive user experiences. An example method includes obtaining, by a vehicle controller of a vehicle, a preference profile for a user; loading the preference profile by the vehicle controller; determining a shopping experience of the user occurring on a vehicle shopping platform within the vehicle; determining an ambience profile of a merchant identified in the shopping experience; and controlling, by the vehicle controller, an in-vehicle ambience within the vehicle based on the ambience profile and the preference profile.”), which includes activating/controlling seat heaters to enhance the currently viewed video ([0027] “The climate control system 130 allows a user to select a temperature within the vehicle 104 as well as control other aspects of climate such as seat heating or cooling. In some embodiments, the climate control system 130 can be controlled through use of the ambience profile and/or the preference profile to increase or decrease temperature or airspeed within the vehicle. In one example use case, the merchant provides vacation services, and the ambience profile includes instructions that cause the entertainment or infotainment system 124 to play a video of a beach scene while the climate control system 130 causes the temperature within the vehicle to be set at 82 degrees, and an airspeed is set to high to mimic a beach environment. The vehicle controller 112 can also cause the climate control system 130 to activate heaters in the seat where the user is located using the seat sensors and/or components 138. As discussed below, the ambience profile can also include instructions that cause a scent dispenser within the vehicle to dispense a tropical scent.
These facets collectively create an immersive experience for users, enticing them into purchasing a vacation.”)
It would have been obvious to one of ordinary skill in the art, before the effective filing date of the application, to modify Unger et al to include controlling of seat-based heaters to match the temperature of the scene (e.g., activate for a hot scene) as taught by Shah et al. One would be motivated to implement the seat heater control to provide an additional sensation source for the passengers, improving their immersion in the scene. Shah teaches this motivation in ([0027] “… In one example use case, the merchant provides vacation services, and the ambience profile includes instructions that cause the entertainment or infotainment system 124 to play a video of a beach scene while the climate control system 130 causes the temperature within the vehicle to be set at 82 degrees, and an airspeed is set to high to mimic a beach environment. The vehicle controller 112 can also cause the climate control system 130 to activate heaters in the seat where the user is located using the seat sensors and/or components 138. As discussed below, the ambience profile can also include instructions that cause a scent dispenser within the vehicle to dispense a tropical scent. These facets collectively create an immersive experience for users, enticing them into purchasing a vacation.”)
Regarding Claim 13, Unger does not teach the use of smell as an actuator/part of the 4D experience.
Shah et al teaches a similar piece of prior art, in which the interior of the cabin of a vehicle is controlled to provide an enhanced viewing experience (of ads) to the occupants (“Systems and methods are disclosed for an embedded in-vehicle platform that provides immersive user experiences. An example method includes obtaining, by a vehicle controller of a vehicle, a preference profile for a user; loading the preference profile by the vehicle controller; determining a shopping experience of the user occurring on a vehicle shopping platform within the vehicle; determining an ambience profile of a merchant identified in the shopping experience; and controlling, by the vehicle controller, an in-vehicle ambience within the vehicle based on the ambience profile and the preference profile.”), which includes activating/controlling a scent dispenser to enhance the currently viewed video ([0027] “The climate control system 130 allows a user to select a temperature within the vehicle 104 as well as control other aspects of climate such as seat heating or cooling. In some embodiments, the climate control system 130 can be controlled through use of the ambience profile and/or the preference profile to increase or decrease temperature or airspeed within the vehicle. In one example use case, the merchant provides vacation services, and the ambience profile includes instructions that cause the entertainment or infotainment system 124 to play a video of a beach scene while the climate control system 130 causes the temperature within the vehicle to be set at 82 degrees, and an airspeed is set to high to mimic a beach environment. The vehicle controller 112 can also cause the climate control system 130 to activate heaters in the seat where the user is located using the seat sensors and/or components 138. As discussed below, the ambience profile can also include instructions that cause a scent dispenser within the vehicle to dispense a tropical scent.
These facets collectively create an immersive experience for users, enticing them into purchasing a vacation.”)
It would have been obvious to one of ordinary skill in the art, before the effective filing date of the application, to modify Unger et al to include controlling of a scent dispenser to match the scene (e.g., dispense a tropical scent for a beach scene) as taught by Shah et al. One would be motivated to implement the scent dispenser control to provide an additional sensation source for the passengers, improving their immersion in the scene. Shah teaches this motivation in ([0027] “… In one example use case, the merchant provides vacation services, and the ambience profile includes instructions that cause the entertainment or infotainment system 124 to play a video of a beach scene while the climate control system 130 causes the temperature within the vehicle to be set at 82 degrees, and an airspeed is set to high to mimic a beach environment. The vehicle controller 112 can also cause the climate control system 130 to activate heaters in the seat where the user is located using the seat sensors and/or components 138. As discussed below, the ambience profile can also include instructions that cause a scent dispenser within the vehicle to dispense a tropical scent. These facets collectively create an immersive experience for users, enticing them into purchasing a vacation.”)
Claim(s) 5 and 19 is/are rejected under 35 U.S.C. 103 as being unpatentable over Unger et al as applied to claims 1 and 15 above, taken alone.
Regarding Claim 5, while Unger teaches modulating the temperature via the ventilation system (AC system) of the vehicle to match the video scene ([0035] “One of the exemplary actuators 20 of the respective motor vehicle 10 can be, for example, a ventilation system or a blower, i.e., an air conditioning system with a ventilation system designed to set a climatic parameter, for example, a temperature and/or an airflow strength and/or a humidity level.”), it does not explicitly teach an example of a “low-temperature scene” and the corresponding reduction of temperature.
That being said, Unger teaches the underlying concept of matching the vehicle temperature to correspond with the scene’s temperature ([0035] “One of the exemplary actuators 20 of the respective motor vehicle 10 can be, for example, a ventilation system or a blower, i.e., an air conditioning system with a ventilation system designed to set a climatic parameter, for example, a temperature and/or an airflow strength and/or a humidity level.” Here, setting a climatic parameter under its plain meaning would include raising or lowering the temperature) and gives an explicit example of a “high-temperature” scene such as a desert ([0035] “Such an actuator can, for example, be activated simultaneously with a film scene of an explosion or a scene in a desert, for example by simultaneously blowing hot air into the vehicle 10 at a high setting.” Here Unger teaches the underlying principle of setting the AC system air temperature to correspond to the scene.)
As such, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the application, to include lowering the temperature for corresponding scenes (e.g., during an arctic exploration or winter scene); this modulation for a “low-temperature” scene is a logical extension of the principles taught by the high-temperature desert scene.
Claim 19 is a vehicle equivalent to the method claim 5 above, it has the same grounds of rejection as claim 5.
Claim(s) 6 is/are rejected under 35 U.S.C. 103 as being unpatentable over Unger et al as applied to claim 5 above, and further in view of US 20220396279 A1, “AUTONOMOUS OR SEMI-AUTONOMOUS VEHICLE ELECTRONIC ARCHITECTURE”, Chen.
Regarding Claim 6, Unger (as modified in claim 5 above) teaches “wherein controlling the air-conditioner to cool the in-vehicle environment comprises: controlling the air-conditioner to cool the in-vehicle environment so as to reduce a temperature of the in-vehicle environment by a second preset ratio” (as modified in claim 5, reducing the temperature to correspond to a cold scene (such as a winter scene or arctic expedition) is a logical extension of the teachings of [0035]).
It however does not teach “without being less than a second set temperature.”
Chen teaches a vehicle interior immersion control system ([0027]) which includes setting a temperature range within which the vehicle’s temperature will remain as part of an immersive experience ([0076]-[0078] “In one approach, a difference between comfort settings and an occupant preference satisfies the change threshold when the temperature of the vehicle differs from what is preferred by the occupant, a seat/steering wheel temperature differs by what is preferred by the occupant,…” [0077] “Responsive to determining that the difference satisfies the change threshold, the control module 220 adjusts the vehicle settings as discussed at 540. Otherwise, when the difference between the vehicle settings and occupant preference do not satisfy the change threshold (i.e., when the vehicle settings are within a tolerable range of the occupant preference and/or the vehicle settings do not invoke physiological reactions indicative of a stressed/distracted mental state), the control module 220 continues to identify the vehicle settings as discussed at 520.”
[0078] “At 540, responsive to determining that vehicle settings satisfy the change threshold, the control module 220, in one embodiment, adjusts the vehicle settings according to the occupant preference.” In [0076]-[0078] Chen teaches modulating to the required temperature; however, it limits the modulation to remaining within the tolerable range, i.e., the temperature will not be less than the second threshold (the lower bound of the range))
It would have been obvious to one of ordinary skill in the art, before the effective filing date of the application, to modify Unger to include the occupant-tolerable temperature range setting of Chen as part of the immersive temperature modulation. One would be motivated to implement the tolerable temperature range (and limiting changes to within that range) to improve individual occupant comfort and safety. (Implicitly, from the phrase “tolerable range” in [0077] it is known that temperatures outside this range are “intolerable”, i.e., unsafe or uncomfortable for the occupants.)
Claim(s) 11 is/are rejected under 35 U.S.C. 103 as being unpatentable over Unger et al as applied to claim 1 above, and further in view of US 10860620 B1, Martel, “Associating Physical Items With Content”.
Regarding Claim 11, while Unger teaches changing/controlling ambient lighting ([0038] “…Furthermore, the motor vehicle 10 may optionally have an actuator 20, which may be designed as a lighting device, i.e. as a device or group of devices for illuminating the interior of the motor vehicle and/or an external environment of the motor vehicle 10.”), it does not specifically teach that the lighting control matches a dominant hue of the video scene/picture.
Martel teaches a similar ambient environment control system (applied to a home interior as opposed to a vehicle cabin) in which the ambient lighting is controlled to match the (dominant) hue of the currently viewed movie (Columns 9-10, lines 44-08 “…In some implementations, the computing device 120 or another device that is configured to perform some or all of the processes described above in reference to the computing device 120 may, in addition to retrieving and providing content 140, perform one or more processes in response to receiving the physical item identifier 109. …in this way, the computing device 120 may instruct various devices within the household of user 102 (e.g., lighting fixtures, sound systems, televisions, home entertainment systems, set-top boxes, etc.) to perform specific operations such that the ambience of the household of user 102 is adjusted when user 102 is provided with content 140 for an enhanced and more immersive user experience. For example, the computing device 120 may provide commands to dim household lighting, change the color of household lighting to match tones and hues of content 140, play songs or ambient sound through speakers within the household, display social media streams or live video streams on a television within the household, and the like. The commands associated with each file or piece of content maintained by the computing device 120 may be determined based on attributes of with each file or piece of content, determined based on metadata associated with each file or piece of content, specified by user 102 through interaction with an application running on a client device that communicates with computing device 120, such as user device 106, or a combination thereof.” Here Martel teaches, based on the attributes of the content (movie/video), matching the hue of the ambient lighting (the lighting fixtures of the house))
It would have been obvious to one of ordinary skill in the art, before the effective filing date of the application, to modify Unger to include determining the video (content’s) dominant color hue and matching the interior lighting to it as taught by the immersion system of Martel. One would be motivated to implement the hue matching of the ambient lighting to further improve the immersion of the user. (Martel teaches this improvement as cited above: “…such that the ambience of the household of user 102 is adjusted when user 102 is provided with content 140 for an enhanced and more immersive user experience…”)
Claim(s) 14 is/are rejected under 35 U.S.C. 103 as being unpatentable over Unger et al as applied to claim 1 above, and further in view of US 20250313218 A1, “VEHICLE COCKPIT LINKAGE METHODS, EQUIPMENT, MEDIA AND COMPUTER PROGRAM PRODUCTS”, Xia et al.
Regarding Claim 14, Unger et al does not teach that the target scene information and time range information are detected in real-time “based on a scene identification model.” Instead, it teaches that the corresponding controls (target scene information and time range information) are a “track” (i.e., a predetermined sequence) akin to the video and audio tracks commonly used in media.
Xia et al teaches a vehicle ambient environment control system in which the corresponding ambient environment controls (target scene information and time range information) ([0026] “The coordinated control unit 108 is any assembly within the cockpit of the vehicle 102 that is capable of engaging in sensory interaction with the user. For example, the coordinated control unit 108 may comprise ambient lighting, audio, an air conditioner, seats, fragrance, and the like within the cockpit.”), instead of being predetermined, are detected in real-time by matching the currently displayed video information against templates ([0034] “In some embodiments, in order to trigger vehicle cockpit coordinated control on the basis of the application screen and the user interaction, it may be determined whether the application screen matches a target scenario. Then, in response to the application screen matching the target scenario, the vehicle cockpit coordinated control may be triggered on the basis of the application screen and the user interaction. In some embodiments, in order to determine whether the application screen matches the target scenario, a template for the target scenario may be acquired, and the template comprises a target feature corresponding to the target scenario.” Here Xia teaches that when the currently detected scene matches a template, a corresponding “coordinated control” is performed (which from [0028] is known to include changing the lighting, temperature, or scents of the vehicle interior), which is/can be done via machine learning algorithms (i.e., via a “scene identification model”) ([0064] “The computing unit 901 may be various general-purpose and/or dedicated processing components having processing and computing capabilities.
Some examples of the computing unit 901 include, but are not limited to, a central processing unit (CPU), a graphics processing unit (GPU), various dedicated artificial intelligence (AI) computation chips, various computing units running machine learning model algorithms, a digital signal processor (DSP), and any suitable processor, controller, microcontroller, etc. The computing unit 901 executes various methods and processes described above, such as the method 200. For example, in some embodiments,”)
It would have been obvious to one of ordinary skill in the art, before the effective filing date of the application, to modify Unger to substitute the predetermined “track” for the cabin interior environment controls/modulation with the template/scenario matching taught by Xia (i.e., to implement the real-time “scene identification model”). One would be motivated to implement the template matching system of Xia to improve the versatility of the system by allowing it to operate with a large variety of movies and media, as opposed to each individual piece of media needing a corresponding predetermined “track”, reducing the cost to create/operate media with the enhanced immersion/environment. Xia teaches this improvement/motivation in (“In this way, the scenario of the application and the interactive object can be determined on the basis of the application screen and the user interaction, without invoking the data interfaces from the application provider. Therefore, this enables the in-vehicle system provider to independently implement vehicle cockpit coordinated control, thereby reducing costs. In addition, compared with solutions that use artificial intelligence-based object detection technologies, user interaction data may be used to assist in determining interactive objects on the screen, thereby improving processing speed, reducing response time, enhancing versatility, and reducing the costs consumed for training the artificial intelligence models.”)
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure: CN 115285047 A; US 20190101976 A1.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to KENNETH MICHAEL DUNNE whose telephone number is (571)270-7392. The examiner can normally be reached Mon-Thurs 8:30-6:30.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Navid Z Mehdizadeh can be reached at (571) 272-7691. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/KENNETH M DUNNE/Primary Examiner, Art Unit 3669