Prosecution Insights
Last updated: April 19, 2026
Application No. 18/356,820

Vehicle Wheel-Based Computing Device

Status: Non-Final OA (§103)
Filed: Jul 21, 2023
Examiner: CHEN, FRANK S
Art Unit: 2611
Tech Center: 2600 — Communications
Assignee: Mercedes-Benz Group AG
OA Round: 1 (Non-Final)
Grant Probability: 82% (Favorable)
OA Rounds: 1-2
To Grant: 2y 2m
With Interview: 91%

Examiner Intelligence

Career Allow Rate: 82% (539 granted / 657 resolved; +20.0% vs TC avg) — above average
Interview Lift: +8.8% on resolved cases with interview (moderate, roughly +9%)
Avg Prosecution: 2y 2m — fast prosecutor
Career History: 681 total applications across all art units (24 currently pending)
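The career figures above are internally consistent, which is worth a quick check. A minimal sketch, using only the values copied from this report (none of this is official USPTO output):

```python
# Figures copied from the Examiner Intelligence panel above.
granted, resolved = 539, 657
print(f"career allow rate: {100 * granted / resolved:.1f}%")  # 82.0%

# 681 total applications minus 657 resolved leaves exactly the
# 24 applications reported as currently pending.
total, pending = 681, 24
print(f"resolved share: {resolved}/{total} = {100 * resolved / total:.1f}%")
print(f"pending check: {total} - {resolved} = {total - resolved}")
```

The resolved and pending counts sum to the career total, so the 82% allow rate is computed over the examiner's entire closed docket.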

Statute-Specific Performance

§101: 10.1% (-29.9% vs TC avg)
§103: 55.9% (+15.9% vs TC avg)
§102: 4.8% (-35.2% vs TC avg)
§112: 11.1% (-28.9% vs TC avg)
Deltas are measured against an estimated Tech Center average • Based on career data from 657 resolved cases
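Subtracting each reported delta from its statute-specific rate recovers the Tech Center baseline the report compares against. A short sketch (rates and deltas copied from the table above; the implied baseline is an inference from those numbers, not published USPTO data):

```python
# Statute-specific rates and their reported deltas vs. the TC average.
statutes = {"101": (10.1, -29.9), "103": (55.9, +15.9),
            "102": (4.8, -35.2), "112": (11.1, -28.9)}
for name, (rate, delta) in statutes.items():
    # rate - delta backs out the Tech Center average the delta was taken from.
    print(f"\u00a7{name}: implied TC average = {rate - delta:.1f}%")
```

All four statutes back out the same ~40.0% baseline, which suggests the deltas were measured against a single Tech Center average rather than per-statute baselines.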

Office Action

§103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

1. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Rejections - 35 USC § 103

2. The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

3. The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

4. Claims 1-5, 8-9, 12-16, 18-20, and 22 are rejected under 35 U.S.C. 103 as being unpatentable over Ishikawa et al. (US Patent No. 8,096,069 B2) in view of Yongsin Kim (US Patent Application Publication No. 2013/0335311 A1) and further in view of Wen et al. (US Patent Application Publication No. 2023/0350702 A1).

5. Regarding Claim 1, Ishikawa discloses A vehicle wheel-based computing device, (col.
7, lines 19-32 reciting “Certain embodiments of the repeatably displaceable emanating element display 100 relating to the vehicle-related information can be configured to be associated with computers, controllers, micro-processors, networked electronic-based devices, and/or computer-based devices as described in this disclosure with respect to FIG. 12 to perform a variety of operations. For example, certain embodiments of the repeatably displaceable emanating element display 100 relating to the vehicle-related information can be configured to sense a variety of parameters, such as temperature, proximity, position/location, velocity, acceleration, and other parameters that may or may not be associated with the operation of the vehicle or mobile device 106. Certain embodiments of the repeatably displaceable emanating element display 100 relating to the vehicle-related information can be configured to transmit and/or receive a variety of signals, data, information, etc. using a variety of networking, computer, communications, sensor or other technologies and mechanisms.”) comprising: a display device configured for at least partial attachment to a wheel of a vehicle; (see FIG. 5; col. 6, lines 47-62 reciting “Certain embodiments of the repeatably displaceable emanating element display 100 can configure or operate their repeatably displaceable display emanating element(s) 130 to provide a projection type synchronously modifiable oriented image 102 relating to the vehicle-related information, such as to project light onto a nearby illuminated displayed region 140, such as described with respect to FIG. 2. 
For example, the synchronously modifiable oriented image(s) projected from hubcaps or wheels of a car, bus, or other vehicle or mobile device 106 relating to the vehicle-related information could be projected on such displayed region(s) 140 as the side, hood, front, back, or other region of the vehicle or mobile device, another vehicle or mobile device nearby the emanating vehicle or mobile device, at the wheels of the vehicle or mobile device, or at other portions of the vehicle or mobile device.” Displayed region 140 corresponds to display device at least partially attached to a tire.) and a control circuit configured to: (see FIG. 20 wherein display control 97 is a control circuit; col. 28, lines 36-41 reciting “Certain embodiments of the repeatably displaceable emanating element display controller 97 can as described with respect to FIG. 12 can include a processor 803 such as a central processing unit (CPU), a memory 807, a circuit or circuit portion 809, and an input output interface (I/O) 811 that may include a bus (not shown).”) obtain content to be provided for presentation on a display screen of the display device, (see FIG. 6 and FIG. 7 wherein the element 130 is obtained on the display screen 100.) receive vehicle data indicating a motion parameter associated with the vehicle, and perform a transformation of the content (col. 9, lines 47-56 reciting “ Certain embodiments of the timer/sequencer 815 can determine the position of the wheel, such as by determining when the wheel passes a desired point, on successive rotations, and from there determining the angular velocity. Based on the angular velocity and the determined angular position, it can be determined the duration that each repeatably displaceable display emanating element(s) should be actuated to produce the desired output pattern. 
Certain embodiments of the repeatably displaceable emanating element display 100 can be similarly suitably configured.”) While not explicitly disclosed by Ishikawa, Kim discloses and (ii) the motion parameter associated with the vehicle. (paragraph [0051] reciting “FIG. 3( b) shows a case in which the user rotates the portable device in the clockwise direction. A rotational angle and angular velocity of the portable device may be sensed by a motion sensor, and the image may be displayed horizontally with respect to the user based on the sensed result. That is, the display image is rotated in the direction opposite to the direction in which the portable device is rotated, and therefore, it is possible for the user to continue to view the image in a horizontal state irrespective of the rotation of the portable device. Also, the rotational angle and angular velocity of the portable device may be sensed by the motion sensor, and the sensed result may be reflected in the game application. In other words, the game application, which is driven by the processor of the portable device, may display the car in the game in a state in which the driving of the car is controlled using the rotational angle and angular velocity of the portable device sensed by the motion sensor.”) It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify Ishikawa with Kim so that the image on the tire remains stationary despite tire spinning/rotation. This is an obviously beneficial modification since Ishikawa discloses actuating the image on the tires as the tires are spinning and keeping the image horizontal allows the viewers to comprehend the image. Thus, Ishikawa modified by Kim can achieve a horizontally stationary image even while the tires are spinning.
While the combination of Ishikawa and Kim does not explicitly disclose, Wen discloses and perform a transformation of the content based on: (i) a refresh rate associated with the display device, (paragraph [0120] recites “In this embodiment of the present disclosure, the data that is of the terminal and that needs to be used for the solving process may specifically include one or more of an inherent device parameter (for example, resolution, pixel density, or a screen refresh rate) of the terminal and the data generated during running of the terminal (for example, data generated in a running process or data obtained (for example, downloaded) from another place in a running process). Different device types usually correspond to different device parameters, for example, different resolution, pixel density, or screen refresh rates.” Different display device screens can have different screen refresh rates but such refresh rates are inherent to device screens. Therefore, the portable device in Kim has a refresh rate that is used in conjunction with the angular velocity to display the screen contents horizontally while the device is rotating.) It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify Ishikawa and Kim with Wen so that the image rotation opposite to the device is displayed with respect to the refresh rate because it is inherent that all screens have a refresh rate and the rotation of the object is rendered with respect to the refresh rate of the screen.

6. Regarding Claim 2, Kim further discloses The vehicle wheel-based computing device of claim 1, wherein the transformation comprises a rotation of the content in a direction opposite of a rotation direction of the wheel. (paragraph [0051] reciting “FIG. 3( b) shows a case in which the user rotates the portable device in the clockwise direction.
A rotational angle and angular velocity of the portable device may be sensed by a motion sensor, and the image may be displayed horizontally with respect to the user based on the sensed result. That is, the display image is rotated in the direction opposite to the direction in which the portable device is rotated, and therefore, it is possible for the user to continue to view the image in a horizontal state irrespective of the rotation of the portable device. Also, the rotational angle and angular velocity of the portable device may be sensed by the motion sensor, and the sensed result may be reflected in the game application. In other words, the game application, which is driven by the processor of the portable device, may display the car in the game in a state in which the driving of the car is controlled using the rotational angle and angular velocity of the portable device sensed by the motion sensor.”)

7. Regarding Claim 3, Ishikawa further discloses The vehicle wheel-based computing device of claim 1, wherein the display device is circular-shaped. (see FIG. 20 wherein image 100b is displayed on a circular surface.)

8. Regarding Claim 4, Ishikawa further discloses The vehicle wheel-based computing device of claim 1, wherein the display device covers a rim of the wheel. (see FIG. 20 wherein image 100b is displayed on a circular surface that covers the rim of the tires.)

9. Regarding Claim 5, Ishikawa further discloses The vehicle wheel-based computing device of claim 1, wherein the display device is integrated into the wheel. (see FIG. 20 wherein image 100b is displayed on a circular surface that is attached (integrated) atop the outside surface of the rim of the tire.)

10.
Regarding Claim 8, Ishikawa further discloses The vehicle wheel-based computing device of claim 1, wherein the motion parameter comprises at least one of a speed of the vehicle, an angular velocity of the wheel, revolutions per minute of the wheel, a heading of the wheel or the vehicle, or an acceleration of the vehicle. (col. 9, lines 47-56 reciting “Certain embodiments of the timer/sequencer 815 can determine the position of the wheel, such as by determining when the wheel passes a desired point, on successive rotations, and from there determining the angular velocity. Based on the angular velocity and the determined angular position, it can be determined the duration that each repeatably displaceable display emanating element(s) should be actuated to produce the desired output pattern. Certain embodiments of the repeatably displaceable emanating element display 100 can be similarly suitably configured.”)

11. Regarding Claim 9, Kim further discloses The vehicle wheel-based computing device of claim 1, further comprising: a motion sensor configured to obtain the vehicle data, and wherein the vehicle data comprises at least one of accelerometer data associated with the display device, gyroscope data associated with the display device, wheel torque data, or brake torque data. (paragraph [0030] reciting “The sensor unit 1030 recognizes various input of a user or an environment of the portable device through a plurality of sensors mounted to the portable device, and transmits the recognized result to the controller 1090. The sensor unit 1030 may include a plurality of sensing devices. In an embodiment, the sensing devices may include a gravity sensor, geomagnetic sensor, motion sensor, gyro sensor, acceleration sensor, inclination sensor, brightness sensor, altitude sensor, olfactory sensor, temperature sensor, depth sensor, pressure sensor, bending sensor, audio sensor, video sensor, global positioning system (GPS) sensor, and touch sensor.
…” It would have been obvious to use the various gyro sensor data to determine display device rotation and the amount of opposition rotation to apply to the displayed image in order to create a stationary appearance.)

12. Regarding Claim 12, Ishikawa further discloses The vehicle wheel-based computing device of claim 1, wherein: the control circuit is configured to obtain an image captured by a camera disposed at the vehicle, and to provide for presentation on the display screen of the display device an augmented-reality image which comprises one or more virtual objects which are overlaid on the image captured by the camera. (see FIG. 9 wherein a live camera taping a game can be displayed on a hubcap and superimposed with the score, which is not part of the live captured image. Likewise, it would have been obvious that a camera actually disposed with the vehicle can be used to capture real live sports events and generated on its hubcap with overlaid augmented reality virtual content scores.)

13. Regarding Claim 13, Kim further discloses The vehicle wheel-based computing device of claim 1, wherein to perform the transformation of the content the control circuit is configured to rotate the content to match an angular velocity of the display device such that an orientation of the content presented on the display device is maintained. (paragraph [0051] reciting “FIG. 3( b) shows a case in which the user rotates the portable device in the clockwise direction. A rotational angle and angular velocity of the portable device may be sensed by a motion sensor, and the image may be displayed horizontally with respect to the user based on the sensed result. That is, the display image is rotated in the direction opposite to the direction in which the portable device is rotated, and therefore, it is possible for the user to continue to view the image in a horizontal state irrespective of the rotation of the portable device.
…” In order for the image to appear stationary in spite of angular velocity, the image’s opposite rotation (opposite angular velocity) must match the tire’s angular velocity.)

14. Regarding Claim 14, Kim further discloses The vehicle wheel-based computing device of claim 1, wherein: the content to be provided for presentation on the display screen of the display device comprises an animation comprising a plurality of image frames, and wherein to perform the transformation of the content, the control circuit is configured to rotate one or more of the image frames to maintain, while the animation is provided for presentation on the display screen of the display device during rotation of the wheel, an orientation of the animation as it appears from a viewpoint external to the vehicle. (paragraph [0050] reciting “FIG. 3( a) shows that an image of a game application is displayed on a display screen provided at the front of the portable device. During display of the image, a user may rotate the portable device in the clockwise direction or in the counterclockwise direction as shown in FIG. 2. In particular, in a car driving game as shown in FIG. 3, an advancing direction of the car in the image is controlled according to the rotation of the portable device. That is, a user may rotate the portable device like a steering wheel of the car to control the advancing direction of the car in the image. In this case, if the entirety of the image is rotated with the portable device, it is not possible for the user to view the image in a horizontal state. As shown in FIG.
3, therefore, rotational inclination and acceleration of the portable device may be sensed, the sensed result may be reflected in the game, and, at the same time, the display image may be controlled.” The game application has frames that are being rendered or displayed in succession at a particular refresh rate and its stationary appearance despite rotation of the display device corresponds to animation transformation.)

15. Regarding Claim 15, the limitation The vehicle wheel-based computing device of claim 1, wherein the control circuit is configured to: obtain motion data associated with the display device based on one or more sensors integrated with the display device, and transmit the motion data to an infotainment system of the vehicle to provide for display within the vehicle a representation of the display screen of the display device based on the motion data is obvious in view of Ishikawa modified by Kim. Ishikawa discloses a vehicle that has a speedometer and infotainment system. Kim discloses a device with a gyroscope measuring angular velocity of the device. Ishikawa modified by Kim can obtain gyroscope information and display such information in the car’s infotainment system, which is obvious since each car already displays speed information through a speedometer, etc.

16. Regarding Claim 16, Ishikawa discloses A computer-implemented method, (col. 30, line 66 to col. 31, line 1 reciting “Within the disclosure, flow charts of the type described in this disclosure apply to method steps as performed by a computer or controller.”) comprising: obtaining, by a vehicle wheel-based computing device, content to be provided for presentation on a display screen of a display device, (see FIG. 6 and FIG. 7 wherein the element 130 is obtained on the display screen 100.) the display device being configured for at least partial attachment to a wheel of a vehicle; (see FIG. 5; col.
6, lines 47-62 reciting “Certain embodiments of the repeatably displaceable emanating element display 100 can configure or operate their repeatably displaceable display emanating element(s) 130 to provide a projection type synchronously modifiable oriented image 102 relating to the vehicle-related information, such as to project light onto a nearby illuminated displayed region 140, such as described with respect to FIG. 2. For example, the synchronously modifiable oriented image(s) projected from hubcaps or wheels of a car, bus, or other vehicle or mobile device 106 relating to the vehicle-related information could be projected on such displayed region(s) 140 as the side, hood, front, back, or other region of the vehicle or mobile device, another vehicle or mobile device nearby the emanating vehicle or mobile device, at the wheels of the vehicle or mobile device, or at other portions of the vehicle or mobile device.” Displayed region 140 corresponds to display device at least partially attached to a tire.) receiving, by the vehicle wheel-based computing device, vehicle data indicating a motion parameter associated with the vehicle; and performing, by the vehicle wheel-based computing device, a transformation of the content (col. 9, lines 47-56 reciting “ Certain embodiments of the timer/sequencer 815 can determine the position of the wheel, such as by determining when the wheel passes a desired point, on successive rotations, and from there determining the angular velocity. Based on the angular velocity and the determined angular position, it can be determined the duration that each repeatably displaceable display emanating element(s) should be actuated to produce the desired output pattern. 
Certain embodiments of the repeatably displaceable emanating element display 100 can be similarly suitably configured.”) While not explicitly disclosed by Ishikawa, Kim discloses and perform a transformation of the content based on: (i) a refresh rate associated with the display device, (While this limitation is not explicitly disclosed, it is inherent that the device in FIG. 3 of Kim must have some kind of refresh rate. In FIG. 3 of Kim, the portable device must have a refresh rate working in conjunction with the teaching of paragraph [0051] because it is inherent that all digital screens have refresh rates. Wen et al. (US Patent Application Publication No. 2023/0350702 A1) is used to prove that display screens have inherent refresh rates. Wen at paragraph [0120] recites “In this embodiment of the present disclosure, the data that is of the terminal and that needs to be used for the solving process may specifically include one or more of an inherent device parameter (for example, resolution, pixel density, or a screen refresh rate) of the terminal and the data generated during running of the terminal (for example, data generated in a running process or data obtained (for example, downloaded) from another place in a running process). Different device types usually correspond to different device parameters, for example, different resolution, pixel density, or screen refresh rates.” Different display device screens can have different screen refresh rates but such refresh rates are inherent to device screens. Therefore, the portable device in Kim has a refresh rate that is used in conjunction with the angular velocity to keep the image horizontal while the device is rotating.) and (ii) the motion parameter associated with the vehicle. (paragraph [0051] reciting “FIG. 3( b) shows a case in which the user rotates the portable device in the clockwise direction.
A rotational angle and angular velocity of the portable device may be sensed by a motion sensor, and the image may be displayed horizontally with respect to the user based on the sensed result. That is, the display image is rotated in the direction opposite to the direction in which the portable device is rotated, and therefore, it is possible for the user to continue to view the image in a horizontal state irrespective of the rotation of the portable device. Also, the rotational angle and angular velocity of the portable device may be sensed by the motion sensor, and the sensed result may be reflected in the game application. In other words, the game application, which is driven by the processor of the portable device, may display the car in the game in a state in which the driving of the car is controlled using the rotational angle and angular velocity of the portable device sensed by the motion sensor.”) It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify Ishikawa with Kim so that the image on the tire remains stationary despite tire spinning/rotation. This is an obviously beneficial modification since Ishikawa discloses actuating the image on the tires as the tires are spinning and keeping the image horizontal allows the viewers to comprehend the image. Thus, Ishikawa modified by Kim can achieve a horizontally stationary image even while the tires are spinning.

17. Regarding Claim 18, Ishikawa discloses A vehicle, comprising: (see FIG. 1 depicting a vehicle 106 with wheels.) a wheel; and a vehicle wheel-based computing device (col. 7, lines 19-32 reciting “Certain embodiments of the repeatably displaceable emanating element display 100 relating to the vehicle-related information can be configured to be associated with computers, controllers, micro-processors, networked electronic-based devices, and/or computer-based devices as described in this disclosure with respect to FIG.
12 to perform a variety of operations. For example, certain embodiments of the repeatably displaceable emanating element display 100 relating to the vehicle-related information can be configured to sense a variety of parameters, such as temperature, proximity, position/location, velocity, acceleration, and other parameters that may or may not be associated with the operation of the vehicle or mobile device 106. Certain embodiments of the repeatably displaceable emanating element display 100 relating to the vehicle-related information can be configured to transmit and/or receive a variety of signals, data, information, etc. using a variety of networking, computer, communications, sensor or other technologies and mechanisms.”) including: a display device configured for at least partial attachment to the wheel, (see FIG. 5; col. 6, lines 47-62 reciting “Certain embodiments of the repeatably displaceable emanating element display 100 can configure or operate their repeatably displaceable display emanating element(s) 130 to provide a projection type synchronously modifiable oriented image 102 relating to the vehicle-related information, such as to project light onto a nearby illuminated displayed region 140, such as described with respect to FIG. 2. For example, the synchronously modifiable oriented image(s) projected from hubcaps or wheels of a car, bus, or other vehicle or mobile device 106 relating to the vehicle-related information could be projected on such displayed region(s) 140 as the side, hood, front, back, or other region of the vehicle or mobile device, another vehicle or mobile device nearby the emanating vehicle or mobile device, at the wheels of the vehicle or mobile device, or at other portions of the vehicle or mobile device.” Displayed region 140 corresponds to display device at least partially attached to a tire.) and a control circuit configured to: (see FIG. 20 wherein display control 97 is a control circuit; col. 
28, lines 36-41 reciting “Certain embodiments of the repeatably displaceable emanating element display controller 97 can as described with respect to FIG. 12 can include a processor 803 such as a central processing unit (CPU), a memory 807, a circuit or circuit portion 809, and an input output interface (I/O) 811 that may include a bus (not shown).”) obtain content to be provided for presentation on a display screen of the display device, (see FIG. 6 and FIG. 7 wherein the element 130 is obtained on the display screen 100.) receive vehicle data indicating a motion parameter associated with the vehicle, and perform a transformation of the content (col. 9, lines 47-56 reciting “Certain embodiments of the timer/sequencer 815 can determine the position of the wheel, such as by determining when the wheel passes a desired point, on successive rotations, and from there determining the angular velocity. Based on the angular velocity and the determined angular position, it can be determined the duration that each repeatably displaceable display emanating element(s) should be actuated to produce the desired output pattern. Certain embodiments of the repeatably displaceable emanating element display 100 can be similarly suitably configured.”) While not explicitly disclosed by Ishikawa, Kim discloses and perform a transformation of the content based on: (i) a refresh rate associated with the display device, (While this limitation is not explicitly disclosed, it is inherent that the device in FIG. 3 of Kim must have some kind of refresh rate. In FIG. 3 of Kim, the portable device must have a refresh rate working in conjunction with the teaching of paragraph [0051] because it is inherent that all digital screens have refresh rates. Wen et al. (US Patent Application Publication No. 2023/0350702 A1) is used to prove that display screens have inherent refresh rates.
Wen at paragraph [0120] recites “In this embodiment of the present disclosure, the data that is of the terminal and that needs to be used for the solving process may specifically include one or more of an inherent device parameter (for example, resolution, pixel density, or a screen refresh rate) of the terminal and the data generated during running of the terminal (for example, data generated in a running process or data obtained (for example, downloaded) from another place in a running process). Different device types usually correspond to different device parameters, for example, different resolution, pixel density, or screen refresh rates.” Different display device screens can have different screen refresh rates but such refresh rates are inherent to device screens. Therefore, the portable device in Kim has a refresh rate that is used in conjunction with the angular velocity to keep the image horizontal while device is rotating.) and (ii) the motion parameter associated with the vehicle. (paragraph [0051] reciting “FIG. 3( b) shows a case in which the user rotates the portable device in the clockwise direction. A rotational angle and angular velocity of the portable device may be sensed by a motion sensor, and the image may be displayed horizontally with respect to the user based on the sensed result. That is, the display image is rotated in the direction opposite to the direction in which the portable device is rotated, and therefore, it is possible for the user to continue to view the image in a horizontal state irrespective of the rotation of the portable device. Also, the rotational angle and angular velocity of the portable device may be sensed by the motion sensor, and the sensed result may be reflected in the game application. 
In other words, the game application, which is driven by the processor of the portable device, may display the car in the game in a state in which the driving of the car is controlled using the rotational angle and angular velocity of the portable device sensed by the motion sensor.”)

18. Regarding Claim 19, Kim further discloses The vehicle of claim 18, wherein the transformation comprises a rotation of the content in a direction opposite of a rotation direction of the wheel. (paragraph [0051] reciting “FIG. 3( b) shows a case in which the user rotates the portable device in the clockwise direction. A rotational angle and angular velocity of the portable device may be sensed by a motion sensor, and the image may be displayed horizontally with respect to the user based on the sensed result. That is, the display image is rotated in the direction opposite to the direction in which the portable device is rotated, and therefore, it is possible for the user to continue to view the image in a horizontal state irrespective of the rotation of the portable device. Also, the rotational angle and angular velocity of the portable device may be sensed by the motion sensor, and the sensed result may be reflected in the game application. In other words, the game application, which is driven by the processor of the portable device, may display the car in the game in a state in which the driving of the car is controlled using the rotational angle and angular velocity of the portable device sensed by the motion sensor.”)

19. Regarding Claim 20, Ishikawa further discloses The vehicle of claim 18, wherein the display device is circular-shaped and covers a rim of the wheel. (see FIG. 20 wherein image 100b is displayed on a circular surface that covers the rim of the tires.)

20.
Regarding Claim 22, Ishikawa further discloses The vehicle of claim 18, wherein the motion parameter comprises an angular velocity of the display device, the display device comprises a motion sensor configured to measure motion data associated with the display device, and the control circuit is configured to determine the angular velocity of the display device based on the motion data. (col. 9, lines 47-56 reciting “Certain embodiments of the timer/sequencer 815 can determine the position of the wheel, such as by determining when the wheel passes a desired point, on successive rotations, and from there determining the angular velocity. Based on the angular velocity and the determined angular position, it can be determined the duration that each repeatably displaceable display emanating element(s) should be actuated to produce the desired output pattern. Certain embodiments of the repeatably displaceable emanating element display 100 can be similarly suitably configured.”)

21. Claim 6 is rejected under 35 U.S.C. 103 as being unpatentable over Ishikawa in view of Kim in view of Wen and further in view of Derek S. Harris (US Patent Application Publication No. 2014/0204344 A1).

22. Regarding Claim 6, while the combination of Ishikawa and Kim does not explicitly disclose, Harris discloses The vehicle wheel-based computing device of claim 1, wherein the display device is configured to be detachable from the wheel and attachable to the wheel. (see FIG. 3; paragraph [0015] reciting “… Manufactured with a projection screen material such as plasma, liquid crystal display or LCD, or cathode ray tube or CRT, the DS Digi Holo Wheel is shaped as a convex, concaved, or flat screen. The screen coverings are manufactured in sizes appropriate for various styles and sizes of popular tires. The standard size is approximately twenty two inches. The DS Digi Holo Wheels rims are packaged in sets of four rims.”; paragraph [0017] reciting “FIGS.
3, 4 and 5 show the device wherein 1 is the clear plastic or glass cover that can withstand any road condition. Element 2 is the digital screen, holographic screen or image, or picture image for a hub cap with or without LED lights on the back. Element 3 is a specially designed car rim for containing the digital or holographic screens, or regular screen for a hub cap picture image. Element 4 is a standard car tire.” Element 2 is detachable/attachable with the wheel.) It would have been obvious to a person of ordinary skills in the art before the effective filing date of the claimed invention to modify Ishikawa, Kim, and Wen with Harris so that the display devices are attachable/detachable from the wheel itself. This is an obviously beneficial modification since it allows for easier replacing or fixing of the display devices if required. 23. Claim 23 is rejected under 35 U.S.C. 103 as being unpatentable over Ishikawa in view of Kim in view of Wen and further in view of Scott MacLaughlin (US Patent Application Publication No. 2011/0074342 A1). 24. Regarding Claim 23, while the combination of Ishikawa and Kim does not explicitly disclose, MacLaughlin discloses The vehicle of claim 18, further comprising a charging system disposed proximate to the display device, the charging system being configured to transfer power wirelessly to a power source of the display device. (see FIG. 1 and FIG. 3 wherein the charging station 105 can send electrical power wireless over a distance to a device with sensor 114 such as device 102 or display devices in Ishikawa.) It would have been obvious to a person of ordinary skills in the art before the effective filing date of the claimed invention to modify Ishikawa, Kim, and Wen with MacLaughlin so that wireless devices can be placed with the vehicle and be used to charge the display device when needed. 
This is an obviously beneficial modification since wireless power charging is more convenient and requires no connection wires.

Allowable Subject Matter

25. Claims 7, 10-11, 17, and 21 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.

26. The following is a statement of reasons for the indication of allowable subject matter: Claim 7 recites the limitation wherein the control circuit is configured to render the content to be provided for presentation on the display screen of the display device, the content being generated at least partially via one or more machine-learned models, wherein the one or more machine-learned models comprise one or more machine-learned generative models trained based on training data indicative of a plurality of wheel-based features, which is neither disclosed nor suggested by the cited references, either singly or in combination.

27. Claim 10 recites the limitation wherein the motion parameter comprises an angular velocity of the display device and wherein the transformation comprises a rotation of the content, and wherein when the angular velocity of the display device is less than a threshold angular velocity value, the control circuit is configured to rotate the content at a rotation rate which matches the angular velocity of the display device, and when the angular velocity of the display device is more than the threshold angular velocity value, the content corresponds to a static image and the control circuit is configured to provide the static image for presentation on the display screen of the display device, which is neither disclosed nor suggested by the cited references, either singly or in combination.

28. Claim 11 recites the limitation wherein the motion parameter comprises an angular velocity of the display device and wherein the transformation comprises a rotation of the content, and wherein when the angular velocity of the display device is less than a threshold angular velocity value, the control circuit is configured to rotate the content at a rotation rate which matches the angular velocity of the display device, and when the angular velocity of the display device is more than the threshold angular velocity value, the content corresponds to a video stream and the control circuit is configured to provide the video stream for presentation on the display screen of the display device which appears stationary on the display device, which is neither disclosed nor suggested by the cited references, either singly or in combination.

29. Claim 17 recites the limitation further comprising rendering, by the vehicle wheel-based computing device, the content to be provided for presentation on the display screen of the display device, the content being generated at least partially via one or more machine-learned models, wherein the one or more machine-learned models comprise one or more machine-learned generative models trained based on training data indicative of a plurality of wheel-based features, which is neither disclosed nor suggested by the cited references, either singly or in combination.

30. Claim 21 recites the limitation wherein the control circuit is configured to receive an input requesting content to be provided for presentation on the display screen of the display device, and to render the content to be provided for presentation on the display screen of the display device based on the input, the content being rendered at least partially via one or more machine-learned models, wherein the one or more machine-learned models comprise one or more machine-learned generative models trained based on training data indicative of a plurality of wheel-based features, which is neither disclosed nor suggested by the cited references, either singly or in combination.

CONTACT

Any inquiry concerning this communication or earlier communications from the examiner should be directed to FRANK S CHEN, whose telephone number is (571) 270-7993. The examiner can normally be reached Mon-Fri 8-11:30 and 1:30-6.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Kee Tung, can be reached at (571) 272-7794. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format.

For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/FRANK S CHEN/
Primary Examiner, Art Unit 2611
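The behavior the examiner indicated as allowable in claims 10-11, combined with Ishikawa's technique of deriving angular velocity from successive passes of a fixed reference point (col. 9, lines 47-56), can be sketched in a few lines. This is a minimal illustrative sketch only: the function names, the threshold value, and the mode labels are all hypothetical, not taken from the application or the cited references.

```python
import math

# Hypothetical threshold (one revolution per second) -- the claims recite
# "a threshold angular velocity value" without fixing a number.
THRESHOLD_RAD_PER_S = 2 * math.pi

def angular_velocity(t_prev: float, t_curr: float) -> float:
    """Estimate wheel angular velocity (rad/s) from timestamps of two
    successive passes of a fixed reference point, the approach Ishikawa's
    timer/sequencer 815 is described as using."""
    return 2 * math.pi / (t_curr - t_prev)

def content_rotation(omega: float, dt: float, angle: float) -> tuple[float, str]:
    """Threshold logic recited in claims 10-11: below the threshold, rotate
    the content at a rate matching the wheel's angular velocity (opposite
    direction, per claim 19, so it stays level to an observer); above it,
    present a static image (claim 10) or video stream (claim 11) instead."""
    if abs(omega) < THRESHOLD_RAD_PER_S:
        # Counter-rotate: subtract the wheel's rotation over this time step.
        return angle - omega * dt, "counter-rotated content"
    return angle, "static image / video stream"
```

For example, a wheel crossing the reference point at t = 0.0 s and again at t = 0.5 s yields an angular velocity of 4π rad/s, which exceeds the sketch's threshold, so the content would switch to the static/video mode rather than counter-rotate.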

Prosecution Timeline

Jul 21, 2023: Application Filed
Jan 29, 2026: Non-Final Rejection, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12597111: SYSTEMS AND METHODS FOR DULL GRADING (granted Apr 07, 2026; 2y 5m to grant)
Patent 12596007: DISPLAY CONTROL APPARATUS, DISPLAY SYSTEM, DISPLAY METHOD, AND COMPUTER READABLE MEDIUM (granted Apr 07, 2026; 2y 5m to grant)
Patent 12592029: SYSTEMS AND METHODS FOR MEDIA CONTENT GENERATION (granted Mar 31, 2026; 2y 5m to grant)
Patent 12586308: GENERATING OBJECT REPRESENTATIONS USING NEURAL NETWORKS FOR AUTONOMOUS SYSTEMS AND APPLICATIONS (granted Mar 24, 2026; 2y 5m to grant)
Patent 12586293: SCENE RECONSTRUCTION FROM MONOCULAR VIDEO (granted Mar 24, 2026; 2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 82%
With Interview: 91% (+8.8%)
Median Time to Grant: 2y 2m
PTA Risk: Low

Based on 657 resolved cases by this examiner. Grant probability derived from career allow rate.
