DETAILED ACTION
Notice of Pre-AIA or AIA Status
1. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
2. This Office Action is sent in response to Applicant's Communication received on October 14, 2025.
Response to Arguments
Applicant's amendment/remarks filed October 14, 2025, with respect to the rejection of claim 17 under 35 U.S.C. 112(b) have been fully considered and are persuasive. Accordingly, the rejection of claim 17 under 35 U.S.C. 112(b) has been withdrawn.
Applicant's arguments/remarks filed October 14, 2025, with respect to the rejection of claims 1-17 under 35 U.S.C. 103 as being unpatentable over OKADA (US 2016/0251030 A1) in view of Maley (US 2016/0353049 A1) have been fully considered, but they are not persuasive, as explained below.
Applicant respectfully asserts that the cited prior art fails to disclose or teach the limitations “determining an actual position of the camera relative to the wheel; wherein determining the actual position of the camera is performed based on any one or more of the following: a signal of an inertial measurement unit; an image acquired by the camera, showing a portion of an external environment; a signal of a LIDAR device directed to an external environment; and/or at least one signal of at least one proximity sensor; obtaining a corrected parameter on the basis of the actual position of the camera; and determining an operation state of the wheel based on the corrected parameter”, as required in at least claim 1.
The Examiner respectfully submits the following:
Maley (Figs. 1 and 3-6) discloses/teaches ground engaging member 310 in three different states. In FIG. 3, the ground engaging member 310 is oriented straight, and in FIGS. 4 and 5, the ground engaging member 310 is oriented at an angle. If the state of the ground engaging member 310 in FIG. 3 is taken as the reference position, then the state of the ground engaging member 310 in FIG. 3 may be represented as 0° and the states of the ground engaging member 310 in FIGS. 4 and 5 may be represented as an angular displacement from the position of the ground engaging member 310 in FIG. 3. Examples of ground engaging members 125, 135, 145, 155 include a wheel with a tire mounted thereon and a continuous track system. The processor may calculate a projected path of travel for the machine using the determined states of the ground engaging members along with additional information about the ground engaging members, the properties of the images captured by the cameras, and the machine. Such additional information may include physical properties of the ground engaging members that may affect its path of travel (e.g., traction properties, contact patch, load sensitivity) and any distortions present in the images due to the position of the cameras and their lenses ([0021]).
Further on, Maley (Figs. 1 and 3-6) discloses/teaches that metadata may also include extrinsic data of the image capture device 170 such as its orientation and position and data regarding the machine (e.g., articulation angle) ([0039]). The computing device 250 may take into account any distortions or other optical aberrations that are present in the landscape image captured by the image capture device 170. The computing device 250 may also consider the position and orientation of the image capture device 170 and the location of the machine 110 relative to the area shown in the landscape image ([0051]). When the machine is in operation, the two cameras mounted on the machine may take images including sections of the two front wheels of the machine. The system may use the images captured by the two cameras to determine a current state or position of the front wheels ([0074]).
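To illustrate the matching step that Maley describes in [0020] and [0074] (comparing captured wheel portions against pre-stored calibration images, each associated with a known state), the following is a minimal, non-limiting sketch; it is not code from Maley, and the function names, image sizes, and synthetic data are hypothetical choices offered only for clarity:

```python
import cv2
import numpy as np

def estimate_wheel_state(wheel_portion, calibration_set):
    """Return the calibration angle whose reference image best matches the
    wheel portion cropped from the live camera image. calibration_set maps a
    steering angle (deg) to a grayscale reference cropped to the same size."""
    best_angle, best_score = None, -1.0
    for angle, reference in calibration_set.items():
        # Normalized cross-correlation; a score near 1.0 indicates a match.
        score = float(cv2.matchTemplate(wheel_portion, reference,
                                        cv2.TM_CCOEFF_NORMED).max())
        if score > best_score:
            best_angle, best_score = angle, score
    return best_angle

# Toy usage with synthetic frames standing in for calibration images.
rng = np.random.default_rng(0)
calib = {a: rng.integers(0, 256, (64, 64), dtype=np.uint8)
         for a in (0.0, 5.0, 10.0)}
assert estimate_wheel_state(calib[5.0].copy(), calib) == 5.0
```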
Still further, it is submitted that OKADA already discloses making corrections to the determined state/position of the wheels, such as correcting for the distance of the wheels within the measurement image. Also, Maley discloses a similar/equivalent wheel monitoring system using image capture devices (e.g., cameras) to measure the positions and states of the machine's wheels.
Accordingly, one skilled in the art, incorporating the teachings of Maley into OKADA, would have arrived at the claimed limitations “determining an actual position of the camera relative to the wheel; wherein determining the actual position of the camera is performed based on any one or more of the following: a signal of an inertial measurement unit; an image acquired by the camera, showing a portion of an external environment; a signal of a LIDAR device directed to an external environment; and/or at least one signal of at least one proximity sensor; obtaining a corrected parameter on the basis of the actual position of the camera; and determining an operation state of the wheel based on the corrected parameter” as required in at least claim 1, in order to minimize or eliminate any distortions present in the images due to the position of the cameras and their lenses (motivation disclosed in Maley above).
Disposition of Claims
Claims 1-17 are pending in this application.
Claims 1-17 are rejected.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries set forth in Graham v. John Deere Co., 383 U.S. 1, 148 USPQ 459 (1966), that are applied for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or non-obviousness.
Claims 1-17 are rejected under 35 U.S.C. 103 as being unpatentable over OKADA (US 2016/0251030 A1) in view of Maley (US 2016/0353049 A1).
Regarding claim 1, OKADA discloses:
A method for monitoring (steering angle correction method for vehicle 10: Fig. 1) at least one wheel (steered wheel 14: Fig. 1) of a vehicle (vehicle 10: Fig. 1), said method comprising:
acquiring an image (acquiring a measurement image, using at least side cameras 11 through acquisition unit 21, of a steered wheel 14 in correspondence with a steering angle of vehicle 10: Figs. 1-2) of said at least one wheel (steered wheel 14: Fig. 1) of the vehicle (vehicle 10: Fig. 1) by using a camera (at least one of side cameras 11) that is secured to the vehicle (vehicle 10: Fig. 1);
identifying at least one visual feature of the wheel on the image (Fig. 2 and [0053]: “The extraction unit 23 is configured to extract the steered wheel 14 on a measurement image and the steered wheel 14 on the reference image. For example, the extraction unit 23 extracts the steered wheel 14 on each image by edge detection and ellipse detection (approximation). Edge detection uses, for example, the Canny filter to detect edge components on the image. Ellipse detection uses, for example, Hough transformation to calculate an elliptical shape approximated to the steered wheel 14 and parameters of the ellipse. The parameters of the ellipse refer to the central coordinate (x, y), the long side length a, and the short side length b of the ellipse and an inclination φ formed by the horizontal axis and the major axis of the ellipse on the image. Preferably, the extraction unit 23 extracts the steered wheel 14 on the image in a predetermined area on the image that may include the steered wheel 14. The predetermined area on the image that may include the steered wheel 14 may be determined in advance in consideration of, for example, differences in mounting angle of the side cameras 11 or may be determined automatically by the extraction unit 23 according to any method”);
detecting at least one parameter of said at least one identified feature (Fig. 2 and [0055]: “The arithmetic unit 24 (refer to FIG. 2) is configured to calculate the degree of coincidence between the steered wheel 14 on a measurement image and the steered wheel 14 on the reference image. The degree of coincidence is an indicator of the degree of correspondence in shape of the steered wheel 14 on both the images. In the present embodiment, the arithmetic unit 24 performs image matching by using the measurement image from which the steered wheel 14 is extracted by the extraction unit 23 and the reference image and calculates the degree of correlation obtained from image matching as the degree of coincidence. The arithmetic unit 24 also stores, in the correction device storage 22, data of the calculated degree of coincidence (correlation) and the steering angle θm of the steered wheel that the steering angle sensor detects when the measurement image used for the calculation of the degree of coincidence is captured, in correspondence with each other”);
determining an operation state of the wheel based on said parameter (Fig. 2 and [0056]: “The arithmetic unit 24 also calculates a steering angle θp at the peak position of the calculated probability distribution. Preferably, the arithmetic unit 24 calculates the steering angle θp when sufficient data to calculate the probability distribution is stored. For example, the arithmetic unit 24 may calculate the steering angle θp when the number of pieces of stored data is greater than or equal to a predetermined threshold th1 and when a variance σ2 of the calculated probability distribution is less than a predetermined threshold th2”);
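Purely for illustration of the extraction and parameter-detection steps mapped above (OKADA [0053], [0055]), the following non-limiting sketch shows a Canny-plus-ellipse-fitting pipeline of the kind OKADA describes; the thresholds, the region-of-interest convention, and the largest-ellipse heuristic are hypothetical choices, not OKADA's disclosure:

```python
import cv2
import numpy as np

def extract_wheel_ellipse(image_bgr, roi):
    """roi = (x, y, w, h): the predetermined area that may include the steered
    wheel. Returns (cx, cy, a, b, phi) loosely mirroring OKADA's ellipse
    parameters, or None if no ellipse is found."""
    x, y, w, h = roi
    gray = cv2.cvtColor(image_bgr[y:y + h, x:x + w], cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)  # edge components on the image
    contours, _ = cv2.findContours(edges, cv2.RETR_LIST, cv2.CHAIN_APPROX_NONE)
    best = None
    for contour in contours:
        if len(contour) < 5:          # cv2.fitEllipse needs at least 5 points
            continue
        (cx, cy), (d1, d2), angle = cv2.fitEllipse(contour)
        a, b = max(d1, d2) / 2.0, min(d1, d2) / 2.0  # semi-major, semi-minor
        if best is None or a > best[2]:              # keep the largest ellipse
            best = (cx + x, cy + y, a, b, angle)
    return best
```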
However, OKADA does not explicitly disclose the following limitations:
(A) determining an actual position of the camera relative to the wheel; wherein determining the actual position of the camera is performed based on any one or more of the following: a signal of an inertial measurement unit; an image acquired by the camera, showing a portion of an external environment; a signal of a LIDAR device directed to an external environment; and/or at least one signal of at least one proximity sensor; obtaining a corrected parameter on the basis of the actual position of the camera; and determining an operation state of the wheel based on the corrected parameter.
However, regarding limitation (A) above, Maley (Figs. 1 and 3-6) discloses/teaches the following:
The processor may determine the current states of the ground engaging members by analyzing the orientation of physical markers on the ground engaging members. Physical markers on a ground engaging member may include a paint marker, a tread pattern, or other visible feature associated with the ground engaging member. To provide an example, a ground engaging member such as a wheel may have a tread line that runs parallel to its sides. The processor may identify the tread line and, knowing that the tread line runs parallel to the sides of the wheel, determine whether the wheel is facing forward or rotated at an angle. The processor may also determine the current states of the ground engaging members by comparing the images of the ground engaging members captured by the cameras to a pre-stored set of images, each of which depicts the ground engaging members in a different state. Then, by identifying a match between the images captured by the cameras and one of the pre-stored images, the processor may determine the current state of a ground engaging member ([0020]).
The processor may calculate a projected path of travel for the machine using the determined states of the ground engaging members along with additional information about the ground engaging members, the properties of the images captured by the cameras, and the machine. Such additional information may include physical properties of the ground engaging members that may affect its path of travel (e.g., traction properties, contact patch, load sensitivity) and any distortions present in the images due to the position of the cameras and their lenses. Such information may also include information about the machine such as axial length or wheel-base length, ground engaging member size or track size, type of ground engaging member, etc. The projected path of travel may be displayed overlaying an image of the area surrounding the machine in its direction of travel. A camera mounted in the front or rear of the machine may be used to capture an image of the area surrounding the machine in its direction of travel. The camera may incorporate a wide-angle lens in order to maximize the surrounding area that it captures. Because wide-angle lens cameras typically distort the horizontal or vertical perspectives of an image, the processor may also take into account any distortions caused by this camera in calculating the projected path of travel of the machine ([0021]).
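As a hypothetical illustration of correcting for the distortion of a wide-angle lens before computing a projected path (Maley [0021]), the sketch below undistorts a landscape image with OpenCV; the intrinsic matrix and distortion coefficients are invented placeholders that would, in practice, come from a camera calibration:

```python
import cv2
import numpy as np

# Hypothetical intrinsics and distortion coefficients from a prior calibration.
K = np.array([[400.0, 0.0, 640.0],
              [0.0, 400.0, 360.0],
              [0.0, 0.0, 1.0]])
dist = np.array([-0.30, 0.09, 0.0, 0.0, -0.01])  # k1, k2, p1, p2, k3

def undistort_landscape(image):
    # Removes the radial/tangential distortion modeled by `dist`, so a
    # projected-path overlay can be drawn in an undistorted frame.
    return cv2.undistort(image, K, dist)
```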
Specifically, the machine 110 may include one or more image capture devices 120, 130, 140, 150, 160, 170. The image capture devices 120, 130, 140, 150, 160, 170 may be digital cameras, video recorders, or other types of optical instruments that record images that can be stored directly or transmitted to another location. While the machine 110 is shown having six image capture devices 120, 130, 140, 150, 160, 170, those skilled in the art will appreciate that the machine 110 may include any number of image capture devices arranged in any manner capable of accomplishing the methods disclosed herein ([0025]).
The system 200 may include a computing device 250 configured to receive and analyze data (e.g., images captured by the image capture devices 120, 130, 140, 150, 160, 170) relating to the machine 110 and its ground engaging members 125, 135, 145, 155 ([0035]). The processor 270 of the computing device 250, executing the state determining module 276, may analyze the portions of the images captured by the image capture devices 120, 130 that depict the ground engaging members 125, 135 to determine a current state of each of the ground engaging members 125, 135. One method of performing the analysis and determining the current states of the ground engaging members 125, 135 involves comparing the portions to a set of reference portions to identify a reference portion that matches ones of the portions of the images captured by the image capture devices 120, 130. The set of reference portions may be from images that were previously taken of the ground engaging members 125, 135 during a calibration process, described in further detail below. Each of the reference portions is associated with a particular state of one of the ground engaging members 125, 135. Once a match is found between a reference portion and a portion of an image captured by the image capture devices 120, 130, the processor 270 may determine that the current state of the ground engaging member is the state of that ground engaging member that is associated with the reference portion ([0045]). A second method of performing the analysis to determine the current states of the ground engaging members 125, 135 involves using one or more features of a specific ground engaging member in the one or more portions of the images captured by image capture devices 120, 130. To provide an example, if a particular ground engaging member has tread marks, the processor 270 of the computing device 250 may identify a tread mark of the ground engaging member and use its orientation (e.g., configuration) to determine the current state of the ground engaging member. As depicted in FIGS. 3, 4, and 5, the ground engaging member 310 may comprise a tread mark 320. The orientation of the tread mark 320 may change as the state of the ground engaging member 310 changes. Specifically, as the state of the ground engaging member 310 displaces in an angular direction to the left (e.g., from FIG. 3 to FIG. 4), the tread mark 320 changes its orientation in a similar fashion. For each state of the ground engaging member 310, the orientation of the tread mark 320 would be different. For example, when the state of the ground engaging member 310 is that shown in FIG. 4, the orientation of the tread mark 320 may be angled to the left by 5°. Then, when the state of the ground engaging member 310 is that shown in FIG. 5, the orientation of the tread mark 320 may be angled to the left by 10°. Thus, by analyzing the tread mark 320 and determining its orientation, the processor 270 may be able to discern a current state of the ground engaging member 310 ([0046]).
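To make the tread-mark analysis of Maley [0046] concrete, the following non-limiting sketch estimates the dominant tread-line angle from a cropped wheel image using a Hough line transform; all thresholds are hypothetical choices, not values from Maley:

```python
import cv2
import numpy as np

def tread_mark_angle(wheel_portion_gray):
    """Estimate the dominant tread-line angle, in degrees from vertical;
    returns None if no line segments are found."""
    edges = cv2.Canny(wheel_portion_gray, 50, 150)
    lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=40,
                            minLineLength=30, maxLineGap=5)
    if lines is None:
        return None
    # Angle of each detected segment, measured from the vertical image axis.
    angles = [np.degrees(np.arctan2(x2 - x1, y2 - y1))
              for x1, y1, x2, y2 in lines[:, 0]]
    return float(np.median(angles))
```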
The computing device 250 may generate a graphic (e.g., a set of lines) representing the first projected path of the machine 110 to overlay on the landscape image captured by the image capture device 170. In generating the graphic representing the first projected path, the computing device 250 may take into account any distortions or other optical aberrations that are present in the landscape image captured by the image capture device 170. The computing device 250 may also consider the position and orientation of the image capture device 170 and the location of the machine 110 relative to the area shown in the landscape image ([0051]). When the machine is in operation, the two cameras mounted on the machine may take images including sections of the two front wheels of the machine. The system may use the images captured by the two cameras to determine a current state or position of the front wheels. In particular, the system may process the images to isolate the portions of the images that depict the wheels. The system may compare the portions that depict the wheels to portions depicting the wheels in the set of calibration images. If a match is found between a portion depicting the wheels in operation and a portion depicting the wheels in the set of calibration images, then the system may determine that the current state of a wheel is the state of the wheel that is associated with the calibration image including the matching portion ([0074]).
Still further, regarding the motivation for combining elements under 35 U.S.C. 103, the Examiner respectfully notes that the above combination is a straightforward, two-way combination of prior art elements: one skilled in the art could incorporate the teachings of Maley into OKADA or, conversely, incorporate the teachings of OKADA into Maley, and either combination would result in the claimed limitations.
Accordingly, one skilled in the art would have been motivated to incorporate the teachings of Maley into OKADA to minimize or eliminate any distortions present in the images due to the position of the cameras and their lenses.
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to have modified the vehicle wheel monitoring system of OKADA by incorporating additional controller instructions/calculation steps as taught by Maley, in order to minimize or eliminate any distortions present in the images due to the position of the cameras and their lenses.
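For clarity of the record only, the following minimal sketch illustrates limitation (A) as the Examiner reads the combination: an actual camera pose (here, a roll angle reported by an IMU, one of the claim's recited alternatives) is used to correct a detected parameter before the operation state is determined. Every function, tolerance, and assumption here is hypothetical, offered only to show the shape of the combined teaching, not either reference's implementation:

```python
def corrected_inclination(phi_detected_deg, camera_roll_deg):
    # Assumption for illustration: a roll of the camera about its optical
    # axis (reported by an IMU) shifts the apparent ellipse inclination by
    # the same amount, so subtracting it yields the corrected parameter.
    return phi_detected_deg - camera_roll_deg

def operation_state(phi_corrected_deg, reference_phi_deg, tol_deg=2.0):
    # Toy state decision: "straight" if the corrected inclination is within
    # a tolerance of the straight-ahead reference, otherwise "steered".
    delta = abs(phi_corrected_deg - reference_phi_deg)
    return "straight" if delta <= tol_deg else "steered"

# Usage: a 3 degree camera roll would otherwise be misread as steering.
assert operation_state(corrected_inclination(48.0, 3.0), 45.0) == "straight"
```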
Regarding claim 2, OKADA as combined above discloses the method according to claim 1, and further discloses:
wherein obtaining the corrected parameter comprises one or more of:
applying an image correction to the image before identifying the at least one visual feature (OKADA [0050-0079] and Maley [0020-0021, 0025, 0035, 0045-0046, 0051, 0074]);
applying a feature correction to the identified feature before detecting the at least one parameter (OKADA [0050-0079] and Maley [0020-0021, 0025, 0035, 0045-0046, 0051, 0074]); or
applying a parameter correction on the detected parameter (OKADA [0050-0079] and Maley [0020-0021, 0025, 0035, 0045-0046, 0051, 0074]).
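As a purely illustrative sketch of where the three alternative corrections recited in claim 2 could sit in a single processing pipeline, consider the following; every helper is a hypothetical stand-in, not code from OKADA or Maley:

```python
def correct_image(image, camera_pose):
    # (i) image correction applied before identifying the visual feature,
    # e.g., undistortion or a perspective warp based on the camera pose.
    return image

def correct_feature(feature, camera_pose):
    # (ii) feature correction applied before detecting the parameter,
    # e.g., re-projecting the identified ellipse points.
    return feature

def correct_parameter(parameter, camera_pose):
    # (iii) correction applied directly to the detected parameter,
    # e.g., offsetting the inclination by the camera roll.
    return parameter

def monitor_wheel(image, camera_pose, identify, detect, classify):
    image = correct_image(image, camera_pose)
    feature = correct_feature(identify(image), camera_pose)
    parameter = correct_parameter(detect(feature), camera_pose)
    return classify(parameter)
```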
Regarding claim 3, OKADA as combined above discloses the method according to claim 1, and further discloses:
wherein determining the operation state of the wheel comprises determining the orientation of the wheel relative to the vehicle (OKADA [0050-0079] and Maley [0020-0021, 0025, 0035, 0045-0046, 0051, 0074]).
Regarding claim 4, OKADA as combined above discloses the method according to claim 1, and further discloses:
wherein determining the operation state of the wheel comprises determining puncture of the wheel (OKADA [0050-0079] and Maley [0020-0021, 0025, 0035, 0045-0046, 0051, 0074]).
Regarding claim 5, OKADA as combined above discloses the method according to claim 1, and further discloses:
wherein determining the operation state of the wheel comprises determining looseness of the wheel (OKADA [0050-0079] and Maley [0020-0021, 0025, 0035, 0045-0046, 0051, 0074]).
Regarding claim 6, OKADA as combined above discloses the method according to claim 1, and further discloses:
wherein said identification step comprises identifying at least one of:
a rim of the wheel (OKADA [0050-0079] and Maley [0020-0021, 0025, 0035, 0045-0046, 0051, 0074]),
at least one optically detectable marking arranged on the wheel (OKADA [0050-0079] and Maley [0020-0021, 0025, 0035, 0045-0046, 0051, 0074]),
a contact region between the wheel and a road surface (OKADA [0050-0079] and Maley [0020-0021, 0025, 0035, 0045-0046, 0051, 0074]),
an outer edge of the tire of the wheel (OKADA [0050-0079] and Maley [0020-0021, 0025, 0035, 0045-0046, 0051, 0074]).
Regarding claim 7, OKADA as combined above discloses the method according to claim 1, and further discloses:
wherein said identification step comprises
fitting an ellipse onto the boundary between the edge of a rim of the wheel or onto a shape scribed by a marking on the wheel (OKADA [0050-0079] and Maley [0020-0021, 0025, 0035, 0045-0046, 0051, 0074]), and the detected parameter is any of the following:
the length of a major and/or minor axis of said ellipse; and/or
an orientation of the major or minor axis of said ellipse (OKADA [0050-0079] and Maley [0020-0021, 0025, 0035, 0045-0046, 0051, 0074]).
OKADA (Figs. 4A-4D and 14A-14D; [0053-0054, 0087]) specifically discloses examples where ellipse detection is applied to a wheel portion of the steered wheel 14. However, ellipse detection may also be applied to, for example, a tire portion of the steered wheel 14. Although in the examples where θR=10° or more the wheel portion of the steered wheel 14 has a partly missing elliptical shape (refer to FIGS. 3B to 3D), an ellipse conforming to the wheel portion of the steered wheel 14 is detected by ellipse detection (refer to FIGS. 4B to 4D). The shape of the detected ellipse differs depending on the steering angle θR of the steered wheel when the image is captured. In detail, as the steering angle θR of the steered wheel increases, the inclination φ formed by the horizontal axis and the major axis of the detected ellipse on the image decreases, and the short side length b of the same increases. On the other hand, even when the steering angle θR of the steered wheel increases, changes in the remaining parameters, namely, the central coordinate (x, y) and the long side length a, of the ellipse are relatively small (refer to FIGS. 4B to 4D).
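Because OKADA observes that the short side length b grows (and the inclination φ shrinks) monotonically with the steering angle θR, a steering angle can be recovered from the fitted ellipse by interpolating against reference data. The following non-limiting sketch uses an invented calibration table; the numbers are placeholders, not values from OKADA:

```python
import numpy as np

# Hypothetical calibration table distilled from reference images:
# steering angle (deg) versus the fitted ellipse's short side length b (px).
CAL_ANGLES = np.array([0.0, 10.0, 20.0, 30.0])
CAL_B = np.array([18.0, 26.0, 37.0, 51.0])  # monotonically increasing

def steering_angle_from_b(b_measured):
    # Linear interpolation; np.interp clamps outside the calibrated range.
    return float(np.interp(b_measured, CAL_B, CAL_ANGLES))

assert steering_angle_from_b(26.0) == 10.0
```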
Regarding claim 8, OKADA as combined above discloses the method according to claim 1, and further discloses:
wherein the vehicle is a tractor for semi-trailers (OKADA [0050-0079] and Maley [0020-0021, 0025, 0035, 0045-0046, 0051, 0074]).
Regarding claim 9, OKADA as combined above discloses the method according to claim 1, and further discloses:
taking further action based on the determined operation state, wherein the further action comprises any one or more of the following:
providing a warning to the driver regarding the operation state (OKADA [0050-0079] and Maley [0020-0021, 0025, 0035, 0045-0046, 0051, 0074]);
providing assistance for navigation or parking by displaying a predicted path of the vehicle (OKADA [0050-0079] and Maley [0020-0021, 0025, 0035, 0045-0046, 0051, 0074]);
transmitting a warning signal to a remote computer regarding the operation state; initiating an emergency maneuver (OKADA [0050-0079] and Maley [0020-0021, 0025, 0035, 0045-0046, 0051, 0074]); and/or
controlling the steering of the vehicle by applying a steering angle correction on the basis of the operation state (OKADA [0050-0079] and Maley [0020-0021, 0025, 0035, 0045-0046, 0051, 0074]).
Regarding claim 10, OKADA as combined above discloses the method according to claim 1, and further discloses:
wherein the wheel comprises a steered wheel and an actual steering angle is determined based on the determined orientation (OKADA [0050-0079] and Maley [0020-0021, 0025, 0035, 0045-0046, 0051, 0074]).
Regarding claim 11, OKADA as combined above discloses the method according to claim 1, and further discloses:
wherein:
the vehicle comprises a tractor-trailer combination (OKADA [0050-0079] and Maley [0020-0021, 0025, 0035, 0045-0046, 0051, 0074]);
the camera is located on one of the tractor and the trailer (OKADA [0050-0079] and Maley [0020-0021, 0025, 0035, 0045-0046, 0051, 0074]); and
acquiring an image of the at least one wheel comprises acquiring an image of a wheel of the other one of the tractor and the trailer (OKADA [0050-0079] and Maley [0020-0021, 0025, 0035, 0045-0046, 0051, 0074]).
Regarding claim 12, OKADA as combined above discloses the method according to claim 11, and further discloses:
wherein:
acquiring an image of said wheel comprises acquiring an image of an unsteered wheel (OKADA [0050-0079] and Maley [0020-0021, 0025, 0035, 0045-0046, 0051, 0074]); and
an articulation angle of the trailer relative to the tractor is determined on the basis of the determined orientation of the wheel (OKADA [0050-0079] and Maley [0020-0021, 0025, 0035, 0045-0046, 0051, 0074]).
Regarding claim 13, OKADA as combined above discloses the method according to claim 1, and further discloses:
wherein the detected visual feature is the outer edge of a tire of the wheel (OKADA [0050-0079] and Maley [0020-0021, 0025, 0035, 0045-0046, 0051, 0074]).
Regarding claim 14, OKADA as combined above discloses the method according to claim 1, and further discloses:
a system for monitoring at least one wheel of a vehicle (OKADA [0050-0079] and Maley [0020-0021, 0025, 0035, 0045-0046, 0051, 0074]),
said system comprising:
a camera for recording images of the at least one wheel, wherein said camera is secured to the vehicle (OKADA [0050-0079] and Maley [0020-0021, 0025, 0035, 0045-0046, 0051, 0074]);
a processing unit for processing said images, wherein said processing unit is in data communication with the camera, said processing unit comprises a computer program having instructions that, when executed on the processing unit (OKADA [0050-0079] and Maley [0020-0021, 0025, 0035, 0045-0046, 0051, 0074]), cause the processing unit to
determining an operation state of the wheel based on said processed images, characterized in that further comprising means for determining an actual position of the camera relative to the wheel, wherein the means for determining the actual position of the camera comprise an image processor connected to the processing unit for determining the actual position of the camera based on images recorded by the camera and visual features identified on said images, and said processing unit comprises a computer program having instructions that, when executed on the processing unit (OKADA [0050-0079] and Maley [0020-0021, 0025, 0035, 0045-0046, 0051, 0074]), cause the processing unit to perform the method according to claim 1.
Regarding claim 15, OKADA as combined above discloses the system according to claim 14, and further discloses:
wherein the means for determining the actual position of the camera comprise at least one of:
a LIDAR device directed to an external environment (OKADA [0050-0079] and Maley [0020-0021, 0025, 0035, 0045-0046, 0051, 0074]); and/or
at least one proximity sensor (OKADA [0050-0079] and Maley [0020-0021, 0025, 0035, 0045-0046, 0051, 0074]);
an inertial measurement unit comprising one or more of:
at least one accelerometer (OKADA [0050-0079] and Maley [0020-0021, 0025, 0035, 0045-0046, 0051, 0074]),
at least one gyroscope (OKADA [0050-0079] and Maley [0020-0021, 0025, 0035, 0045-0046, 0051, 0074]), and
at least one magnetometer (OKADA [0050-0079] and Maley [0020-0021, 0025, 0035, 0045-0046, 0051, 0074]).
Regarding claim 16, OKADA as combined above discloses the system according to claim 14, and further discloses:
wherein:
the vehicle comprises a tractor-trailer combination (OKADA [0050-0079] and Maley [0020-0021, 0025, 0035, 0045-0046, 0051, 0074]);
the camera is located on one of the tractor and the trailer (OKADA [0050-0079] and Maley [0020-0021, 0025, 0035, 0045-0046, 0051, 0074]); and
acquiring an image of the at least one wheel comprises acquiring an image of a wheel of the other one of the tractor and the trailer (OKADA [0050-0079] and Maley [0020-0021, 0025, 0035, 0045-0046, 0051, 0074]).
Regarding claim 17, OKADA as combined above discloses the system according to claim 14, and further discloses:
a vehicle, having at least two wheels (OKADA [0050-0079] and Maley [0020-0021, 0025, 0035, 0045-0046, 0051, 0074]), characterized by comprising the system according to claim 14.
Conclusion
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to Ruben Picon-Feliciano, whose telephone number is (571) 272-4938. The examiner can normally be reached Monday-Thursday, 11:30 am-7:30 pm ET.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Lindsay M. Low can be reached on (571)272-1196. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/RUBEN PICON-FELICIANO/Examiner, Art Unit 3747
/GRANT MOUBRY/Primary Examiner, Art Unit 3747