DETAILED ACTION
This communication is a Non-Final Office Action on the Merits. Claims 1-10 as originally filed are currently pending and have been considered as follows:
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Specification
The specification is objected to because of the following informalities:
“car wash lane 11” in ¶23 & ¶27 should read “car wash lane 10”
Appropriate correction is required.
Claim Objections
Claim 5 is objected to because of the following informalities:
“control module” in Claim 5 should read “control unit”
Appropriate correction is required.
Claim Interpretation
The following is a quotation of 35 U.S.C. 112(f):
(f) Element in Claim for a Combination. – An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.
The following is a quotation of pre-AIA 35 U.S.C. 112, sixth paragraph:
An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.
This application includes one or more claim limitations that do not use the word “means,” but are nonetheless being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, because the claim limitation(s) uses a generic placeholder that is coupled with functional language without reciting sufficient structure to perform the recited function and the generic placeholder is not preceded by a structural modifier. Such claim limitation(s) is/are:
“recognition unit” in claim(s) 1 & 2
“positioning unit” in claim(s) 1 & 4
“control unit” in claim(s) 1, 3, & 5
Because this/these claim limitation(s) is/are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, it/they is/are being interpreted to cover the corresponding structure described in the specification as performing the claimed function, and equivalents thereof.
If applicant does not intend to have this/these limitation(s) interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, applicant may: (1) amend the claim limitation(s) to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph (e.g., by reciting sufficient structure to perform the claimed function); or (2) present a sufficient showing that the claim limitation(s) recite(s) sufficient structure to perform the claimed function so as to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph.
Claim Rejections - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claims 1, 7, and 8 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.
Claim 1 recites the limitation “the parking area.” There is insufficient antecedent basis for this limitation in the claim, as “the parking area” is not previously recited in the claim. Therefore, the claim is indefinite.
Claim 7 recites “the corresponding vehicle type information is compared and read from the database through the vehicle feature.” This limitation is indefinite because it is unclear what is being “compared” (i.e., compared to what), and it is further unclear how the “read from the database” operation is performed “through the vehicle feature.” As written, the step boundaries and required operations are not reasonably ascertainable. Therefore, the claim is indefinite.
Claim 8 recites: “after obtaining the vehicle feature, it is determined whether the vehicle feature exists in the database…”. The pronoun “it” lacks clear antecedent basis and does not identify what performs the “determining” step (i.e., the control unit, the recognition unit, a human operator, etc.). As a result, it is unclear who/what is carrying out the recited determination. Therefore, the claim is indefinite.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 1, 2, and 6 are rejected under 35 U.S.C. 103 as being unpatentable over Kim (US Pub. No. 20200189532) in view of Akeel (US Pub. No. 20190047145), in further view of Masahiko (JP Pat. No. 4022331).
As per Claim 1, Kim discloses a vehicle washing apparatus, comprising:
at least one car wash lane, configured for one vehicle to pass and park; (as per Fig. 1)
a recognition unit, located close to the car wash lane for identifying the vehicle that is going to enter the car wash lane and obtaining a vehicle feature; (as per “The data collection unit 110 may determine from a camera 31 whether a vehicle 10 enters a vehicle wash region of a vehicle 10 as a wash target. The camera 31 may capture an image of at least the vehicle wash region 20… The data collection unit 110 may acquire an image for determination of the data and the outer appearance (e.g., additional accessories) of the vehicle” in ¶30)
a positioning unit, located close to the car wash lane for locating the vehicle parked in the car wash lane to obtain parking data; (as per “The data collection unit 110 may acquire an image for determination of the data and the outer appearance (e.g., additional accessories) of the vehicle” in ¶30, as per “whether the vehicle 10 is disposed in the vehicle wash region 20 may be detected based on information acquired through at least one of the camera 31 and the sensor 32 (S301)” in ¶61, as per Fig. 1)
at least one washing unit, located close to the car wash lane and having a robot arm, the robot arm having a working end equipped with a washing structure for washing the vehicle; (as per “washing the vehicle through a robot unit including at least one vehicle washing device” in Claim 1, as per “The vehicle washing device may include a roller 450, a solution nozzle module 460, and an air nozzle 470. The cameras 440, the roller 450, the solution nozzle module 460, and the air nozzle 470 may be disposed on the bottom surface of the body 410 of the robot unit 400” in ¶55)
a control unit, electrically connected to the recognition unit, the positioning unit and the washing unit, (as per “The vehicle wash control device 100 may include a data collection unit 110, a data storage unit 120, a communication unit 130, a vehicle wash controller 140, an audio processor 150, a display unit 160, a user input unit 170, and a fluid controller 180” in ¶29, as per “The body 410 may be connected to the robot holder 310 of the moving unit through the connection unit 350, and may be connected to a signal line of the control line CL, for control of the robot unit 400, through the connection unit 350. The body 410 may have therein a processor for control of the overall operation of the robot unit 400 including the remaining components such as the legs 420, the ground parts 430, and the cameras 440 according to a control signal from the vehicle wash controller 140, and an actuator for moving each of the legs 420” in ¶49)
wherein when the vehicle is parked in the parking area (as per “determining, by a vehicle controller, information on an outer appearance of a vehicle based on information acquired from at least one of a data collection unit configured to acquire an image of the vehicle positioned in a vehicle wash region from at least a camera” in ¶12, as per “whether the vehicle 10 is disposed in the vehicle wash region 20 may be detected based on information acquired through at least one of the camera 31 and the sensor 32 (S301)” in ¶61)
the parking data obtained by the positioning unit is set as initial data, (as per “The data collection unit 110 may further acquire position information of the vehicle 10 in the vehicle wash region 20 from a sensor 32. In this case, two or more cameras 31 may be disposed to have different photographing ranges” in ¶30, as per “The vehicle wash controller 140 may acquire information on the outer appearance of the vehicle based on data collected through at least one of the data collection unit 110, the data storage unit 120, the communication unit 130, and the user input unit 170” in ¶34)
wherein when the vehicle is parked in the parking area (as per “determining, by a vehicle controller, information on an outer appearance of a vehicle based on information acquired from at least one of a data collection unit configured to acquire an image of the vehicle positioned in a vehicle wash region from at least a camera” in ¶12, as per “whether the vehicle 10 is disposed in the vehicle wash region 20 may be detected based on information acquired through at least one of the camera 31 and the sensor 32 (S301)” in ¶61)
Kim fails to expressly disclose:
the control unit being further electrically connected to a database, the control unit selectively executing a modeling process and a washing process;
the modeling process is selected and executed, the washing unit is pulled manually to move along a contour of the vehicle,
a movement trajectory of the washing unit is recorded to generate a washing path,
the initial data and the washing path are integrated into vehicle type information to be stored in the database;
wherein when the vehicle is parked in the parking area and the washing process is selected and executed, the vehicle feature obtained by the recognition unit is compared with the database to determine whether there is the corresponding vehicle type information in the database,
if yes, the corresponding vehicle type information is read, the parking data obtained by the positioning unit is compared with the initial data of the vehicle type information to generate a compensation parameter, the washing path is adjusted based on the compensation parameter, and the washing unit is controlled to move along the adjusted washing path to wash the vehicle.
Akeel discloses vision-guided robot path programming, comprising:
the control unit being further electrically connected to a database, the control unit selectively executing a modeling process and a washing process; (as per “By identifying the object geometry as similar to one in the memory, the robot can modify a memorized path and process requirements; it can then operate on the object autonomously without having to be re-programmed for the object” in ¶10, as per “The recorded path points may be used for repetitive execution or for future uses on identical targets” in ¶31)
the modeling process is selected and executed, the washing unit is pulled manually to move along a contour of the vehicle, (as per “Lead-through teach programming: The robot, or a geometrically identical mechanism, is moved manually or by a force sensing handle along the desired path as points along the path are recorded and process operation location identified. The resulting program is then generated as if the motion was commanded through the teach pendant and then played back to perform a processing operation” in ¶6, as per “c. Teaching the robot a nominal path to trace desired locations along the approximate form” in ¶73-¶76)
a movement trajectory of the washing unit is recorded to generate a washing path, (as per “Points along the path, representing the absolute or incremental values of the position of each robot joint, are recorded together with associated process commands” in ¶3, as per “During this process the path is recorded, and its associated program could be fully executed along the way, including activating all process functions” in ¶31, as per “(a) moving the robot to point kd, and recording coordinates of point kd;” in Claim 1)
the initial data and the washing path are integrated into vehicle type information to be stored in the database; (as per “By identifying the object geometry as similar to one in the memory, the robot can modify a memorized path and process requirements; it can then operate on the object autonomously without having to be re-programmed for the object” in ¶10, as per “follow a desired path on an actual object to move a process tool relative to the actual object based on prior knowledge of a nominal object and a nominal path image on the nominal object” in Claim 1, as per “The recorded path points may be used for repetitive execution or for future uses on identical targets” in ¶31)
the washing process is selected and executed (as per “the robot recognizes the starting point by vision guidance, then traces the recorded path and executes a process by activating the process trigger points along the path” in ¶38) the vehicle feature obtained by the recognition unit is compared with the database to determine whether there is the corresponding vehicle type information in the database, (as per “identifying a starting point 1 d on the actual object by comparing features in the visual image of the actual object with data concerning these features on the nominal object;” in Claim 1, as per “(c) identifying point (k+1)d on the actual object by comparing features in the kth visual image of the actual object with data concerning these features on the nominal object;” in Claim 1, as per “By identifying the object geometry as similar to one in the memory, the robot can modify a memorized path and process requirements; it can then operate on the object autonomously without having to be re-programmed for the object” in ¶10)
In this way, Akeel operates to teach a robot a desired path by manual lead-through while recording points along the path for later program generation/playback (¶6). Like Kim, Akeel is concerned with robotic systems (¶31).
It would have been obvious for one of ordinary skill in the art before the effective filing date to have modified the system of Kim with lead-through teaching and recorded-path generation as taught by Akeel to enable another standard means of creating a robot motion path for repeatable automated execution (¶6). Such modification also allows the combined system to store and reuse the taught path for future operations to improve repeatability and automation (¶31).
Kim and Akeel fail to expressly disclose:
the parking data obtained by the positioning unit is compared with the initial data of the vehicle type information to generate a compensation parameter, the washing path is adjusted based on the compensation parameter, and the washing unit is controlled to move along the adjusted washing path to wash the vehicle.
Masahiko discloses a car washing machine, comprising:
the parking data obtained by the positioning unit (as per “The travel position given by the position detector 11a is output as the vehicle end position” in ¶21) is compared with the initial data of the vehicle type information to generate a compensation parameter, (as per “11d is a deviation detection unit, which compares the vehicle end position detected by the vehicle end detection unit 11c in the step of storing the vehicle shape in the vehicle shape storage unit 11b with the vehicle end position detected in the subsequent process. A deviation between the two is detected and deviation data Z is output” in ¶22) the washing path is adjusted based on the compensation parameter, and the washing unit is controlled to move along the adjusted washing path to wash the vehicle. (as per “Reference numeral 11e denotes a position adjusting unit that adjusts the position to be a control target of each processing device with respect to the vehicle body surface by adding the shift data Z given by the shift detection unit 11d to the vehicle shape data S stored in the vehicle shape storage unit 11b” in ¶22, as per “The position 9s of the processing device represented by the dotted line is a standard position in a state where there is no deviation between the traveling position at the time of acquisition of the vehicle shape data S and the traveling position in the process being executed, whereas the position 9z represented by the solid line is an adjustment position set in consideration of the deviation data Z when there is a deviation in the traveling position” in ¶24, as per “…is moved up and down to control the car washing process while maintaining the distance between the upper surface brush 3 and the upper surface blower nozzle 9 serving as the processing device with respect to the upper surface of the vehicle body substantially constant” in ¶19)
In this way, Masahiko operates to compare a detected vehicle end position with a previously stored vehicle end position to detect deviation data and output deviation data (¶22). Like Kim and Akeel, Masahiko is concerned with robotic systems (¶24).
It would have been obvious for one of ordinary skill in the art before the effective filing date to have modified the system(s) of Kim and Akeel with the deviation detection as taught by Masahiko to enable another standard means of generating compensation data for wash control (¶22). Such modification also allows the combined system to adjust a target processing position based on deviation data to account for run-to-run parking variation (¶24).
As per Claim 2, the combination of Kim, Akeel, and Masahiko teaches or suggests all limitations of Claim 1. Kim further discloses wherein the recognition unit is a camera for obtaining the vehicle feature by means of image recognition. (as per “a data collection unit configured to acquire an image of the vehicle positioned in a vehicle wash region from at least a camera, and a communication unit configured to communicate with a smart device or an external vehicle information server. The method further includes determining, by the vehicle wash controller, a moving path for washing the vehicle based on the determined information of the outer appearance” in ¶12, as per “When the vehicle is detected, vehicle data information may be acquired via image-based vehicle type recognition using an image captured by the camera 31, a connection to a vehicle information server through the communication unit 130, or a manual input using the input unit 170 (S302)” in ¶62)
As per Claim 6, the combination of Kim, Akeel, and Masahiko teaches or suggests all limitations of Claim 1. Kim further discloses wherein the washing structure has a body, the body has a plurality of spray holes, and the body further has a plurality of movable brushes. (as per “the robot unit 400 may include a body 410, a plurality of legs 420, ground parts 430 that are disposed at respective ends of the plurality of legs 420, and one or more cameras 440” in ¶48, as per “The vehicle washing device may include a roller 450, a solution nozzle module 460, and an air nozzle 470. The cameras 440, the roller 450, the solution nozzle module 460, and the air nozzle 470 may be disposed on the bottom surface of the body 410 of the robot unit 400” in ¶55, as per “The roller 450 may be used for foam washing and may remove pollutants from the surface of a vehicle body via rotatory motion in a contact manner. To this end, the surface of the roller 450 may be formed of a soft spongy body or fiber aggregate” in ¶56)
Claims 3, 7, and 10 are rejected under 35 U.S.C. 103 as being unpatentable over Kim (US Pub. No. 20200189532) in view of Akeel (US Pub. No. 20190047145), in view of Masahiko (JP Pat. No. 4022331), in further view of Liu (CN Pub. No. 115158237).
As per Claim 3, the combination of Kim, Akeel, and Masahiko teaches or suggests all limitations of Claim 2. Kim, Akeel, and Masahiko fail to expressly disclose wherein the vehicle feature is a license plate number, the license plate number is associated with the corresponding vehicle type information of the database, when the control unit executes the washing process, it determines and selects the vehicle type information based on the license plate number.
Liu discloses an intelligent car washing system, wherein the vehicle feature is a license plate number, the license plate number is associated with the corresponding vehicle type information of the database, when the control unit executes the washing process, it determines and selects the vehicle type information based on the license plate number. (as per “a license plate identification system for obtaining the license plate image, obtaining license plate information based on the license plate image, and based on the license plate information, determining the vehicle information” in Abstract, as per “can utilize preset database query or matching identification of the license plate number, obtaining vehicle type” in P5¶2)
In this way, Liu operates to obtain license plate information from a license plate image and determine vehicle information based on the license plate information (Abstract). Like Kim, Akeel, and Masahiko, Liu is concerned with robotic systems (P5¶2).
It would have been obvious for one of ordinary skill in the art before the effective filing date to have modified the system(s) of Kim, Akeel, and Masahiko with license-plate-based database matching as taught by Liu to enable another standard means of selecting corresponding stored vehicle information during a wash process (P5¶2). Such modification also allows the combined system to identify vehicles using a common vehicle feature already present on vehicles to support automated selection/control (Abstract).
As per Claim 7, Kim discloses a vehicle washing apparatus, comprising:
wherein the vehicle type information contains initial data of the position of the vehicle parked in a car wash lane, (as per “whether the vehicle 10 is disposed in the vehicle wash region 20 may be detected based on information acquired through at least one of the camera 31 and the sensor 32 (S301)” in ¶61, as per “The data collection unit 110 may further acquire position information of the vehicle 10 in the vehicle wash region 20 from a sensor 32. In this case, two or more cameras 31 may be disposed to have different photographing ranges” in ¶30)
wherein when the vehicle to be washed is going to enter the car wash lane, (as per “determining, by a vehicle controller, information on an outer appearance of a vehicle based on information acquired from at least one of a data collection unit configured to acquire an image of the vehicle positioned in a vehicle wash region from at least a camera” in ¶12, as per “whether the vehicle 10 is disposed in the vehicle wash region 20 may be detected based on information acquired through at least one of the camera 31 and the sensor 32 (S301)” in ¶61)
obtaining parking data of the vehicle parked in the car wash lane, (as per “The data collection unit 110 may further acquire position information of the vehicle 10 in the vehicle wash region 20 from a sensor 32. In this case, two or more cameras 31 may be disposed to have different photographing ranges” in ¶30, as per “The vehicle wash controller 140 may acquire information on the outer appearance of the vehicle based on data collected through at least one of the data collection unit 110, the data storage unit 120, the communication unit 130, and the user input unit 170” in ¶34)
controlling the washing unit to move along the adjusted washing path to wash the vehicle to be washed. (as per “washing the vehicle through a robot unit including at least one vehicle washing device” in Claim 1, as per “The vehicle washing device may include a roller 450, a solution nozzle module 460, and an air nozzle 470. The cameras 440, the roller 450, the solution nozzle module 460, and the air nozzle 470 may be disposed on the bottom surface of the body 410 of the robot unit 400” in ¶55, as per “The roller 450 may be used for foam washing and may remove pollutants from the surface of a vehicle body via rotatory motion in a contact manner. To this end, the surface of the roller 450 may be formed of a soft spongy body or fiber aggregate” in ¶56)
Kim fails to expressly disclose:
creating vehicle type information for a plurality of vehicles of different types and storing the vehicle type information in a database,
a washing unit is pulled manually to move along a contour of the vehicle to generate a washing path;
creating a vehicle feature for each of the plurality of vehicles, wherein the vehicle feature is associated with the corresponding vehicle type information of the database;
the vehicle feature of the vehicle to be washed is obtained, and the corresponding vehicle type information is compared and read from the database through the vehicle feature;
comparing the parking data with the initial data of the vehicle type information to generate a compensation parameter;
adjusting the washing path of the vehicle type information based on the compensation parameter, and
Akeel discloses vision-guided robot path programming, comprising:
creating vehicle type information for a plurality of vehicles of different types and storing the vehicle type information in a database, (as per “By identifying the object geometry as similar to one in the memory, the robot can modify a memorized path and process requirements; it can then operate on the object autonomously without having to be re-programmed for the object” in ¶10, as per “follow a desired path on an actual object to move a process tool relative to the actual object based on prior knowledge of a nominal object and a nominal path image on the nominal object” in Claim 1, as per “The recorded path points may be used for repetitive execution or for future uses on identical targets” in ¶31)
a washing unit is pulled manually to move along a contour of the vehicle to generate a washing path; (as per “Lead-through teach programming: The robot, or a geometrically identical mechanism, is moved manually or by a force sensing handle along the desired path as points along the path are recorded and process operation location identified. The resulting program is then generated as if the motion was commanded through the teach pendant and then played back to perform a processing operation” in ¶6, as per “c. Teaching the robot a nominal path to trace desired locations along the approximate form” in ¶73-¶76)
the vehicle feature of the vehicle to be washed is obtained, and the corresponding vehicle type information is compared and read from the database through the vehicle feature; (as per “identifying a starting point 1 d on the actual object by comparing features in the visual image of the actual object with data concerning these features on the nominal object;” in Claim 1, as per “(c) identifying point (k+1)d on the actual object by comparing features in the kth visual image of the actual object with data concerning these features on the nominal object;” in Claim 1, as per “By identifying the object geometry as similar to one in the memory, the robot can modify a memorized path and process requirements; it can then operate on the object autonomously without having to be re-programmed for the object” in ¶10)
In this way, Akeel operates to teach a robot a desired path by manual lead-through while recording points along the path for later program generation/playback (¶6). Like Kim, Akeel is concerned with robotic systems (¶31).
It would have been obvious for one of ordinary skill in the art before the effective filing date to have modified the system of Kim with lead-through teaching and recorded-path generation as taught by Akeel to enable another standard means of creating a robot motion path for repeatable automated execution (¶6). Such modification also allows the combined system to store and reuse the taught path for future operations to improve repeatability and automation (¶31).
Kim and Akeel fail to expressly disclose:
creating a vehicle feature for each of the plurality of vehicles, wherein the vehicle feature is associated with the corresponding vehicle type information of the database;
comparing the parking data with the initial data of the vehicle type information to generate a compensation parameter;
adjusting the washing path of the vehicle type information based on the compensation parameter, and
Masahiko discloses a car washing machine, comprising:
comparing the parking data (as per “The travel position given by the position detector 11a is output as the vehicle end position” in ¶21) with the initial data of the vehicle type information to generate a compensation parameter; (as per “11d is a deviation detection unit, which compares the vehicle end position detected by the vehicle end detection unit 11c in the step of storing the vehicle shape in the vehicle shape storage unit 11b with the vehicle end position detected in the subsequent process. A deviation between the two is detected and deviation data Z is output” in ¶22)
adjusting the washing path of the vehicle type information based on the compensation parameter, (as per “Reference numeral 11e denotes a position adjusting unit that adjusts the position to be a control target of each processing device with respect to the vehicle body surface by adding the shift data Z given by the shift detection unit 11d to the vehicle shape data S stored in the vehicle shape storage unit 11b” in ¶22, as per “The position 9s of the processing device represented by the dotted line is a standard position in a state where there is no deviation between the traveling position at the time of acquisition of the vehicle shape data S and the traveling position in the process being executed, whereas the position 9z represented by the solid line is an adjustment position set in consideration of the deviation data Z when there is a deviation in the traveling position” in ¶24, as per “…is moved up and down to control the car washing process while maintaining the distance between the upper surface brush 3 and the upper surface blower nozzle 9 serving as the processing device with respect to the upper surface of the vehicle body substantially constant” in ¶19)
In this way, Masahiko operates to compare a detected vehicle end position with a previously stored vehicle end position to detect deviation data and output deviation data (¶22). Like Kim and Akeel, Masahiko is concerned with robotic systems (¶24).
It would have been obvious to one of ordinary skill in the art before the effective filing date to have modified the system(s) of Kim and Akeel with the deviation detection as taught by Masahiko to enable another standard means of generating compensation data for wash control (¶22). Such modification also allows the combined system to adjust a target processing position based on deviation data to account for run-to-run parking variation (¶24).
Kim, Akeel, and Masahiko fail to expressly disclose:
creating a vehicle feature for each of the plurality of vehicles, wherein the vehicle feature is associated with the corresponding vehicle type information of the database;
Liu discloses an intelligent car washing system, comprising:
creating a vehicle feature for each of the plurality of vehicles, wherein the vehicle feature is associated with the corresponding vehicle type information of the database; (as per “a license plate identification system for obtaining the license plate image, obtaining license plate information based on the license plate image, and based on the license plate information, determining the vehicle information” in Abstract, as per “can utilize preset database query or matching identification of the license plate number, obtaining vehicle type” in P5¶2)
In this way, Liu operates to obtain license plate information from a license plate image and determine vehicle information based on the license plate information (Abstract). Like Kim, Akeel, and Masahiko, Liu is concerned with robotic systems (P5¶2).
It would have been obvious to one of ordinary skill in the art before the effective filing date to have modified the system(s) of Kim, Akeel, and Masahiko with license-plate-based database matching as taught by Liu to enable another standard means of selecting corresponding stored vehicle information during a wash process (P5¶2). Such modification also allows the combined system to identify vehicles using a common vehicle feature already present on vehicles to support automated selection/control (Abstract).
As per Claim 10, the combination of Kim, Akeel, Masahiko, and Liu teaches or suggests all limitations of Claim 7. Kim further discloses wherein the washing unit is a robot arm (as per “the moving unit may be configured in the form of a robot arm or the like and may not be limited to a particular type as long as the moving unit enables the robot unit 400 to move as desired within the vehicle wash region 20” in ¶45), the robot arm has a working end equipped with a washing structure, the washing structure has a body, the body has a plurality of spray holes, and the body further has a plurality of movable brushes. (as per “the robot unit 400 may include a body 410, a plurality of legs 420, ground parts 430 that are disposed at respective ends of the plurality of legs 420, and one or more cameras 440” in ¶48, as per “[t]he vehicle washing device may include a roller 450, a solution nozzle module 460, and an air nozzle 470. The cameras 440, the roller 450, the solution nozzle module 460, and the air nozzle 470 may be disposed on the bottom surface of the body 410 of the robot unit 400” in ¶55, as per “The roller 450 may be used for foam washing and may remove pollutants from the surface of a vehicle body via rotatory motion in a contact manner. To this end, the surface of the roller 450 may be formed of a soft spongy body or fiber aggregate” in ¶56)
Claim(s) 4 is/are rejected under 35 U.S.C. 103 as being unpatentable over Kim (US Pub. No. 20200189532) in view of Akeel (US Pub. No. 20190047145) in view of Masahiko (JP Pat. No. 4022331) in further view of Fan (CN Pub. No. 210027361).
As per Claim 4, the combination of Kim, Akeel, and Masahiko teaches or suggests all limitations of Claim 1. Kim, Akeel, and Masahiko fail to expressly disclose wherein the positioning unit has a plurality of sensing modules for obtaining the parking data by sensing the position of wheels of the vehicle.
Fan discloses a car position detecting system, wherein the positioning unit has a plurality of sensing modules for obtaining the parking data by sensing the position of wheels of the vehicle. (as per “will head front wheel at the position between the two side of the car position and car washing machine are respectively provided with a sensor,” in P1, as per “signal sent by the first detecting device and the second detecting device to be washed vehicle is parked in the limited range,” in P2, as per “set on the washing position, connected with the main control device, when detecting the vehicle parking to be washed, sending a fourth detecting device of parking signal to the main control device.” in P2)
In this way, Fan operates to detect whether a vehicle is parked within a limited range using sensors and send a parking signal to a main control device (P2). Like Kim, Akeel, and Masahiko, Fan is concerned with robotic systems (P1).
It would have been obvious to one of ordinary skill in the art before the effective filing date to have modified the system(s) of Kim, Akeel, and Masahiko with multi-sensor parking detection as taught by Fan to enable another standard means of obtaining parking data and confirming proper vehicle placement (P2). Such modification also allows the combined system to improve robustness of parking detection by using multiple sensor locations to confirm vehicle position (P1).
Claim(s) 5 is/are rejected under 35 U.S.C. 103 as being unpatentable over Kim (US Pub. No. 20200189532) in view of Akeel (US Pub. No. 20190047145) in view of Masahiko (JP Pat. No. 4022331) in further view of Shimizu (JP Pub. No. 2019108058).
As per Claim 5, the combination of Kim, Akeel, and Masahiko teaches or suggests all limitations of Claim 1. Kim further discloses wherein the washing unit further has a camera module for photographing the vehicle to obtain a parking image. (as per “whether the vehicle 10 is disposed in the vehicle wash region 20 may be detected based on information acquired through at least one of the camera 31 and the sensor 32 (S301)” in ¶61, as per “The cameras 440 may be disposed to acquire at least an image of a vehicle part that is currently being washed” in ¶52)
Kim fails to expressly disclose when the control unit executes the modeling process, the parking image is set as an initial image and integrated into the vehicle type information, when the control unit executes the washing process, the parking image is compared with the initial image to generate a secondary compensation parameter for the control module to readjust the washing path based on the secondary compensation parameter, and the washing unit is controlled to move along the adjusted washing path to wash the vehicle.
See Claim 1 for teachings of Akeel. Akeel further discloses when the control unit executes the modeling process. (as per “Lead-through teach programming: The robot, or a geometrically identical mechanism, is moved manually or by a force sensing handle along the desired path as points along the path are recorded and process operation location identified. The resulting program is then generated as if the motion was commanded through the teach pendant and then played back to perform a processing operation” in ¶6, as per “c. Teaching the robot a nominal path to trace desired locations along the approximate form” in ¶73-¶76)
In this way, Akeel operates to teach a robot a desired path by manual lead-through while recording points along the path for later program generation/playback (¶6). Like Kim and Masahiko, Akeel is concerned with robotic systems (¶31).
It would have been obvious to one of ordinary skill in the art before the effective filing date to have modified the system(s) of Kim and Masahiko with lead-through teaching and recorded-path generation as taught by Akeel to enable another standard means of creating a robot motion path for repeatable automated execution (¶6). Such modification also allows the combined system to store and reuse the taught path for future operations to improve repeatability and automation (¶31).
Kim, Akeel, and Masahiko fail to expressly disclose wherein the parking image is set as an initial image and integrated into the vehicle type information, when the control unit executes the washing process, the parking image is compared with the initial image to generate a secondary compensation parameter for the control module to readjust the washing path based on the secondary compensation parameter, and the washing unit is controlled to move along the adjusted washing path to wash the vehicle.
Shimizu discloses a car washing device, wherein the parking image is set as an initial image and integrated into the vehicle type information, (as per “[a]nd detects that the car has stopped at the stop position given by the car shape sensor 9 and the entering sensor 10 (3) Yes Take an image (4). In this image, as shown in FIG. 4B, the main body frame 1 including the rear portion of the car stopped at the stop position is photographed” in P2, as per “As shown in FIG. 4C, an image of the rear of the vehicle is cut out to create a basic image” in P2, as per “and the basic image and the scaled image are stored in the data storage unit 26 as a template image” in P2) when the control unit executes the washing process (as per “FIG. 6 is a flowchart showing a process of detecting the movement of a car during car wash” in P2) the parking image is compared with the initial image to generate a secondary compensation parameter for the control module to readjust the washing path based on the secondary compensation parameter (as per “, the image processing unit 25 matches the input image captured by the camera device 17 after the start of car washing with the template image, the data storage unit 26 stores various data, and the car shape data” in P2, as per “and as a result of performing template matching processing with each template image Tv, the template image Tv having the highest similarity to the input image Iv is determined, and the car being washed is It is judged whether it moved or not.” in P3) and the washing unit is controlled to move along the adjusted washing path to wash the vehicle. (as per “The car wash control unit 18 executes the traveling motor 8 and the brush 3 of the main body frame 1 according to a sequence programmed based on the signals from the traveling encoder 7, the car shape sensor 9, the entering sensor 10, the leaving sensor 11 and the camera device 17” in P3)
In this way, Shimizu operates to perform template matching between an input image and a stored template image to judge whether a vehicle moved during washing (P3). Like Kim, Akeel, and Masahiko, Shimizu is concerned with robotics (P2).
It would have been obvious to one of ordinary skill in the art before the effective filing date to have modified the system(s) of Kim, Akeel, and Masahiko with template-image matching movement detection as taught by Shimizu to enable another standard means of detecting vehicle movement/deviation using camera images (P3). Such modification also allows the combined system to support more accurate wash control by comparing current images to stored template images to detect deviation (P2).
Claim(s) 8 is/are rejected under 35 U.S.C. 103 as being unpatentable over Kim (US Pub. No. 20200189532) in view of Akeel (US Pub. No. 20190047145) in view of Masahiko (JP Pat. No. 4022331) in view of Liu (CN Pub. No. 115158237) in further view of Swedberg (NPL Title: RFID Helps Car Wash Customers Cruise Through Lines, Year: 2008).
As per Claim 8, the combination of Kim, Akeel, Masahiko, and Liu teaches or suggests all limitations of Claim 7. Kim, Akeel, Masahiko, and Liu fail to expressly disclose wherein after obtaining the vehicle feature, it is determined whether the vehicle feature exists in the database, if yes, the vehicle to be washed is allowed to enter the car wash lane.
Swedberg discloses an RFID system for recognizing prepaid members and providing automatic washes, wherein after obtaining the vehicle feature, it is determined whether the vehicle feature exists in the database, and if yes, the vehicle to be washed is allowed to enter the car wash lane. (as per “an AWID RFID reader positioned 8 feet high on the driver’s side captures the ID number encoded to the label’s tag. The interrogator transmits that data via a wired connection to the back-end system, where the ID number is linked to a customer’s payment plan. Cruz Thru’s software system then instructs the gate to open” in ¶7, as per “The cameras photographed both plates then transmitted the images to the company’s back-end system, which compared the photographed license numbers with those stored in the database. If the system discovered a match, it displayed the related information on a screen for the vehicle operator, greeting that person by name; if it did not, then that indicated the vehicle’s license plate number had been rejected” in ¶4)
In this way, Swedberg operates to capture an ID number from a vehicle-mounted RFID tag, link that ID to a customer record in a back-end system, and instruct the gate to open upon a match (¶7). Like Kim, Akeel, Masahiko, and Liu, Swedberg is concerned with automated car wash systems (¶4).
It would have been obvious to one of ordinary skill in the art before the effective filing date to have modified the system(s) of Kim, Akeel, Masahiko, and Liu with RFID database matching as taught by Swedberg to enable another standard means of controlling access to a car wash via a feature (¶7). Such modification also allows for a faster and less intrusive method for allowing customers through the car wash (¶3).
Claim(s) 9 is/are rejected under 35 U.S.C. 103 as being unpatentable over Kim (US Pub. No. 20200189532) in view of Akeel (US Pub. No. 20190047145) in view of Masahiko (JP Pat. No. 4022331) in view of Liu (CN Pub. No. 115158237) in further view of Shimizu (JP Pub. No. 2019108058).
As per Claim 9, the combination of Kim, Akeel, Masahiko, and Liu teaches or suggests all limitations of Claim 7. Kim further discloses:
wherein the vehicle type information further contains an initial image of the vehicle parked in the car wash lane, (as per “whether the vehicle 10 is disposed in the vehicle wash region 20 may be detected based on information acquired through at least one of the camera 31 and the sensor 32 (S301)” in ¶61, as per “The cameras 440 may be disposed to acquire at least an image of a vehicle part that is currently being washed” in ¶52)
a parking image of the vehicle parked in the car wash lane is obtained, (as per “The cameras 440 may be disposed to acquire at least an image of a vehicle part that is currently being washed” in ¶52)
Kim and Akeel fail to expressly disclose:
after the washing path of the vehicle type information is adjusted based on the compensation parameter,
the parking image is compared with the initial image to generate a secondary compensation parameter,
after the washing path is readjusted based on the secondary compensation parameter,
the washing unit is controlled to move along the adjusted washing path to wash the vehicle to be washed.
See Claim 7 for teachings of Masahiko. Masahiko further discloses:
after the washing path of the vehicle type information is adjusted based on the compensation parameter, (as per “Reference numeral 11e denotes a position adjusting unit that adjusts the position to be a control target of each processing device with respect to the vehicle body surface by adding the shift data Z given by the shift detection unit 11d to the vehicle shape data S stored in the vehicle shape storage unit 11b” in ¶22, as per “The position 9s of the processing device represented by the dotted line is a standard position in a state where there is no deviation between the traveling position at the time of acquisition of the vehicle shape data S and the traveling position in the process being executed, whereas the position 9z represented by the solid line. The runlineThis is an adjustment position set in consideration of the deviation data Z when there is a deviation in the position” in ¶24, as per “is moved up and down to control the car washing process while maintaining the distance between the upper surface brush 3 and the upper surface blower nozzle 9 serving as the processing device with respect to the upper surface of the vehicle body substantially constant.)” in ¶19)
In this way, Masahiko operates to compare a detected vehicle end position with a previously stored vehicle end position to detect deviation data and output deviation data (¶22). Like Kim, Akeel, and Liu, Masahiko is concerned with robotic systems (¶24).
It would have been obvious to one of ordinary skill in the art before the effective filing date to have modified the system(s) of Kim, Akeel, and Liu with the deviation detection as taught by Masahiko to enable another standard means of generating compensation data for wash control (¶22). Such modification also allows the combined system to adjust a target processing position based on deviation data to account for run-to-run parking variation (¶24).
Kim, Akeel, Masahiko, and Liu fail to expressly disclose:
the parking image is compared with the initial image to generate a secondary compensation parameter,
after the washing path is readjusted based on the secondary compensation parameter,
the washing unit is controlled to move along the adjusted washing path to wash the vehicle to be washed.
Shimizu discloses a car washing device, comprising:
the parking image is compared with the initial image to generate a secondary compensation parameter, (as per “[a]nd detects that the car has stopped at the stop position given by the car shape sensor 9 and the entering sensor 10 (3) Yes Take an image (4). In this image, as shown in FIG. 4B, the main body frame 1 including the rear portion of the car stopped at the stop position is photographed” in P2, as per “As shown in FIG. 4C, an image of the rear of the vehicle is cut out to create a basic image” in P2, as per “and the basic image and the scaled image are stored in the data storage unit 26 as a template image” in P2)
after the washing path is readjusted based on the secondary compensation parameter, (as per “, the image processing unit 25 matches the input image captured by the camera device 17 after the start of car washing with the template image, the data storage unit 26 stores various data, and the car shape data” in P2, as per “and as a result of performing template matching processing with each template image Tv, the template image Tv having the highest similarity to the input image Iv is determined, and the car being washed is It is judged whether it moved or not.” in P3)
the washing unit is controlled to move along the adjusted washing path to wash the vehicle to be washed. (as per “The car wash control unit 18 executes the traveling motor 8 and the brush 3 of the main body frame 1 according to a sequence programmed based on the signals from the traveling encoder 7, the car shape sensor 9, the entering sensor 10, the leaving sensor 11 and the camera device 17” in P3)
In this way, Shimizu operates to perform template matching between an input image and a stored template image to judge whether a vehicle moved during washing (P3). Like Kim, Akeel, Masahiko, and Liu, Shimizu is concerned with robotics (P2).
It would have been obvious to one of ordinary skill in the art before the effective filing date to have modified the system(s) of Kim, Akeel, Masahiko, and Liu with template-image matching movement detection as taught by Shimizu to enable another standard means of detecting vehicle movement/deviation using camera images (P3). Such modification also allows the combined system to support more accurate wash control by comparing current images to stored template images to detect deviation (P2).
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
Paolozzi (US Pub. No. 20230123504) discloses methods for controlled cleaning of vehicles.
Bao (CN Pub. No. 105205468) discloses a vehicle washing device, system and method capable of automatically identifying vehicle types.
Melton (WO Pub. No. 2011039542) discloses programming a robot.
Mayer (EP Pub. No. 4180281) discloses a feature model and feature model-based control of a washing system.
Groh (DE Pub. No. 102021205145) discloses a vehicle wash facility having a footprint for a vehicle to be cleaned and method.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to TYLER R ROBARGE whose telephone number is (703)756-5872. The examiner can normally be reached Monday - Friday, 8:00 am - 5:00 pm EST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Ramon Mercado can be reached on (571) 270-5744. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/T.R.R./Examiner, Art Unit 3658
/Ramon A. Mercado/Supervisory Patent Examiner, Art Unit 3658