DETAILED ACTION
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Arguments
The amendment filed 12/31/2025 has been entered. Claims 1-17 are pending.
Applicant's arguments, see page 9 of Remarks, filed 12/31/2025, with respect to the rejection of claims 1-17 under 35 U.S.C. 112(b) have been fully considered and are persuasive. The 112(b) rejection of claims 1-17 has been withdrawn.
Applicant's arguments filed 12/31/2025 with regard to the rejections of claims 1-3, 12, and 14-17 under 35 U.S.C. 103 have been fully considered but they are not persuasive.
Applicant argues, on pages 10-11 of Remarks, that Liang does not teach combining the five-sense information with the converted five-sense data. Because the claim does not require a specific method for combining the five-sense information with the converted five-sense data, the step of combining is interpreted as presenting a combination of both the converted five-sense data and the five-sense information on a display to a user. Since Liang teaches processing visible data and non-visible data to be presented to a user (Liang, [0118] “images of cameras 802 and 804 are video images providing a real-time visual information regarding the surroundings of vehicle 100 ... When presenting image content on a display for a human observer, additional or alternative processing may be required to convert non-visible image data into a representation for presentation to a human or processor.”), the device of Liang is capable of presenting/combining both visible data and converted non-visible data on a display. Therefore, the rejections of claims 1-3, 12, and 14-17 over Liang et al. (US 2020/0314333 A1) in view of Watabe et al. (US 2024/0149458) are maintained.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-3, 12, and 14-17 are rejected under 35 U.S.C. 103 as being unpatentable over Liang et al. (US 2020/0314333 A1), in view of Watabe et al. (US 2024/0149458).
Regarding claim 1, Liang teaches:
An information processing device (Fig. 3A; [0036] “communication system 300”) comprising:
detection information reception circuitry ([0049] “sensor processor 340”) to receive five-sense information ([0118] “the images of cameras 802 and 804 are video images providing a real-time visual information regarding the surroundings of vehicle 100. While cameras 802 and 804 are generally described to operate in the visible spectrum”) and extended five-sense information ([0118] “additional or alternative processing may be required to convert non-visible image data”), the five-sense information being information at an apparatus ([0118] “the images of cameras 802 and 804 are video images providing a real-time visual information regarding the surroundings of vehicle 100.”), perceivable by at least one of five senses of human ([0118] “cameras 802 and 804 are generally described to operate in the visible spectrum”), the extended five-sense information being information at the apparatus ([0118] “it should be appreciated that other wavelengths (e.g., infrared, microwave, etc.) and/or mechanical waves (e.g., ultrasonic) may also be utilized”; [0026] discloses vehicle/apparatus 100 includes infrared sensors, radio frequency sensors, and ultrasonic sensors that measure extended five-sense information), imperceptible by the five senses of human ([0118] “non-visible image data”) ...;
conversion circuitry to convert the extended five-sense information into converted five-sense data, the converted five-sense data representing information perceivable by at least one of the five senses of human ([0118] “When presenting image content on a display for a human observer, additional or alternative processing may be required to convert non-visible image data into a representation for presentation to a human or processor.”);
combining circuitry to combine the converted five-sense data with the five-sense information corresponding to the converted five-sense data ([0118] “It should be appreciated that the images of cameras 802 and 804 are video images providing a real-time visual information regarding the surroundings of vehicle 100. While cameras 802 and 804 are generally described to operate in the visible spectrum, it should be appreciated that other wavelengths (e.g., infrared, microwave, etc.) and/or mechanical waves (e.g., ultrasonic) may also be utilized. When presenting image content on a display for a human observer, additional or alternative processing may be required to convert non-visible image data into a representation for presentation to a human or processor.” – It can be understood that converted non-visible image data and visible image data captured by cameras 802 and 804 are combined to be presented to the user); and
representation information transmission circuitry to transmit combined data to a representation device ([0050] “The vehicle control system 348 may receive processed sensor information from the sensor processor 340 and determine to control an aspect of the vehicle 100. Controlling an aspect of the vehicle 100 may include presenting information via one or more display devices 372 associated with the vehicle”), the representation device being to transfer the combined data to a user to enable the user to perceive the combined data by at least one of the five senses ([0050]; [0117] “whereby visual information is received for processing and/or presentation on a display, such as at least one of vehicle operational display 420A-N, one or more auxiliary displays 424 (e.g., configured to present and/or display information segregated from the operational display 420, entertainment applications, movies, music, etc.), and/or a heads-up display 434 (e.g., configured to display any information”; [0118] “When presenting image content on a display for a human observer, additional or alternative processing may be required to convert non-visible image data into a representation for presentation to a human or processor.”), the combined data being a result of combining performed by the combining circuitry ([0118] “It should be appreciated that the images of cameras 802 and 804 are video images providing a real-time visual information regarding the surroundings of vehicle 100. While cameras 802 and 804 are generally described to operate in the visible spectrum, it should be appreciated that other wavelengths (e.g., infrared, microwave, etc.) and/or mechanical waves (e.g., ultrasonic) may also be utilized. When presenting image content on a display for a human observer, additional or alternative processing may be required to convert non-visible image data into a representation for presentation to a human or processor.”).
Liang does not specifically teach the apparatus being controlled according to a manipulation performed by a user at a location apart from the apparatus.
However, in the same field of endeavor, Watabe teaches:
an apparatus being controlled according to a manipulation performed by a user at a location apart from the apparatus (Fig. 1; [0061] “the operator easily remotely operates the robot.”; [0099] “The operator Us remotely operates the robot 2 by moving the hand or fingers wearing the controller 6 while viewing an image displayed on the HMD 5.”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Liang, to manipulate the apparatus by a user at a location apart from the apparatus, as taught by Watabe, in order to cause the robot to perform work even when the user is not present at the work area.
Regarding claim 2, Liang further teaches wherein the extended five-sense information is information relating to a sound that is imperceptible by humans ([0118] “it should be appreciated that ... mechanical waves (e.g., ultrasonic) may also be utilized.” – Ultrasonic waves are known to be sound waves with frequencies above the human hearing limit).
Regarding claim 3, Liang further teaches wherein the extended five-sense information is a result of detection of light having a wavelength outside wavelengths of visible light ([0118] “it should be appreciated that other wavelengths (e.g., infrared, microwave, etc.) and/or mechanical waves (e.g., ultrasonic) may also be utilized. When presenting image content on a display for a human observer, additional or alternative processing may be required to convert non-visible image data into a representation for presentation to a human or processor.”).
Regarding claim 12, Liang does not specifically teach motion information reception circuitry to obtain pieces of motion information, the pieces of the motion information representing positions and actions of parts of the user and corresponding to a plurality of apparatus driving devices of the apparatus; apparatus control circuitry to generate, using the motion information, pieces of control information for controlling operations of the plurality of apparatus driving devices of the apparatus; control information transmission circuitry to transmit the control information to the apparatus; and remote control delay adjustment circuitry to adjust delays to cause the operations to be performed simultaneously in the plurality of apparatus driving devices, the operations being based on pieces of the control information generated using pieces of the motion information obtained at a same time.
However, Watabe teaches:
motion information reception circuitry to obtain pieces of motion information, the pieces of the motion information representing positions and actions of parts of the user ([0115] “The controller 6 detects orientation, movement of each finger, and movement of the hand using the sensor 61, and transmits the detected operator state information to the robot remote operation control device 3.”; [0116] “The sensor 61 detects operator arm information (operator sensor value, operator state information), which is information on the posture and position of the operator's arm such as orientation, movement of each finger, and movement of the hand, and outputs the detected operator arm information to the control unit 62. The operator arm information includes information on the entire human arm, such as hand tip position/posture information, angle information on each finger, elbow position/posture information, and movement tracking information on each part.”) and corresponding to a plurality of apparatus driving devices (Fig. 1; [0099] “The robot 2 also includes gripping portions 222 (222a , 222b)”) of the apparatus ([0146] “The intention estimation unit 33 classifies the posture of the arm of the robot 2 including the gripping portion 222 by classifying the postures of the arm of the operator on the basis of operator sensor values of the controller 6. The intention estimation unit 33 estimates the intention of a motion that the operator desires the robot to perform on the basis of a classification result.”; [0162]-[0163]);
apparatus control circuitry to generate, using the motion information, pieces of control information for controlling operations of the plurality of apparatus driving devices of the apparatus ([0161] “The intention estimation unit 33 estimates the operator's intention using at least one of line-of-sight information, operator arm information, and head motion information. The intention estimation unit 33 may estimate the operator's motion intention using the environment sensor value as well. Subsequently, the gripping method determination unit 34 calculates a remote motion command to the robot 2 on the basis of the estimation result.”; [0162] “The control unit 21 calculates a driving command value for stable gripping on the basis of a remote motion command value calculated by the robot remote operation control device 3.”);
control information transmission circuitry to transmit the control information to the apparatus ([0162] “The control unit 21 controls the driving unit 22 in accordance with the driving command value to drive the gripping unit of the robot 2”); and
remote control delay adjustment circuitry to adjust delays to cause the operations to be performed simultaneously in the plurality of apparatus driving devices ([0179] “The gripping method determination unit 34 calculates the amount of deviation using, for example, the delay time stored in the storage unit 37. Subsequently, the gripping method determination unit 34 corrects the amount of deviation between the positions of the hand and fingers of the operator and the position of the gripping portion of the robot”), the operations being based on pieces of the control information generated using pieces of the motion information obtained at a same time ([0178]; [0179] “the gripping method determination unit 34 calculates the amount of deviation between the positions of the hand and fingers of the operator and the position of the gripping portion of the robot. The storage unit 37 stores, for example, a delay time or the like, which is a previously measured time required from an instruction to the motion of the driving unit 22 . The gripping method determination unit 34 calculates the amount of deviation using, for example, the delay time stored in the storage unit 37. Subsequently, the gripping method determination unit 34 corrects the amount of deviation between the positions of the hand and fingers of the operator and the position of the gripping portion of the robot.”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Liang, to obtain pieces of motion information, the pieces of the motion information representing positions and actions of parts of the user and corresponding to a plurality of apparatus driving devices of the apparatus, to generate, using the motion information, pieces of control information for controlling operations of the plurality of apparatus driving devices of the apparatus, to transmit the control information to the apparatus; and to adjust delays to cause the operations to be performed simultaneously in the plurality of apparatus driving devices, the operations being based on pieces of the control information generated using pieces of the motion information obtained at a same time, as taught by Watabe, in order to cause the apparatus to perform work with high accuracy without the need for accurate positioning by a user, as stated by Watabe in [0058].
Regarding claim 14, Liang teaches:
A manipulation system (Fig. 3A) comprising:
an apparatus (Fig. 1; [0024] “vehicle 100”);
an information processing device (Fig. 3A; [0036] “communication system 300”) to transmit control information to the apparatus ([0050] “The vehicle control system 348 may receive processed sensor information from the sensor processor 340 and determine to control an aspect of the vehicle 100. Controlling an aspect of the vehicle 100 may include presenting information via one or more display devices 372 associated with the vehicle, sending commands to one or more computing devices 368 associated with the vehicle, and/or controlling a driving operation of the vehicle.”), the control information being for controlling the apparatus ([0050] “Controlling an aspect of the vehicle 100 may include presenting information via one or more display devices 372 associated with the vehicle, sending commands to one or more computing devices 368 associated with the vehicle, and/or controlling a driving operation of the vehicle.”)...; and
a representation device to transfer, to the user, information perceivable by the user by five senses ([0117] “whereby visual information is received for processing and/or presentation on a display, such as at least one of vehicle operational display 420A-N, one or more auxiliary displays 424”), wherein
the apparatus includes
detection information acquisition circuitry to detect five-sense information ([0118] “the images of cameras 802 and 804 are video images providing a real-time visual information regarding the surroundings of vehicle 100. While cameras 802 and 804 are generally described to operate in the visible spectrum”) and extended five-sense information ([0118] “it should be appreciated that other wavelengths (e.g., infrared, microwave, etc.) and/or mechanical waves (e.g., ultrasonic) may also be utilized”; [0026] discloses vehicle/apparatus 100 includes infrared sensors, radio frequency sensors, and ultrasonic sensors that measure extended five-sense information), the five-sense information being information at the apparatus and perceivable by at least one of five senses of human ([0118] “cameras 802 and 804 are generally described to operate in the visible spectrum”), the extended five-sense information being information at the apparatus, imperceptible by the five senses of human ([0118] “non-visible image data”), and
the information processing device includes
detection information reception circuitry ([0049] “sensor processor 340”) to receive the five-sense information ([0118] “the images of cameras 802 and 804 are video images providing a real-time visual information regarding the surroundings of vehicle 100. While cameras 802 and 804 are generally described to operate in the visible spectrum”) and the extended five-sense information from the apparatus ([0118] “ additional or alternative processing may be required to convert non-visible image data”),
conversion circuitry to convert the extended five-sense information into converted five-sense data, the converted five-sense data representing information perceivable by at least one of the five senses of human ([0118] “When presenting image content on a display for a human observer, additional or alternative processing may be required to convert non-visible image data into a representation for presentation to a human or processor.”),
combining circuitry to combine the converted five-sense data with the five-sense information corresponding to the converted five-sense data ([0118] “It should be appreciated that the images of cameras 802 and 804 are video images providing a real-time visual information regarding the surroundings of vehicle 100. While cameras 802 and 804 are generally described to operate in the visible spectrum, it should be appreciated that other wavelengths (e.g., infrared, microwave, etc.) and/or mechanical waves (e.g., ultrasonic) may also be utilized. When presenting image content on a display for a human observer, additional or alternative processing may be required to convert non-visible image data into a representation for presentation to a human or processor.” – It can be understood that converted non-visible image data and visible image data captured by cameras 802 and 804 are combined to be presented to the user), and
representation information transmission circuitry to transmit combined data to the representation device ([0050]; [0117] “whereby visual information is received for processing and/or presentation on a display, such as at least one of vehicle operational display 420A-N, one or more auxiliary displays 424 (e.g., configured to present and/or display information segregated from the operational display 420, entertainment applications, movies, music, etc.), and/or a heads-up display 434 (e.g., configured to display any information”; [0118] “When presenting image content on a display for a human observer, additional or alternative processing may be required to convert non-visible image data into a representation for presentation to a human or processor.”), the combined data being a result of combining performed by the combining circuitry ([0118] “It should be appreciated that the images of cameras 802 and 804 are video images providing a real-time visual information regarding the surroundings of vehicle 100. While cameras 802 and 804 are generally described to operate in the visible spectrum, it should be appreciated that other wavelengths (e.g., infrared, microwave, etc.) and/or mechanical waves (e.g., ultrasonic) may also be utilized. When presenting image content on a display for a human observer, additional or alternative processing may be required to convert non-visible image data into a representation for presentation to a human or processor.”).
Liang does not specifically teach the control information being for controlling the apparatus according to a manipulation performed by a user at a location apart from the apparatus.
However, in the same field of endeavor, Watabe teaches:
the control information being for controlling the apparatus according to a manipulation performed by a user at a location apart from the apparatus (Fig. 1; [0061] “the operator easily remotely operates the robot.”; [0099] “The operator Us remotely operates the robot 2 by moving the hand or fingers wearing the controller 6 while viewing an image displayed on the HMD 5.”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Liang, to manipulate the apparatus by a user at a location apart from the apparatus, as taught by Watabe, in order to cause the robot to perform work even when the user is not present at the work area.
Regarding claim 15, Liang teaches:
An information processing method for use in an information processing device (Fig. 3A; [0036] “communication system 300”), the information processing method comprising:
receiving five-sense information ([0118] “the images of cameras 802 and 804 are video images providing a real-time visual information regarding the surroundings of vehicle 100. While cameras 802 and 804 are generally described to operate in the visible spectrum”) and extended five-sense information ([0118] “additional or alternative processing may be required to convert non-visible image data”), the five-sense information being information at an apparatus ([0118] “the images of cameras 802 and 804 are video images providing a real-time visual information regarding the surroundings of vehicle 100.”), perceivable by at least one of five senses of human ([0118] “cameras 802 and 804 are generally described to operate in the visible spectrum”), the extended five-sense information being information at the apparatus ([0118] “it should be appreciated that other wavelengths (e.g., infrared, microwave, etc.) and/or mechanical waves (e.g., ultrasonic) may also be utilized”; [0026] discloses vehicle/apparatus 100 includes infrared sensors, radio frequency sensors, and ultrasonic sensors that measure extended five-sense information), imperceptible by the five senses of human ([0118] “non-visible image data”) ...;
converting the extended five-sense information into converted five-sense data, the converted five-sense data representing information perceivable by at least one of the five senses of human ([0118] “When presenting image content on a display for a human observer, additional or alternative processing may be required to convert non-visible image data into a representation for presentation to a human or processor.”);
generating combined data by combining the converted five-sense data with the five-sense information corresponding to the converted five-sense data ([0118] “It should be appreciated that the images of cameras 802 and 804 are video images providing a real-time visual information regarding the surroundings of vehicle 100. While cameras 802 and 804 are generally described to operate in the visible spectrum, it should be appreciated that other wavelengths (e.g., infrared, microwave, etc.) and/or mechanical waves (e.g., ultrasonic) may also be utilized. When presenting image content on a display for a human observer, additional or alternative processing may be required to convert non-visible image data into a representation for presentation to a human or processor.” – It can be understood that converted non-visible image data and visible image data captured by cameras 802 and 804 are combined to be presented to the user); and
transmitting the combined data to a representation device, the representation device being to transfer the combined data to the user to enable the user to perceive the combined data by at least one of the five senses ([0050]; [0117] “whereby visual information is received for processing and/or presentation on a display, such as at least one of vehicle operational display 420A-N, one or more auxiliary displays 424 (e.g., configured to present and/or display information segregated from the operational display 420, entertainment applications, movies, music, etc.), and/or a heads-up display 434 (e.g., configured to display any information”; [0118] “When presenting image content on a display for a human observer, additional or alternative processing may be required to convert non-visible image data into a representation for presentation to a human or processor.”).
Liang does not specifically teach the apparatus being controlled according to a manipulation performed by a user at a location apart from the apparatus.
However, in the same field of endeavor, Watabe teaches:
an apparatus being controlled according to a manipulation performed by a user at a location apart from the apparatus (Fig. 1; [0061] “the operator easily remotely operates the robot.”; [0099] “The operator Us remotely operates the robot 2 by moving the hand or fingers wearing the controller 6 while viewing an image displayed on the HMD 5.”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Liang, to manipulate the apparatus by a user at a location apart from the apparatus, as taught by Watabe, in order to cause the robot to perform work even when the user is not present at the work area.
Regarding claim 16, Liang further teaches:
generating control information for controlling the apparatus ([0050] “Controlling an aspect of the vehicle 100 may include presenting information via one or more display devices 372 associated with the vehicle, sending commands to one or more computing devices 368 associated with the vehicle, and/or controlling a driving operation of the vehicle.”); and
transmitting the control information to the apparatus ([0050] “The vehicle control system 348 may receive processed sensor information from the sensor processor 340 and determine to control an aspect of the vehicle 100. Controlling an aspect of the vehicle 100 may include presenting information via one or more display devices 372 associated with the vehicle, sending commands to one or more computing devices 368 associated with the vehicle, and/or controlling a driving operation of the vehicle.”).
Regarding claim 17, Liang teaches:
A non-transitory readable storage medium storing a program causing a computer system to perform: (Fig. 3A; [0036] “communication system 300”)
receiving five-sense information ([0118] “the images of cameras 802 and 804 are video images providing a real-time visual information regarding the surroundings of vehicle 100. While cameras 802 and 804 are generally described to operate in the visible spectrum”) and extended five-sense information ([0118] “additional or alternative processing may be required to convert non-visible image data”), the five-sense information being information at an apparatus ([0118] “the images of cameras 802 and 804 are video images providing a real-time visual information regarding the surroundings of vehicle 100.”), perceivable by at least one of five senses of human ([0118] “cameras 802 and 804 are generally described to operate in the visible spectrum”), the extended five-sense information being information at the apparatus ([0118] “it should be appreciated that other wavelengths (e.g., infrared, microwave, etc.) and/or mechanical waves (e.g., ultrasonic) may also be utilized”; [0026] discloses vehicle/apparatus 100 includes infrared sensors, radio frequency sensors, and ultrasonic sensors that measure extended five-sense information), imperceptible by the five senses of human ([0118] “non-visible image data”) ...;
converting the extended five-sense information into converted five-sense data, the converted five-sense data representing information perceivable by at least one of the five senses of human ([0118] “When presenting image content on a display for a human observer, additional or alternative processing may be required to convert non-visible image data into a representation for presentation to a human or processor.”);
generating combined data by combining the converted five-sense data with the five-sense information corresponding to the converted five-sense data ([0118] “It should be appreciated that the images of cameras 802 and 804 are video images providing a real-time visual information regarding the surroundings of vehicle 100. While cameras 802 and 804 are generally described to operate in the visible spectrum, it should be appreciated that other wavelengths (e.g., infrared, microwave, etc.) and/or mechanical waves (e.g., ultrasonic) may also be utilized. When presenting image content on a display for a human observer, additional or alternative processing may be required to convert non-visible image data into a representation for presentation to a human or processor.” – It can be understood that converted non-visible image data and visible image data captured by cameras 802 and 804 are combined to be presented to the user); and
transmitting the combined data to a representation device, the representation device being to transfer the combined data to the user to enable the user to perceive the combined data by at least one of the five senses ([0050]; [0117] “whereby visual information is received for processing and/or presentation on a display, such as at least one of vehicle operational display 420A-N, one or more auxiliary displays 424 (e.g., configured to present and/or display information segregated from the operational display 420, entertainment applications, movies, music, etc.), and/or a heads-up display 434 (e.g., configured to display any information”; [0118] “When presenting image content on a display for a human observer, additional or alternative processing may be required to convert non-visible image data into a representation for presentation to a human or processor.”).
Liang does not specifically teach the apparatus being controlled according to a manipulation performed by a user at a location apart from the apparatus.
However, in the same field of endeavor, Watabe teaches:
an apparatus being controlled according to a manipulation performed by a user at a location apart from the apparatus (Fig. 1; [0061] “the operator easily remotely operates the robot.”; [0099] “The operator Us remotely operates the robot 2 by moving the hand or fingers wearing the controller 6 while viewing an image displayed on the HMD 5.”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Liang, to manipulate the apparatus by a user at a location apart from the apparatus, as taught by Watabe, in order to cause the robot to perform work even when the user is not present at the work area.
Claim 4 is rejected under 35 U.S.C. 103 as being unpatentable over Liang et al. (US 2020/0314333 A1), in view of Watabe et al. (US 2024/0149458), and further in view of Bujoreanu et al. (RO 134710 A1).
Regarding claim 4, neither Liang nor Watabe specifically teaches wherein the extended five-sense information is information representing at least one of vibration, pressure, airflow, or an inclination difficult to be detected by humans.
However, Bujoreanu teaches wherein the extended five-sense information is information representing vibration difficult to be detected by humans (page 2, 8th paragraph: “Detects the low level of noise produced by the failure of an element, assembly or subassembly because it detects noise or vibration from the earliest stage, at a low intensity, imperceptible to the human ear.”; page 2, 9th paragraph: “- Detects noise and vibration in the field of infrasound and ultrasound, undetectable by the human ear.”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Liang, in view of Watabe, to include information representing vibration difficult to be detected by humans as the extended five-sense information, as taught by Bujoreanu, in order to notify or warn the user in advance of possible malfunctioning of an element in response to noise and vibration detected at a level imperceptible to the human ear, as stated by Bujoreanu on page 1, 4th paragraph.
Claim 5 is rejected under 35 U.S.C. 103 as being unpatentable over Liang et al. (US 2020/0314333 A1), in view of Watabe et al. (US 2024/0149458), and further in view of Siewert et al. (EP 3141329 A1).
Regarding claim 5, neither Liang nor Watabe specifically teaches wherein the extended five-sense information is information relating to a smell that is difficult to be perceived by humans.
However, Siewert teaches wherein the extended five-sense information is information relating to a smell that is imperceptible by humans ([0065] “wherein input data, which represent measured values of physical variables recorded by a sensor, which cannot be perceived by the human eye, hearing or odor sense, are also converted into output data that can be perceived by the human eye or hearing.”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Liang, in view of Watabe, to include information relating to a smell that is difficult to be perceived by humans as the extended five-sense information, as taught by Siewert, in order to convert such information into output data that can be perceived by the human eye or hearing, as stated by Siewert in [0065].
Claim 6 is rejected under 35 U.S.C. 103 as being unpatentable over Liang et al. (US 2020/0314333 A1), in view of Watabe et al. (US 2024/0149458), and further in view of Costanzo et al. (US 2017/0087363 A1).
Regarding claim 6, neither Liang nor Watabe specifically teaches wherein the extended five-sense information is information related to a taste that is difficult to be perceived by humans.
However, Costanzo teaches wherein the extended five-sense information is information related to a taste that is imperceptible by humans ([0011] “ system and method to sense and monitor substances within the oral cavity of a subject who is unable to detect or identify such substances due to an impaired sense of taste.”; [0014] “ The gustatory implant system generates taste maps by detecting tastes with an array of chemical sensors”; [0016] “In other embodiments, signals from the intra-oral device are transmitted wirelessly to an external electronic device, particularly a handheld device such as a smart phone, watch, tablet or laptop computer, or to a desk-top or mainframe computer, which is able to process signal data, display real-time information, and trigger alerts if excessive tastants are ingested or undesired or unsafe substances are detected.”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Liang, in view of Watabe, to include information related to a taste that is difficult to be perceived by humans as the extended five-sense information, as taught by Costanzo, in order to trigger alerts if excessive tastants are ingested or undesired or unsafe substances are detected, as stated by Costanzo in [0016].
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
Shimomura et al. (US 2021/0162603 A1) teaches a manipulation device configured to receive an input of an operational instruction of an operator in order to remotely operate a robot to cause the robot to perform work.
Border (US 2016/0015470 A1) teaches providing assistance to medical professionals during the performance of medical procedures through the use of technologies facilitated through a head-worn computer, which combines visible and converted non-visible lighted view of material.
THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to NHI Q BUI whose telephone number is (571)272-3962. The examiner can normally be reached Monday - Friday: 8:00am-5:00pm EST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, KHOI TRAN, can be reached at (571) 272-6919. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/NHI Q BUI/Primary Examiner, Art Unit 3656