Authorization for Internet Communications
The examiner encourages Applicant to submit an authorization to communicate with the examiner via the Internet by making the following statement (from MPEP 502.03):
“Recognizing that Internet communications are not secure, I hereby authorize the USPTO to communicate with the undersigned and practitioners in accordance with 37 CFR 1.33 and 37 CFR 1.34 concerning any subject matter of this application by video conferencing, instant messaging, or electronic mail. I understand that a copy of these communications will be made of record in the application file.”
Please note that the above statement can only be submitted via Central FAX, regular postal mail, or EFS-Web (form PTO/SB/439).
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
Examiner Notes
The examiner cites particular columns and line numbers in the references as applied to the claims below for the convenience of the applicant. Although the specified citations are representative of the teachings in the art and are applied to the specific limitations within the individual claims, other passages and figures may apply as well. It is respectfully requested that, in preparing responses, the applicant fully consider each reference in its entirety as potentially teaching all or part of the claimed invention, as well as the context of the passage as taught by the prior art or disclosed by the examiner.
Election/Restrictions
Claims 15-20 are withdrawn from further consideration pursuant to 37 CFR 1.142(b) as being drawn to a nonelected invention, there being no allowable generic or linking claim. Election was made without traverse in the reply filed on June 3, 2025.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claim(s) 1, 5-7, 8, 12-14, 21, and 25-26 are rejected under 35 U.S.C. 103 as being unpatentable over Kang et al. (U.S. PG PUB 2009/0099693) in view of Chen et al. (U.S. PG PUB 2020/0001471) and Chang et al. (U.S. PG PUB 2022/0066760).
Regarding claim 1, Kang teaches receiving, by a real-time control layer of a real-time robotics control framework (see ¶[0053] “in the emotion action expression control system, the emotional action expression/actuation controller 300 receives the emotion information created by the emotion engine 200 is provided to the emotional action expression/actuation controller 300 and recorded in an input log in step S110.” Note: the emotion action expression control system is a real-time control layer because it responds to stimulus applied to the robot, which happens in real time; see ¶[0033] and ¶[0006]),
real-time control code that defines one or more custom actions and that includes respective names of multiple software parts (see ¶[0023] “In particular, the present invention enables reduction of the weight of an entire system and transplantation to a specific emotion system as well as another emotion system, by configuring an embedded system with a library or an application program interface module for expression and control of emotion action in association with an emotion engine based on an embedded operating system.” See ¶[0035] “The emotional action expression/actuation control unit 300 is an embedded system for expressing and controlling emotion action in association with an emotion engine based on an embedded operating system, and includes a library or an application program interface.” Note: the custom action is the emotional action, and the respective software parts are the library or application program interface),
wherein each software part defines a respective group of one or more of multiple software interfaces (see ¶[0035] “The emotional action expression/actuation control unit 300 is an embedded system for expressing and controlling emotion action in association with an emotion engine based on an embedded operating system, and includes a library or an application program interface” Note: The software part is a library and a library contains multiple software interfaces);
receiving, by a real-time hardware abstraction layer of the real-time robotics control framework (see Fig. 2, 300. Note: it is the hardware abstraction layer because it is the interface between the hardware unit (i.e., actuator) and the emotion engine; see ¶[0040] and ¶[0046] “FIG. 4 is a view illustrating functional relations for calling the internal resources or controlling external hardware units according to the determined action expression by the action expresser 320.” Note: it is real time because it responds to external stimulus; see ¶[0016] “extracting characteristic information and creating emotion information of the robot with respect to an internal or external stimulus applied to the robot using a plurality of sensors;” see ¶[0019] “selecting a target actuator expressing the motion of the robot depending on the action expression and determining control type of the target actuator;”), custom hardware configuration data for a robot (see ¶[0019] “selecting a target actuator expressing the motion of the robot depending on the action expression and determining control type of the target actuator” and ¶[0043] “Then, the data resolver 302 performs a data resolving function of loading the emotion platform profile and detecting the emotion property from the received emotion information. In this case, the emotion platform profile includes information about the actuators of the emotional action expression control system such as a robot for the emotional action expression and information about specifically actuated ranges of the actuators and the control type of the control command.” Note: the emotion platform profile is the configuration of the actuator (i.e., hardware)),
wherein the custom hardware configuration data specifies a mapping between (i) the multiple software interfaces belonging to multiple software modules that each correspond to a respective robotic hardware element of the robot and (ii) the multiple software parts (see ¶[0036] “The management resource unit 500 includes an action map 510 providing mapping information including emotion based behavior information, the action expression and a control command, I/O log data 520 including important information created in an input/output process, environment setting data 530 including the environment setting information, and a resource file 540 including an action file and an action script that enable expression and execution of actions in units of files, and a sound file for expressing sound effects and voice information.”);
controlling, by the real-time robotics control framework and based on executing the real-time control code (Note: the system operates in real time because it responds to stimulus applied to the robot, which happens in real time; see ¶[0033] and ¶[0006]; see ¶[0023] “In particular, the present invention enables reduction of the weight of an entire system and transplantation to a specific emotion system as well as another emotion system, by configuring an embedded system with a library or an application program interface module for expression and control of emotion action in association with an emotion engine based on an embedded operating system.”), the robot to perform the one or more custom actions using the respective robotic hardware elements (see ¶[0023] “Furthermore, the present invention enables optimization of an interaction between a person and a robot based on an emotion, by expression of a natural emotion action suitable for change in emotion in an emotion system such as an emotion robot and an intelligent robot through a hardware-based device for expression of a faithful emotion, an internal system for control of the hardware-based device, and a technology for organic connection between them.”).
Kang does not expressly disclose the following; however, Chen teaches
establishing, by the real-time robotics control framework, one or more control channels (see ¶[0052] “the ability to configure the related data store element for digital input, the element key of the data store element, the element path of the data store element, the number of channels/bits of the digital input, an ability to configure the related data store element for digital output, the element key of the data store element, the element path of the data store element, and/or the number of channels/bits of the digital output.”) in shared memory resources according to the mapping between the multiple software parts and the multiple software interfaces defined in the custom hardware configuration data (see ¶[0023] “In some embodiments, data store 102 may be configured to hold some or all of the data the robotic control process needs to share between objects. Data maps in the data store may be scoped by key and value types. Some of the data in data store 102 may be templated and can hold any key or value type. Data Stores may be nested and accessed using predefined paths. These concepts are discussed in further detail hereinbelow.”), wherein the shared memory resources are accessed by both the real-time control layer and the real-time hardware abstraction layer to pass information between multiple software parts and multiple software interfaces through the one or more control channels (see ¶[0077] “In some embodiments, the control system may provide a common interface for inter process communication for data transfer. Accordingly data transfer layers (e.g., EtherCAT, shared memory, DDS, etc.) may be implementations of a common interface. The common interface may be configured to support mapping of manipulator joints (e.g. Actin, etc.) to individual servos through the hardware data transfer layer. The common interface may support the transfer of data from hardware layers to the application data store (e.g., Actin, etc.) through filters that handle units conversion, calibration scaling/offsets, and any user defined filter. The common interface may support type conversion and be flexible to accommodate different serialization implementations.”);
executing, by the real-time hardware abstraction layer of the real-time robotics control framework, each software module in a separate process of the real-time hardware abstraction layer (see ¶[0030] “In some embodiments, one or more threads 210 may be configured to run various other modules. As discussed above, the data used may be thread safe and the individual threads may have different update rates, dependencies, and/or priorities. Any number of threads 210 may be used and threads 210 may be accessible via a shared memory layer. In some embodiments, threads may be explicitly defined in a language such as WL. Callbacks may be retrieved from data store 202 and called serially.”).
Hence, it would have been obvious to one of ordinary skill in the art before the effective filing date to modify the teachings of Kang by adapting Chen to allow communication between the graphical user interface and the plurality of controllers (see ¶[0008] of Chen).
Kang and Chen do not expressly disclose the following; however, Chang teaches wherein access to the shared memory resources is synchronized such that the multiple software modules executing at the real-time hardware abstraction layer perform a read operation only after a write operation is performed by the real-time control code (see ¶[0109] “The dispatch circuit 240 receives decoded or partially decoded instructions. The dispatch circuit 240 also conducts data dependency checks for both the maps buffer 105 and the kernel buffers 125, such as checking for read after write operations (e.g., if a load operation is not complete, it will stall a MAC circuit 190).”).
Hence, it would have been obvious to one of ordinary skill in the art before the effective filing date to modify the teachings of Kang and Chen by adapting Chang to perform robotics real-time processing efficiently (see ¶[0004] of Chang).
Regarding claim 5, Kang teaches wherein the mapping between the multiple software parts and the multiple software interfaces specifies that a first software part references different software interfaces belonging to different respective software modules (see ¶[0036] “The management resource unit 500 includes an action map 510 providing mapping information including emotion based behavior information, the action expression and a control command, I/O log data 520 including important information created in an input/output process, environment setting data 530 including the environment setting information, and a resource file 540 including an action file and an action script that enable expression and execution of actions in units of files, and a sound file for expressing sound effects and voice information.”).
Regarding claim 6, Kang teaches wherein the mapping between the multiple software parts and the multiple software interfaces specifies that a first software interface can receive commands from different respective software parts (see ¶[0042] “As illustrated in FIG. 3, the data receiver 301 functions to receive the emotion information including the emotion state and the emotion intensity from the emotion engine 200, and the logger 310 performs a logging function of recording the emotion information. Then, the recorded log data 520a includes the emotion information and time stamp information thereof and a hash value for safe storage and management.”).
Regarding claim 7, Kang does not expressly disclose the following; however, Chen teaches wherein the real-time robotics control framework implements a common communications protocol between the multiple software parts and the multiple software interfaces, and wherein the multiple software interfaces use different communications protocols with the respective robotic hardware elements to effectuate received commands (see ¶[0006] “The manipulator controller may include a state machine. The hardware communication layer may determine a communication protocol to use, one or more actuators or sensors to use, and one or more parameters associated with the one or more actuators or sensors. The thread may determine when the hardware communication layer and the manipulator controller run.”).
Hence, it would have been obvious to one of ordinary skill in the art before the effective filing date to modify the teachings of Kang by adapting Chen to allow communication between the graphical user interface and the plurality of controllers (see ¶[0008] of Chen).
Regarding claim 8, it is an independent system claim corresponding with method claim 1 and is rejected for the same reasons. In addition, Chen teaches a system comprising one or more computers and one or more storage devices storing instructions that are operable, when executed by the one or more computers, to cause the one or more computers to perform operations (see ¶[0084]).
Claims 12-14 correspond with claims 5-7 and are rejected for the same reasons.
Regarding claim 21, it is an independent claim corresponding with method claim 1 and is rejected for the same reasons. In addition, Kang teaches one or more non-transitory computer-readable storage media storing instructions that, when executed by one or more computers, cause the one or more computers to perform operations (see ¶[0080]).
Claims 25-26 correspond with method claims 6-7 and are rejected for the same reasons.
Claim(s) 2-4, 9-11, and 22-24 are rejected under 35 U.S.C. 103 as being unpatentable over Kang et al. (U.S. PG PUB 2009/0099693) in view of Chen et al. (U.S. PG PUB 2020/0001471) and Chang et al. (U.S. PG PUB 2022/0066760) as applied to claims 1, 8, and 21, further in view of Dupuis (U.S. PG PUB 2021/0064007).
Regarding claim 2, Kang, Chen, and Chang do not expressly disclose the following; however, Dupuis teaches wherein the custom hardware configuration data is hardware agnostic (see ¶[0048] “In some implementations, the robot interface subsystem 160 provides a hardware-agnostic interface so that the commands 155 issued by onsite execution engine 150 are compatible with multiple different versions of robots”).
Hence, it would have been obvious to one of ordinary skill in the art before the effective filing date to modify the teachings of Kang, Chen, and Chang by adapting Dupuis to provide compatibility with multiple versions of robots for greater flexibility (see ¶[0048] and ¶[0154] of Dupuis).
Regarding claim 3, Kang, Chen, and Chang do not expressly disclose the following; however, Dupuis teaches wherein the same custom hardware configuration data references software module implementations for different models of robots (see ¶[0157] “Instead, the same actions generated during the planning process can actually be executed by different robot models so long as they support the same degrees of freedom and the appropriate control levels have been implemented in the software stack.”).
Hence, it would have been obvious to one of ordinary skill in the art before the effective filing date to modify the teachings of Kang, Chen, and Chang by adapting Dupuis to provide compatibility with multiple versions of robots for greater flexibility (see ¶[0048] and ¶[0154] of Dupuis).
Regarding claim 4, Kang, Chen, and Chang do not expressly disclose the following; however, Dupuis teaches wherein the real-time control code of the real-time control layer is operable to cause the different models of robots to perform a same task (see ¶[0040] “In this specification, a schedule is data that assigns each task to at least one robot. A schedule also specifies, for each robot, a sequence of actions to be performed by the robot.”).
Hence, it would have been obvious to one of ordinary skill in the art before the effective filing date to modify the teachings of Kang, Chen, and Chang by adapting Dupuis to provide compatibility with multiple versions of robots for greater flexibility (see ¶[0048] and ¶[0154] of Dupuis).
Claims 9-11 correspond with claims 2-4 and are rejected for the same reasons.
Claims 22-24 correspond with method claims 2-4 and are rejected for the same reasons.
Response to Arguments
Applicant's arguments filed 03/2/2026 have been fully considered but they are not persuasive.
Applicant’s arguments with respect to claim(s) 1-14, and 21-26 have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.
Support for Amendments and Newly Added Claims
In the event of an amendment to the claims or the submission of new claims, Applicant is respectfully requested to map such claims and their limitations directly to the specification, which provides support for the subject matter. This will assist in expediting compact prosecution. MPEP 714.02 recites: “Applicant should also specifically point out the support for any amendments made to the disclosure. See MPEP § 2163.06. An amendment which does not comply with the provisions of 37 CFR 1.121(b), (c), (d), and (h) may be held not fully responsive. See MPEP § 714.” Amendments that do not point to specific support in the disclosure may be deemed not to comply with the provisions of 37 CFR 1.121(b), (c), (d), and (h) and therefore held not fully responsive. Generic statements such as “Applicants believe no new matter has been introduced” may be deemed insufficient.
Interview Requests
In accordance with 37 CFR 1.133(a)(3), requests for an interview must be made in advance. Interview requests are to be made by telephone (571-270-7848) or email (carina.yun@uspto.gov). Applicant must provide a detailed agenda of what will be discussed (generic statements such as “discuss §102 rejection” or “discuss rejections of claims 1-3” may result in denial of the interview). The detailed agenda, along with any proposed amendments, is to be written on a PTOL-413A or a custom form and should be emailed (subject to MPEP 713.01.I / MPEP 502.03) to the Examiner prior to requesting an interview. Interview requests submitted within amendments may be denied because the Examiner was not notified in advance of the Applicant Initiated Interview Request and, due to time constraints, may not be able to review the interview request prior to the mailing of the next Office action.
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
US PG PUB 2007/0157196 (Goicea) teaches installing a first operating system, including a first hardware abstraction layer and other operating system functions, into a client computer. An image of the other operating system functions and a second hardware abstraction layer is loaded into the client computer. The other operating system functions are functionally interrelated with the second hardware abstraction layer in the image. There is automatic detection that the second hardware abstraction layer loaded into the client computer is incompatible with the client computer. In response, the second hardware abstraction layer is automatically replaced with the first hardware abstraction layer in the client computer. Subsequently, the first operating system, including the first hardware abstraction layer and the other operating system functions, is booted up in the client computer. In one example, the operating system is Windows XP, the first hardware abstraction layer is adapted for an ACPI PIC type client computer, and the second hardware abstraction layer is adapted for an ACPI APIC type client computer.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to CARINA YUN whose telephone number is (571)270-7848. The examiner can normally be reached Mon, Tues, Thurs, 9-4 (EST).
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to call.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Kevin Young can be reached on (571) 270-3180. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
Carina Yun
Patent Examiner
Art Unit 2194
/CARINA YUN/Examiner, Art Unit 2194