DETAILED CORRESPONDENCE
This final Office action is in response to the amendments filed on 02 March 2026 regarding application number 18/130,405.
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
Response to Amendment
Claims 42-61 remain pending in the application, while claims 1-41 have been cancelled.
Response to Arguments
Applicant’s arguments, see page 1, filed 02 March 2026, with respect to the rejections of claims 42-61 under 35 U.S.C. § 103 have been fully considered and are persuasive. Therefore, the rejections have been withdrawn. However, upon further consideration, a new ground(s) of rejection is made in view of the newly cited reference Maisonnier et al. (US 20170125008 A1). See full details below.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.
Claims 42, 45-49, 52-56 and 59-61 are rejected under 35 U.S.C. 103 as being unpatentable over Kumagai (US 20020072408 A1, hereinafter Kumagai), in view of Maisonnier et al. (US 20170125008 A1, hereinafter Maisonnier).
Regarding Claims 42, 49 and 56
Regarding claim 42, Kumagai teaches a computer-implemented method for performing a mood determination of an interactive virtual machine (see all Figs.; [0007]-[0008]), the computer-implemented method comprising:
initializing, by one or more processors, an interactive virtual machine based on a default interactive personality mode corresponding to a default interactive personality profile for the interactive virtual machine (see Fig. 3, all; Fig. 6, all; [0097 "In this game, the setting of one of a plurality of types of temperaments, indicated by A, B, C, D, E, F, G, H, I, and J, as shown in FIG. 6, of a given (selected) dog is changed during the period of the game, for example, every day, according to predetermined conditions. Any one of the temperaments is set when the dog is given,..."] and [0101 "It is now assumed that the current dog's temperament is A..."]);
identifying, by the one or more processors, a stimulus based on modeled environmental data received from a data stream corresponding to one or more actions of an avatar or a user interacting with the interactive virtual machine within a virtual environment (see Fig. 7, all; Figs. 8A-8B, all, especially "praise" and "stroke"; [0101 "When the game player performs a predetermined action, such as “praising” or “stroking”, on the dog, a predetermined numerical value corresponding to the action is added to the index value in accordance with the current dog's temperament ... It is now assumed that the current dog's temperament is A, and that the “praising” action is done."]-[0102 "If the “stroking” action is done, the numerical value +2 is read for the basic parameters a and c from the parameter/temperament table by the numerical-value providing unit 451."]);
comparing, by the one or more processors, the identified stimulus to stimuli stored in a stimuli library, each stimulus being stored in relation to one or more interactive personality modes (see Fig. 7, all; Figs. 8A-8B, all, especially "praise" and "stroke"; Fig. 9, all; [0099], [0101 "When the game player performs a predetermined action, such as “praising” or “stroking”, on the dog, a predetermined numerical value corresponding to the action is added to the index value in accordance with the current dog's temperament, as indicated by the parameter/temperament tables shown in FIGS. 8A and 8B. It is now assumed that the current dog's temperament is A, and that the “praising” action is done. For example, concerning the basic parameter a, the numerical value −2 is read from the parameter/temperament table by the numerical-value providing unit 451. Concerning the basic parameter c, the numerical value +1 is read from the parameter/temperament table by the numerical-value providing unit 451."]-[0102] and [0103]-[0104 "The parameter/temperament tables and the temperament conversion table shown in FIGS. 8A, 8B, and 9 have been stored, together with the calculation expressions, in a storage unit, such as in the recording medium 40."]);
detecting, by the one or more processors, that the identified stimulus matches at least one stimulus of the stimuli library associated with a current interactive personality mode (see Figs. 7-8B, all; [0101 "When the game player performs a predetermined action, such as “praising” or “stroking”, on the dog, a predetermined numerical value corresponding to the action is added to the index value in accordance with the current dog's temperament, as indicated by the parameter/temperament tables shown in FIGS. 8A and 8B."]-[0102]);
updating, by the one or more processors, the interactive virtual machine from the default interactive personality mode to an updated interactive personality mode based on the current interactive personality mode (see Fig. 9, all; [0097]-[0098] and [0103 "Then, based on the two selected basic parameters, the temperament to be set is determined by referring to the temperament conversion table shown in FIG. 9. The selection of the two basic parameters and the determination of the corresponding temperament are performed by the reading unit 453."]-[0104 "By referring to column c and row b of the temperament conversion table shown in FIG. 9, the temperament setting is changed to C if c is a positive value and b is a negative value. If c is a negative value and b is a positive value, the temperament setting is changed to H. "]);
determining, by the one or more processors, a stimulus response for the interactive virtual machine based on the updated interactive personality mode (see Fig. 6, all; [0007 "A behavior-pattern storage unit stores a plurality of behavior patterns which are set in accordance with the temperament of the character. A behavior selection unit selects one of the behavior patterns in accordance with the temperament of the character."]-[0008], [0026 "The character then behaves based on the changed temperament according to various situations in the game space. For example, when the owner returns to the doorway, the character ignores or barks at the owner. Thus, even with the same environments and events surrounding the character, the character behaves differently according to the character's current temperament."], [0086], [0097]-[0098], [0117]-[0119 "For example, in regards to “reacting to the owner who has just returned to the room”, when the owner returns to the doorway, the dog may ignore or bark at the owner, or may show a friendly attitude toward the owner according to the dog's temperament or the cumulative value of the predetermined basic parameter. The relationships between the dog's temperament and the cumulative values of the basic parameters are stored as a table in a storage unit, such as in the recording medium 40."]); and
executing, by the one or more processors, the stimulus response by the interactive virtual machine (see Fig. 6, all; [0007 "A behavior control unit causes the character to behave according to the behavior pattern selected by the behavior selection unit."]-[0008], [0026], [0117]-[0119 "For example, in regards to “reacting to the owner who has just returned to the room”, when the owner returns to the doorway, the dog may ignore or bark at the owner, or may show a friendly attitude toward the owner according to the dog's temperament or the cumulative value of the predetermined basic parameter."]).
Regarding claim 49, Kumagai additionally teaches a computer system for performing a mood determination of an interactive virtual machine (see all Figs.; [0007]-[0008] and [0030]), the computer system comprising:
at least one memory storing instructions (see [0041]); and
at least one processor (see [0041]) configured to execute the instructions to perform operations comprising the above process (as discussed above).
Regarding claim 56, Kumagai additionally teaches a non-transitory computer-readable medium containing instructions that (see all Figs.; [0007]-[0008] and [0041]), when executed by a processor, cause the processor to perform operations for performing a mood determination of an interactive virtual machine, the operations comprising the above process (as discussed above).
Kumagai is silent regarding determining the stimulus response based on user profile data of a user interacting with the interactive virtual machine; and
modifying, by the one or more processors, the stimulus response based on one or more user preferences corresponding to social context data.
Maisonnier teaches a computer-implemented method for performing a mood determination of an interactive machine (see all Figs.; [0006]), the computer-implemented method comprising:
initializing, by one or more processors, an interactive machine based on a default interactive personality mode corresponding to a default interactive personality profile for the interactive machine (see Figs. 1-2, robot 130; [0010] and [0051 "The robot generally uses its default standard voice skin (form) and outputs standard and predefined dialog contents (substance). For example, the robot says the dialog sentence 141."]);
identifying, by the one or more processors, a stimulus based on modeled environmental data received from a data stream corresponding to one or more actions of a user interacting with the interactive machine within an environment (see [0019], [0030], [0051 "Depending on certain parameters (users requests or environmental parameters), the robot can switch to another voice skin and/or to another dialog content, for example 142."]-[0052 "The dialog execution rules 220 are for example influenced or determined by a user request 221 and/or by a perceived environment 222 (for example determined through the sensors or the robot, filtered by extractors or according to described embodiments regarding the logic implemented in the Mind of the robot) ... The two modes of activation (or de-activation) can be combined, i.e. the triggering of a new dialog mode can determined partly by user requests and partly by the environment. For example, upon a user request, environmental parameters can confirm or inhibit a change in dialog mode."] and [0062 "There are several ways to trigger the launch or execution of a dialog mode comprising dialog content and dialog voice skin during a dialog comprising sentences between a human user and a robot. These different ways (in particular described hereinafter) to trigger the launch or execution of one or more dialog modes can be independent and can be further combined with one another."]-[0065]);
updating, by the one or more processors, the interactive machine from the default interactive personality mode to an updated interactive personality mode based on the current interactive personality mode (see [0008 "In a development, the modified dialog mode is obtained by modifying the current dialog content and/or to the current dialog voice skin of the current dialog."]-[0011], [0019], [0030] and [0051 "Depending on certain parameters (users requests or environmental parameters), the robot can switch to another voice skin and/or to another dialog content, for example 142. The robot also can switch back to the initial or default voice. In more details, starting with the default voice skin and dialog content 200 (or from an initial/modified voice skin and/or modified dialog content), dialog execution rules 220 determine if and to what extent the dialog has to be modified."]-[0052]);
determining, by the one or more processors, a stimulus response for the interactive machine based on the updated interactive personality mode and user profile data of a user interacting with the interactive machine (see [0013], [0019], [0023 "Information actively or passively gathered about a user (e.g. user profiling or user declared preferences), can be used as an input for launching conditions (e.g. a voice skin or dialog pattern should only launch if the user loves “Bienvenue chez les Ch'tis”). Mechanisms of machine learning can be performed: voice skins or dialog patterns which are launched or executed by the system will evolve depending on what is learned about the user."] and [0070 "Conditions” and “cross conditions” enable to modify what the robot is going to say as a function of predefined variables (user preferences for example)..."]);
modifying, by the one or more processors, the stimulus response based on one or more user preferences corresponding to social context data (see [0013], [0019 "Specific words also can be filtered depending on users, history, feedbacks, moods, location, date and time (for example). When a person does not understand a sentence, the robot can repeat slowly and/or with synonyms, if asked to do so or at its own initiative. The robot also can learn the preferences of the user (speak more or less quickly with which vocabulary), improving the mood of the user."], [0023 "Information actively or passively gathered about a user (e.g. user profiling or user declared preferences), can be used as an input for launching conditions (e.g. a voice skin or dialog pattern should only launch if the user loves “Bienvenue chez les Ch'tis”)."], [0030], [0047]-[0052 "The dialog execution rules 220 are for example influenced or determined by a user request 221 and/or by a perceived environment 222 ... The two modes of activation (or de-activation) can be combined, i.e. the triggering of a new dialog mode can determined partly by user requests and partly by the environment. For example, upon a user request, environmental parameters can confirm or inhibit a change in dialog mode."] and [0070 "Conditions” and “cross conditions” enable to modify what the robot is going to say as a function of predefined variables (user preferences for example) ... From the age group and the absence of occupation in the evening, the reasoning of the robot can later in the evening infer or propose “do you want to play with me?”."]); and
executing, by the one or more processors, the stimulus response by the interactive machine (see [0019], [0023], [0052 "The two modes of activation (or de-activation) can be combined, i.e. the triggering of a new dialog mode can determined partly by user requests and partly by the environment. For example, upon a user request, environmental parameters can confirm or inhibit a change in dialog mode."] and [0062 "There are several ways to trigger the launch or execution of a dialog mode comprising dialog content and dialog voice skin during a dialog comprising sentences between a human user and a robot. These different ways (in particular described hereinafter) to trigger the launch or execution of one or more dialog modes can be independent and can be further combined with one another."]-[0065]).
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to modify the process/computer system/non-transitory computer-readable medium of Kumagai to further determine the stimulus response based on user profile data of a user interacting with the interactive virtual machine and modify the stimulus response based on one or more user preferences corresponding to social context data, as taught by Maisonnier, in order to personalize and enhance interactions between the user and virtual machine, thus improving the user’s experience.
Regarding Claims 45, 52 and 59
Modified Kumagai teaches the computer-implemented method of claim 42, the computer system of claim 49 and the non-transitory computer-readable medium of claim 56 (as discussed above in claims 42, 49 and 56),
Kumagai further teaches wherein the one or more interactive personality modes include a set of personality trait values (see [0021 "According to the video game machine, the emotion setting unit may set the emotion in accordance with the temperament of the character when the instruction concerning the action is executed."]-[0022 "The aforementioned video game machine may further include an emotion storage unit for storing a numerical value used as an index for each of the emotions."], [0085] and [0108]).
Regarding Claims 46, 53 and 60
Modified Kumagai teaches the computer-implemented method of claim 42, the computer system of claim 49 and the non-transitory computer-readable medium of claim 56 (as discussed above in claims 42, 49 and 56),
Kumagai further teaches wherein the updated interactive personality mode includes a set of quantitative values (see Figs. 8A-8B, all; [0099]-[0101 "It is now assumed that the current dog's temperament is A, and that the “praising” action is done. For example, concerning the basic parameter a, the numerical value −2 is read from the parameter/temperament table by the numerical-value providing unit 451. Concerning the basic parameter c, the numerical value +1 is read from the parameter/temperament table by the numerical-value providing unit 451."] and [0102]);
Regarding Claims 47, 54 and 61
Modified Kumagai teaches the computer-implemented method of claim 42, the computer system of claim 49 and the non-transitory computer-readable medium of claim 56 (as discussed above in claims 42, 49 and 56), comprising:
Kumagai further teaches updating, by the one or more processors, the updated interactive personality mode by updating one or more personality mode values (see Figs. 8A-8B, all; [0099]-[0101 "It is now assumed that the current dog's temperament is A, and that the “praising” action is done. For example, concerning the basic parameter a, the numerical value −2 is read from the parameter/temperament table by the numerical-value providing unit 451. Concerning the basic parameter c, the numerical value +1 is read from the parameter/temperament table by the numerical-value providing unit 451."] and [0104 "If c is a negative value and b is a positive value, the temperament setting is changed to H. In selecting two basic parameters, the unadjusted cumulative values may be used. Alternatively, the unadjusted cumulative values may be used only for specific basic parameters."]);
Regarding Claims 48 and 55
Modified Kumagai teaches the computer-implemented method of claim 42 and the computer system of claim 49 (as discussed above in claims 42 and 49),
Kumagai further teaches the computer-implemented method comprising:
providing, by the one or more processors, an alert corresponding to the updated interactive personality mode to a response determination module (see [0083 "A reading unit 453 reads the temperament corresponding to the addition result obtained by the addition unit 452 from the recording medium 40. In the recording medium 40, the addition results and the temperaments are stored in correspondence with each other. The temperament setting of the character appearing in the game is changed to the temperament read by the reading unit 453."]-[0084 "The reading unit 453 serves as a selection unit for selectively determining the character's temperament according to the addition result obtained by the addition unit 452."]).
Claims 43, 50 and 57 are rejected under 35 U.S.C. 103 as being unpatentable over Kumagai (as modified by Maisonnier) as applied to claims 42, 49 and 56 above, and further in view of Saito (US 20020016128 A1, hereinafter Saito).
Regarding Claims 43, 50 and 57
Modified Kumagai teaches the computer-implemented method of claim 42, the computer system of claim 49 and the non-transitory computer-readable medium of claim 56 (as discussed above in claims 42, 49 and 56),
Kumagai is silent regarding wherein the identifying the stimulus based on the modeled environmental data received from the data stream includes:
receiving, by the one or more processors, a stimulus alert that includes a stimulus identifier; and
determining, by the one or more processors, the stimulus that corresponds to the stimulus identifier.
Saito teaches a computer-implemented method for performing a mood determination of an interactive machine (see all Figs.; [0008]), the computer-implemented method comprising:
initializing, by one or more processors, an interactive machine (see Fig. 1, all; [0008]);
identifying, by the one or more processors, a stimulus based on modeled environmental data received from a data stream corresponding to one or more actions of a user interacting with the interactive machine within an environment (see Fig. 8, all; [0008]);
determining, by the one or more processors, a stimulus response for the interactive machine based on the interactive personality mode (see [0008]-[0013] and [0052 "The reaction behavior select unit 14 determines the reaction behavior pattern to the inputted stimulus by considering the character parameter XY stored in the character state storage unit 13. Concretely, with reference to the reaction behavior pattern tables for every growth stage shown in FIGS. 5 to 7, one of the reaction behavior patterns to a certain stimulus is selected according to the appearance probability to which is prescribed beforehand."]); and
executing, by the one or more processors, the stimulus response by the interactive machine (see [0008]-[0013] and [0052 "Then, the reaction behavior select unit 14 controls the actuators 3 or the speaker 4, and makes the dog type robot 1 behave as if it were taking reaction behavior to the stimulus."]);
wherein the identifying the stimulus based on the modeled environmental data received from the data stream includes:
receiving, by the one or more processors, a stimulus alert that includes a stimulus identifier (see [0043 "The stimulus recognition unit 11 detects the existence of a stimulus from the outside based on the stimulus signal from the stimulus sensors 5, and distinguishes the contents of the stimulus (kinds or stimulus places)."], [0061]-[0063] and [0070]-[0084]); and
determining, by the one or more processors, the stimulus that corresponds to the stimulus identifier (see Fig. 8, all; [0043 "The stimulus recognition unit 11 detects the existence of a stimulus from the outside based on the stimulus signal from the stimulus sensors 5, and distinguishes the contents of the stimulus (kinds or stimulus places)."], [0061]-[0063] and [0070]-[0084]).
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to further modify the process/computer system/non-transitory computer-readable medium of modified Kumagai to receive a stimulus alert that includes a stimulus identifier and determine the stimulus that corresponds to the stimulus identifier, as taught by Saito, in order to provide appropriate reaction behaviors to a variety of stimuli.
Claims 44, 51 and 58 are rejected under 35 U.S.C. 103 as being unpatentable over Kumagai (as modified by Maisonnier) as applied to claims 42, 49 and 56 above, and further in view of Kim (US 20120059781 A1, hereinafter Kim).
Regarding Claims 44, 51 and 58
Modified Kumagai teaches the computer-implemented method of claim 42, the computer system of claim 49 and the non-transitory computer-readable medium of claim 56 (as discussed above in claims 42, 49 and 56), the computer-implemented method comprising:
Kumagai further teaches retrieving, by the one or more processors, one or more stimulus-response pairs from a data store (see Figs. 11A-11B, all; [0106 "In the game used in this embodiment, one of the four emotions, such as “joy”, “anger”, “sadness”, and “happiness”, of a given (selected) dog, is set (determined), as shown in FIG. 10, in accordance with the action performed on the dog by the game player and the dog's temperament when the action has been made. According to the newly set emotion, the dog's action (behavior) is changed, thereby making the game dynamic and highly entertaining."]-[0109]); and
identifying, by the one or more processors, a behavior subset, wherein the behavior subset includes one or more behaviors within a similarity degree of an interactive personality mode corresponding to the current interactive personality mode (see Figs. 11A-11B, all; [0106 "In the game used in this embodiment, one of the four emotions, such as “joy”, “anger”, “sadness”, and “happiness”, of a given (selected) dog, is set (determined), as shown in FIG. 10, in accordance with the action performed on the dog by the game player and the dog's temperament when the action has been made. According to the newly set emotion, the dog's action (behavior) is changed, thereby making the game dynamic and highly entertaining."]-[0109]).
Kumagai is silent regarding scoring, by the one or more processors, each of the one or more stimulus-response pairs based on a scale; and
identifying a behavior subset based on the scoring.
Kim teaches a computer-implemented method for performing a mood determination of an interactive machine (see all Figs.; [0027] and [0042]-[0043]), the computer-implemented method comprising:
initializing, by one or more processors, an interactive machine (see Fig. 1, all; [0048]);
identifying, by the one or more processors, a stimulus based on modeled environmental data received from a data stream corresponding to one or more actions of a user interacting with the interactive machine within an environment (see [0043], [0061]-[0062] and [0068]);
updating, by the one or more processors, the interactive machine from the default interactive personality mode to an updated interactive personality mode based on the current interactive personality mode (see Figs. 5-7, all; [0043], [0064]-[0068] and [0075]-[0076]);
determining, by the one or more processors, a stimulus response for the interactive machine based on the interactive personality mode (see [0027 "In various embodiments, a method of determining a response in a particular artificial personality comprises the steps of: (1) establishing potential responses to a particular stimulus; (2) selecting a subset of potential responses that an artificial personality may reach in response to the particular stimulus; (3) waiting for the particular stimulus to occur; (4) determining whether the particular stimulus has occurred; (5) in response to the particular stimulus occurring, selecting a response from the subset of potential responses to the particular stimulus in a substantially random (e.g., entirely random) manner;..."] and [0043]); and
executing, by the one or more processors, the stimulus response by the interactive machine (see [0027 "...(5) in response to the particular stimulus occurring, selecting a response from the subset of potential responses to the particular stimulus in a substantially random (e.g., entirely random) manner; and (6) performing the response."] and [0043]);
the computer-implemented method comprising:
retrieving, by the one or more processors, one or more stimulus-response pairs from a data store (see Fig. 7, all; [0042]-[0043], [0063]-[0068] and [0076]-[0077]);
scoring, by the one or more processors, each of the one or more stimulus-response pairs based on a scale (see Fig. 7, all; [0043], [0064]-[0068], especially [0043 "Within the light wave representing the personality trait, differences in amplitude or color may be used to represent different responses to particular stimuli."], [0068 "As may be understood from FIG. 7, different amplitudes of a waveform may correspond to different potential responses to a particular stimulus."]); and
identifying, by the one or more processors, a behavior subset based on the scoring, wherein the behavior subset includes one or more behaviors within a similarity degree of an interactive personality mode corresponding to the current interactive personality mode (see [0043], [0068], especially [0043 "When a particular stimulus is received, a response may be selected based on the configuration of the light wave at the time the stimulus is received."], [0068 "For example, in the happiness waveforms of FIG. 7: (1) Amplitude A may correspond to a potential response including laughter; (2) Amplitude B may correspond to a potential response including a slight smile; and (3) Amplitude C may correspond to a potential response including crying."]).
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to further modify the process/computer system/non-transitory computer-readable medium of modified Kumagai to include instructions for scoring each of the one or more stimulus-response pairs based on a scale and identifying a behavior subset including one or more behaviors within a similarity degree of an interactive personality mode corresponding to the current interactive personality mode, based on the scoring, as taught by Kim, in order to provide a realistic personality mode which responds appropriately to stimuli according to personality trait values.
Conclusion
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to TANNER LUKE CULLEN whose telephone number is (303)297-4384. The examiner can normally be reached Monday-Friday 9:00-5:00 MT.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Khoi Tran can be reached at (571) 272-6919. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/TANNER L CULLEN/
Examiner, Art Unit 3656

/KHOI H TRAN/
Supervisory Patent Examiner, Art Unit 3656