Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
DETAILED ACTION
The examiner cites particular columns, paragraphs, and line numbers in the references as applied to the claims below for the convenience of the applicant. Although the specified citations are representative of the teachings in the art and are applied to the specific limitations within the individual claims, other passages and figures may apply as well. It is respectfully requested that, in preparing responses, the applicant fully consider the references in their entirety as potentially teaching all or part of the claimed invention, as well as the context of the passage as taught by the prior art or disclosed by the examiner.
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
In reply to the Non-Final Office Action mailed on 9/17/2025, the Applicant filed a response on 12/4/2025 amending claims 1 and 9. Claim 4 has been cancelled. No claims have been added. Claims 1-3 and 5-10 are pending in this application.
The previous claim objection is withdrawn in view of the Applicant's amendments filed on 12/4/2025.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-3, 5-7 and 9-10 are rejected under 35 U.S.C. 103 as being unpatentable over Jakubiak et al. (US 2014/0298271), in view of Hauenstein et al. (US 10,318,034), and further in view of Takamatsu et al. (US 2017/0158056).
Regarding claim 1, Jakubiak discloses a projection apparatus for projecting a projection image onto a projection target (see 100 in Figs. 1, 3A-3I and 6A-6B), the projection apparatus comprising:
a first button and a second button that operate the projection apparatus, the first button functioning as a first operator, the second button functioning as a second operator (see e.g. buttons corresponding to “plurality of function keys 311, 312, 313, and 314” or other operable areas presented in touch screen 190, as shown in Figs. 3B-3C; para[0071]);
a sensor that detects a first distance from a pointer to the first operator and a second distance from the pointer to the second operator (see e.g. in Fig. 1 “sensor module 170 may include a proximity sensor for detecting user's proximity to the electronic device 100” and “touch screen 190 may sense (e.g., detect) at least one touch through a user's body (e.g., a finger including a thumb) or a touch-possible input means (e.g., a stylus pen)”, and “may sense (e.g., detect) continuous movement of one of the at least one touch”; “the touch may include a contactless (e.g., a detectable distance between the touch screen 190 and the user's body or the touch-possible input means) touch as well as a contact between the touch screen 190 and the user's body or the touch-possible input means”; thus, the distance between, e.g., a finger and any of the function keys 311, 312, 313, and 314, or other operable areas presented in the touch screen 190, as shown in Figs. 3C-3I, is detected by the touch screen 190; “The touch screen 190 may include, for example, a first touch panel 190a and a second touch panel 190b”; “The first touch panel 190a may measure a touch or approach of a part of the user's body”; “the first touch panel 190a may be of a resistive type, a capacitive type, an infrared type, an acoustic wave type, or the like”; “the second touch panel 190b may be of an Electromagnetic Resonance (EMR) measurement type”; para[0057]; para[0062]-para[0066]);
a projection output section of a projector that projects a first button image representing the first operator and a second button image representing the second operator, as the projection image, the projection output section functioning as a projection mechanism, the first button image functioning as a first operation image, the second button image functioning as a second operation image (para[0060]; para[0073]; see 177 in Figs. 1 and 3C-3H; “A projector module 177 projects and displays a rendered image”; “For example, the projector module 177 may include a light source for emitting light to be used in projection, a light-modulator for modulating light incident from the light source according to an image signal, and a lens unit for projecting the light incident from the light-modulator onto a screen”; “The projector module 177 generates and projects a projection image 320 which is the same as the application execution screen 310”; “The projection image 320 includes first through fourth projection function keys 321 through 324 corresponding to the first through fourth function keys 311 through 314”, that is, button images corresponding to the buttons in touch screen 190); and
one or a plurality of processors that control display of the first operation image and the second operation image (see controller 110 including Central Processing Unit (CPU) 111, which may include various numbers of processor cores (e.g., the CPU may include a single-core, dual-core, triple-core, or quad-core processor); “as illustrated in FIG. 3C, the controller 110 controls the projector module 177 to project the same image as the application execution screen 310 displayed on the touch screen 190”; “The projector module 177 generates and projects a projection image 320 which is the same as the application execution screen 310”; “The projection image 320 includes first through fourth projection function keys 321 through 324 corresponding to the first through fourth function keys 311 through 314”; para[0040]; para[0073]; para[0144]-para[0146]).
However, Jakubiak does not appear to expressly disclose that the one or plurality of processors cause a highlighting of the first operation image to be different from a highlighting of the second operation image when it is determined that the first distance is shorter than the second distance, based on an output from the sensor, the highlighting functioning as a display mode, and that, when the pointer becomes closer to the first operator, the one or plurality of processors shortens a blinking cycle of the first operation image, and the blinking cycle of the first operation image becomes shorter as the pointer becomes closer to the first operator.
Hauenstein discloses one or plurality of processors cause a highlighting of a first operation image to be different from a highlighting of a second operation image when it is determined that a first distance from a pointer to the first operator is shorter than a second distance from the pointer to a second operator, based on an output from a sensor, the highlighting functioning as a display mode, and when the pointer becomes closer to the first operator, the one or plurality of processors changes the first operation image, and the change of the first operation image is executed as the pointer becomes closer to the first operator (column 11, line 60 to column 12, line 16; column 14, lines 26-54; column 37, lines 61-64; column 43, lines 41-46 and 60-67; column 44, lines 6-28; column 44, line 50 to column 45, line 3; column 45, lines 32-45; regarding Figs. 1A, 2, 6B, 7A, 7D-7F and 7J-7K, “Display controller 156 receives and/or sends electrical signals from/to touch-sensitive display system 112”; “Touch-sensitive display system 112 displays visual output to the user”, and “The visual output optionally includes graphics, text, icons, video, and any combination thereof (collectively termed “graphics”)”, and “some or all of the visual output corresponds to user interface objects” that “include, without limitation, a button, slider, icon, selectable menu item, switch, hyperlink, or other user interface control”; in Fig. 7A “the device detects inputs on a touch-sensitive surface 651 that is separate from the display 650, as shown in FIG. 6B”, e.g. a tablet; “FIGS. 7A-7U illustrate a process for interacting with a user interface object (e.g., including picking up, moving, and dropping off the user interface object) through proximity-based inputs and contact-based inputs by an input object (e.g., a finger or a stylus, such as stylus 203)”; “The input parameters of an input provided by the input object (and detected by the electronic device (e.g., device 100)) include a hover proximity parameter that is (or is calculated based on) a distance (e.g., distance 514) between a predetermined portion of the input object (e.g., the tip of the finger, or the tip of the stylus) and a touch-sensitive surface (e.g., touch-screen 112 or touch-sensitive surface 651)”; “the input parameters include the lateral position (e.g., (x, y) position 504) of the input object (e.g., finger or stylus 203) relative to the touch-sensitive surface”, and “the three-dimensional positional state and movement of the input object, e.g., as described with respect to FIGS. 5A and 5B”; in Fig. 7A “User interface 702 includes a number of user interface objects (e.g., represented by a row and a column of circles), including user interface object 704 (e.g., an application launch icon, a button, etc.)”; regarding Figs. 7B-7C, “In response to detecting stylus 203 hovering above a location on the touch-sensitive surface that corresponds to the location of user interface object 704, the device changes the appearance of user interface object 704”, and since this would correspondingly happen when hovering right above and closer to a location at the touch-sensitive surface 651 corresponding to each object 704, it is clear that the controller 156 determines the distance between pointer 203 (stylus or finger) and each location at the touch-sensitive surface 651 respectively corresponding to each object 704, in order to change the appearance of the object 704 corresponding to the shortest distance between the pointer 203 and corresponding location on the touch-sensitive surface 651, to make that object 704 distinguishable; in addition, “FIGS. 7D-7F illustrate that, when stylus 203 is moved vertically up and down within the hover range, the device dynamically changes the appearance of user interface object 704 in accordance with the current hover distance of stylus 203”, which would also happen when stylus 203 is closer to another location at the touch-sensitive surface 651 corresponding to another object 704; in addition, as shown in Figs. 7J-7K, “the device also changes the appearance of user interface object 704 (e.g., further enlarges user interface object 704 as compared to the size before stylus 203 made contact with the touch-sensitive surface, and highlighting user interface object 704) to indicate that user interface object is now being manipulated by a touch input, as opposed to a hover input”; the different highlightings would happen as the stylus 203 is closer to each respective location at the touch-sensitive surface 651 corresponding to each object 704, in order to differentiate the display mode of the object 704 with the shortest distance to the pointer 203 from the display mode of the rest of the objects 704).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention, to combine the teachings of Jakubiak with the teachings of Hauenstein, to have the one or plurality of processors cause a highlighting of the first operation image to be different from a highlighting of the second operation image when it is determined that the first distance is shorter than the second distance, based on an output from the sensor, the highlighting functioning as a display mode, and, when the pointer becomes closer to the first operator, to have the one or plurality of processors change the first operation image, with the change of the first operation image being executed as the pointer becomes closer to the first operator, for the advantage of facilitating user interface interactions through proximity-based and contact-based inputs, while reducing the number, extent, and/or nature of inputs from a user and producing a more efficient human-machine interface (column 1, lines 20-24; column 2, lines 5-8).
However, Jakubiak and Hauenstein do not appear to expressly disclose that the change of the first operation image comprises the one or plurality of processors shortening a blinking cycle of the first operation image, and that the blinking cycle of the first operation image becomes shorter as the pointer becomes closer to the first operator.
Takamatsu discloses changing an operation image by shortening a blinking cycle of the operation image (para[0165]; “an operation may be made that… the blinking cycle of a speed-indicating image projected onto the windshield 110 is shortened”).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention, to combine the teachings of the Jakubiak and Hauenstein combination with the teachings of Takamatsu, to have the one or plurality of processors shorten a blinking cycle of the first operation image when the pointer becomes closer to the first operator, with the blinking cycle of the first operation image becoming shorter as the pointer becomes closer to the first operator, as a result of the combination, for the advantage of increasing the degree of feedback given to the user (para[0165]).
Regarding claim 2, Jakubiak, Hauenstein and Takamatsu disclose all the claim limitations as applied above (see claim 1). In addition, Jakubiak discloses the one or plurality of processors cause the projection mechanism to project a pointer image representing the pointer, as the projection image, based on an output from the sensor (para[0062]; para[0084]-para[0085]; “The touch screen 190 may sense (e.g., detect) continuous movement of one of the at least one touch”; Referring to Figs. 3G-3H, “The controller 110 moves and displays the pointer 361 to correspond to the leftward continuous touch gesture 371”, “the controller 110 controls the touch screen 190 to move the pointer 361 to the left and displays the pointer 361’ to correspond to the leftward continuous touch gesture 371”, and “The controller 110 controls the projector module 177 to move the projected pointer 362 to the left and displays the projected pointer 362”).
Regarding claim 3, Jakubiak, Hauenstein and Takamatsu disclose all the claim limitations as applied above (see claim 2). In addition, Jakubiak discloses the one or plurality of processors move the pointer image in accordance with a movement of the pointer, based on an output from the sensor (para[0062]; para[0084]-para[0085]; “The touch screen 190 may sense (e.g., detect) continuous movement of one of the at least one touch”; Referring to Figs. 3G-3H, “The controller 110 moves and displays the pointer 361 to correspond to the leftward continuous touch gesture 371”, “the controller 110 controls the touch screen 190 to move the pointer 361 to the left and displays the pointer 361’ to correspond to the leftward continuous touch gesture 371”, and “The controller 110 controls the projector module 177 to move the projected pointer 362 to the left and displays the projected pointer 362”).
Regarding claim 5, Jakubiak, Hauenstein and Takamatsu disclose all the claim limitations as applied above (see claim 1). In addition, Hauenstein discloses the one or plurality of processors change the display mode of the first operation image with time according to a direction of movement of the pointer, based on an output from the sensor (column 11, line 60 to column 12, line 16; column 14, lines 26-54; column 37, lines 61-64; column 41, lines 36-39; column 43, lines 41-46 and 60-67; column 44, lines 6-28; column 44, line 50 to column 45, line 3; column 45, lines 32-45; regarding Figs. 1A, 2, 6B, 7A, 7D-7F and 7J, “Display controller 156 receives and/or sends electrical signals from/to touch-sensitive display system 112”; “the response of the device to inputs detected by the device depends on criteria that include… time-based criteria”; “The input parameters of an input provided by the input object (and detected by the electronic device (e.g., device 100)) include a hover proximity parameter that is (or is calculated based on) a distance (e.g., distance 514) between a predetermined portion of the input object (e.g., the tip of the finger, or the tip of the stylus) and a touch-sensitive surface (e.g., touch-screen 112 or touch-sensitive surface 651)”; regarding Figs. 7B-7C, “In response to detecting stylus 203 hovering above a location on the touch-sensitive surface that corresponds to the location of user interface object 704, the device changes the appearance of user interface object 704 (e.g., enlarges user interface object 704 slightly)”, that is, according to motion of the pointer 203 towards the location on the touch-sensitive surface that corresponds to the location of user interface object 704 over time, based on an output from the touch-sensitive surface/tablet, thus, according to a direction of the movement of the pointer 203 over time; in addition, “FIGS. 7D-7F illustrate that, when stylus 203 is moved vertically up and down within the hover range, the device dynamically changes the appearance of user interface object 704 in accordance with the current hover distance of stylus 203” over time; in addition, as shown in Fig. 7J, “the device also changes the appearance of user interface object 704 (e.g., further enlarges user interface object 704 as compared to the size before stylus 203 made contact with the touch-sensitive surface, and highlighting user interface object 704) to indicate that user interface object is now being manipulated by a touch input, as opposed to a hover input”, thus, according to a direction of the movement of the pointer 203 in order to make contact with the touch-sensitive surface).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention, to have the one or plurality of processors change the display mode of the first operation image with time according to a direction of movement of the pointer, based on an output from the sensor, as also taught by Hauenstein, for the advantage of further facilitating user interface interactions through proximity-based and contact-based inputs and producing a more efficient human-machine interface (column 1, lines 20-24; column 2, lines 5-8).
Regarding claim 6, Jakubiak, Hauenstein and Takamatsu disclose all the claim limitations as applied above (see claim 1). In addition, Jakubiak discloses the one or plurality of processors accept a change operation for changing an arrangement of the first operation image and the second operation image, and change positions of the first operation image and the second operation image in relation to the projection target in response to the change operation (para[0074]; see Fig. 10; “Referring to Fig. 10, projector setting screen 380 includes… a rotate projection menu 387”; “the rotate projection menu 387 is for selecting landscape or portrait projection mode”).
Regarding claim 7, Jakubiak, Hauenstein and Takamatsu disclose all the claim limitations as applied above (see claim 1). In addition, Jakubiak discloses a main body in which the first operator and the second operator are arranged (see body of electronic device 100 in which function keys 311 through 314 are arranged, as shown in Fig. 3C), wherein a positional relationship between the first operation image and the second operation image is the same as a positional relationship between the first operator and the second operator in the main body (para[0060]; para[0073]; “The projector module 177 generates and projects a projection image 320 which is the same as the application execution screen 310”; “The projection image 320 includes first through fourth projection function keys 321 through 324 corresponding to the first through fourth function keys 311 through 314”; as shown in Fig. 3C, a positional relationship between function keys 321 through 324 in the projection image 320 is the same as a positional relationship between the first through fourth function keys 311 through 314 in electronic device 100).
Regarding claim 9, it is analogous to claim 1, except it is a method claim (see e.g. para[0003] of Jakubiak, and column 2, line 61 to column 3, line 24 of Hauenstein), and therefore, it is rejected for the same reasons as claim 1 above.
Regarding claim 10, Jakubiak, Hauenstein and Takamatsu disclose all the claim limitations as applied above (see claim 1). In addition, Hauenstein discloses the one or plurality of processors change the display mode of the first operation image with time according to the first distance, based on an output from the sensor (column 11, line 60 to column 12, line 16; column 14, lines 26-54; column 37, lines 61-64; column 41, lines 36-39; column 43, lines 41-46 and 60-67; column 44, lines 6-28; column 44, line 50 to column 45, line 3; column 45, lines 32-45; regarding Figs. 1A, 2, 6B, 7A, 7D-7F and 7J, “Display controller 156 receives and/or sends electrical signals from/to touch-sensitive display system 112”; “the response of the device to inputs detected by the device depends on criteria that include… time-based criteria”; “The input parameters of an input provided by the input object (and detected by the electronic device (e.g., device 100)) include a hover proximity parameter that is (or is calculated based on) a distance (e.g., distance 514) between a predetermined portion of the input object (e.g., the tip of the finger, or the tip of the stylus) and a touch-sensitive surface (e.g., touch-screen 112 or touch-sensitive surface 651)”; regarding Figs. 7B-7C, “In response to detecting stylus 203 hovering above a location on the touch-sensitive surface that corresponds to the location of user interface object 704, the device changes the appearance of user interface object 704 (e.g., enlarges user interface object 704 slightly)”, that is, according to motion of the pointer 203 towards the location on the touch-sensitive surface that corresponds to the location of user interface object 704 over time, based on an output from the touch-sensitive surface/tablet, thus, according to the distance between the pointer 203 and object 704 over time; in addition, “FIGS. 7D-7F illustrate that, when stylus 203 is moved vertically up and down within the hover range, the device dynamically changes the appearance of user interface object 704 in accordance with the current hover distance of stylus 203” over time; in addition, as shown in Fig. 7J, “the device also changes the appearance of user interface object 704 (e.g., further enlarges user interface object 704 as compared to the size before stylus 203 made contact with the touch-sensitive surface, and highlighting user interface object 704) to indicate that user interface object is now being manipulated by a touch input, as opposed to a hover input”, thus, according to the distance between the pointer 203 and object 704).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention, to have the one or plurality of processors change the display mode of the first operation image with time according to the first distance, based on an output from the sensor, as also taught by Hauenstein, for the advantage of further facilitating user interface interactions through proximity-based and contact-based inputs and producing a more efficient human-machine interface (column 1, lines 20-24; column 2, lines 5-8).
Claim 8 is rejected under 35 U.S.C. 103 as being unpatentable over Jakubiak et al. (US 2014/0298271), in view of Hauenstein et al. (US 10,318,034), and Takamatsu et al. (US 2017/0158056), as applied to claim 1 above, and further in view of Missig et al. (US 2014/0068520).
Regarding claim 8, Jakubiak, Hauenstein and Takamatsu disclose all the claim limitations as applied above (see claim 1). However, Jakubiak, Hauenstein and Takamatsu do not appear to expressly disclose that the one or plurality of processors cause the projection mechanism to project a plurality of menu items as the projection image, accept a change operation of changing a level of the menu items, and switch the first operation image and the second operation image between display and non-display, according to the level.
Missig discloses one or plurality of processors cause a projection mechanism to project a plurality of menu items as a projection image (para[0018]; para[0020]-para[0021]; para[0023]; para[0029]; para[0036]; para[0048]-para[0050]; para[0054]; see Figs. 1-2 and 5; “System environment 100 includes a touchscreen device 102 communicatively coupled to a display device 104”; “Touchscreen device 102 can be an electronic device that is capable of sending and receiving content, and displaying a visual representation of the content on a display of the device”; “touchscreen device 102 and display device 104 can simultaneously display… interfaces… on each of their display devices”, and “user interfaces generated can be mirror images of each other or partial-mirror images”; “In response to receiving user input via the user interface on touchscreen device 102, the application in some embodiments can modify the content of the user interface being displayed on display device 104”; “Portable electronic device 102 and electronic device 104 of FIG. 1 can include similar components as those shown in computer system 200”; “Display 220 can display images generated by electronic device 200 and can include various image generation technologies, e.g., a… projection system”; “an application executing on one device can drive the displays of both a touchscreen device and a display device while receiving user input from the touchscreen device”; “Fig. 5 illustrates an example of a process 500 performed by a touchscreen device (e.g., touchscreen device 102) for interacting with a display device (e.g., display device 104) having a non-touch display”; “the touch-enabled display may present a user interface including a list of selectable user interface items” which can be mirrored in the non-touch display), accept a change operation of changing a level of the menu items (para[0051]-para[0052]; para[0054]; para[0075]; regarding Figs. 1-2, 5 and 12, “Process 500 can receive (at block 506) a selection of one of the first set of information items from the user via the touch-enabled display of the touchscreen device”; “Process 500 can enable (at block 508) a second set of information items associated with the selected information item to be presented on the display of the second electronic device while the first set of information items is presented on the touch-enabled display of the first electronic device”; “the second set of information items represents an additional level of information (e.g., a higher level of information, a lower level of information) related to the selected information item”; see e.g. Fig. 12), and switch a first operation image and a second operation image between display and non-display, according to the level (para[0018]; para[0020]-para[0021]; para[0023]; para[0029]; para[0036]; para[0048]-para[0052]; para[0054]; para[0075]; since “the application may generate another user interface that is a mirror image of the user interface displayed on the touch-enabled display of the first electronic device”, and “the touch-enabled display may present a user interface including a list of selectable user interface items” (e.g. claimed first operation image and second operation image), “In response to receiving a selection…, the application can cause display device 104 to present a lower level of information”, thus switching the list of selectable user interface items between display and non-display according to the level; see e.g. Fig. 12 in which the list 1208 is switched to not being displayed on the display device 1204 when item 1205 is selected from the list 1208 presented on touchscreen device 1202).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention, to combine the teachings of the Jakubiak, Hauenstein and Takamatsu combination with the teachings of Missig, to have the one or plurality of processors cause the projection mechanism to project a plurality of menu items as the projection image, accept a change operation of changing a level of the menu items, and switch the first operation image and the second operation image between display and non-display, according to the level, for the advantage of presenting content to the user in an intuitive manner that provides better use of screen space across available displays (para[0036]).
Response to Arguments
Applicant's arguments filed on 12/4/2025 have been fully considered but they are not persuasive.
Regarding claims 1 and 9, the applicant argues on page 5 of the remarks that “Takamatsu does not disclose shortening blinking cycle in association with ‘as the pointer becomes closer to the first operator’”, because “Takamatsu does not disclose shortening blinking cycle in association with distance”.
In response to applicant's arguments against the references individually, one cannot show nonobviousness by attacking references individually where the rejections are based on combinations of references. See In re Keller, 642 F.2d 413, 208 USPQ 871 (CCPA 1981); In re Merck & Co., 800 F.2d 1091, 231 USPQ 375 (Fed. Cir. 1986). As shown in the above rejection, Takamatsu is used in the combination as disclosing changing an operation image by shortening a blinking cycle of the operation image.
Conclusion
THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Inquiries
Any inquiry concerning this communication or earlier communications from the examiner should be directed to GLORYVID FIGUEROA-GIBSON whose telephone number is (571)272-5506. The examiner can normally be reached 9am-5pm, Monday-Friday, Eastern Time.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Chanh Nguyen, can be reached at 571-272-7772. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/GLORYVID FIGUEROA-GIBSON/Patent Examiner, Art Unit 2623
/CHANH D NGUYEN/Supervisory Patent Examiner, Art Unit 2623