DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Priority
Receipt is acknowledged of the claim of priority to foreign application JP2022-099048, with a priority date of June 20, 2022. Copies of the certified papers required by 37 CFR 1.55 have been received.
Claim Status
This action is in response to the application filed on December 03, 2025. Claims 1-2, 4-6, 9-11, and 16-17 are amended. Thus, claims 1-17 are pending for examination in this application.
Response to Amendments
Applicant's remarks and amendments filed December 03, 2025 have been entered.
Applicant’s arguments regarding the 35 U.S.C. 112(f) interpretations previously set forth in the Non-Final Office Action mailed September 16, 2025, are persuasive. Accordingly, the 35 U.S.C. 112(f) interpretations are withdrawn.
Applicant’s arguments regarding the 35 U.S.C. 112(a) and 112(b) rejections previously set forth in the Non-Final Office Action mailed September 16, 2025, are persuasive. Accordingly, the 35 U.S.C. 112(a) and 112(b) rejections are withdrawn.
Applicant’s arguments regarding the 35 U.S.C. 101 rejections previously set forth in the Non-Final Office Action mailed September 16, 2025, are not persuasive. Accordingly, the 35 U.S.C. 101 rejections are maintained.
Response to Arguments
Argument: On pages 12-13, the applicant alleges, “However, nowhere does Kikuchi teach or suggest generating "detection result information including the defect information to which the identification information has been given" and generating "image shape information representing an outline of the image from which the defect information was detected," as recited by amended independent claim 1.”
Response: The examiner respectfully disagrees.
Kikuchi was relied on to teach generating “image shape information representing an outline of the image from which the defect information was detected.” Kikuchi, Paragraph [0107], teaches “when the crack image in the damage data DD6 is traced with a finger or a pen, the trace is added to the damage drawing data as damage data. Here, the shape and position of the crack image in the damage data DD6 in the CAD drawing 80 correspond to the actual shape and position of the crack when the bridge 1 is viewed from above (the opposite side from the photographed side), and therefore can be input easily and accurately.” The damage data is traced, and the shape and position of the crack image correspond to the actual shape and position of the crack, which is considered to be image shape information representing an outline of the image. Horita’828 was relied on merely to teach generating “detection result information including the defect information to which the identification information has been given.” Horita’828, Paragraph [0111], teaches “In the confirmation of the inspection result and the creation of the report regarding the inspection result, it is premised that the damage identification information 60 and the captured image identification information 61 target the same damage 77.”
Thus, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to implement the teachings of Horita’828 into Kikuchi. Utilizing the means of Horita’828 to confirm the inspection result and create a report that includes the damage identification information, within the damage editing device of Kikuchi in which a trace of damage data indicating the shape and position of the crack is added to the damage drawing data, would have been an obvious incorporation of the identification information with more specific data such as the trace and shape data of Kikuchi. Further, providing the ability to confirm the inspection result with more specific data can be used to repair structures and to recognize the state of damage progression at the time of the next regular inspection.
Therefore, the combination of references teaches the limitation.
Specification
The title of the invention is not descriptive. A new title is required that is clearly indicative of the invention to which the claims are directed.
Claim Rejections - 35 USC § 112
The following is a quotation of the first paragraph of 35 U.S.C. 112(a):
(a) IN GENERAL.—The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor or joint inventor of carrying out the invention.
The following is a quotation of the first paragraph of pre-AIA 35 U.S.C. 112:
The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor of carrying out his invention.
Claims 1-17 are rejected under 35 U.S.C. 112(a) or 35 U.S.C. 112 (pre-AIA), first paragraph, as failing to comply with the written description requirement. Independent claim 1 contains the limitation "an outline of the image." This subject matter was not described in the specification in such a way as to reasonably convey to one skilled in the relevant art that the inventor or a joint inventor, or for applications subject to pre-AIA 35 U.S.C. 112, the inventor(s), at the time the application was filed, had possession of the claimed invention. The examiner has read through the specification, and there is no support for the limitation "an outline of the image." The dependent claims do not alleviate the issues of the independent claim and are also rejected under 35 U.S.C. 112(a).
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA ), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claims 1-17 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention.
Independent claim 1 includes the limitation "and image shape information representing an outline of the image from which the defect information was detected." The limitation is interpreted as a border or line around the image associated with the image shape information. It is unclear, given the current limitations, where the image shape information representing “an outline” is determined. The dependent claims do not alleviate the issues of the independent claim and are also rejected under 35 U.S.C. 112(b).
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1-17 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.
Regarding claims 1, 16, and 17, these claims recite the following limitations which are found to be abstract ideas not reciting a practical application or significantly more, with claim 1 being exemplary:
detect defect information based on an image inputted (abstract idea as a mental process as a human mind is capable of observing and evaluating (detecting) defect information on an image);
set identification information that is unique within the predetermined group to the defect information detected (abstract idea as a mental process as a human mind is capable of setting unique identification information with identified defects in a predetermined group);
register, update, and store the defect information detected from the image inputted (abstract idea as a mental process as a human mind is capable of registering, updating, and storing defect information detected from the image);
generate detection result information including the defect information to which the identification information has been given and image shape information representing an outline of the image from which the defect information was detected (insignificant pre/post-solution extra activity);
This judicial exception is not integrated into a practical application for the following reasons. Claims 1, 16, and 17 all recite the additional element of “and output the detection result information”; however, this limitation also recites an abstract idea as a mental process, as a human mind is capable of outputting a defect detection result.
Claim 17 further recites the additional element of “a non-transitory computer-readable storage medium.” While this limitation is an additional element, it is not sufficient to recite a practical application of the abstract ideas recited in claim 17, as it amounts to mere generic computer elements and thus amounts to no more than a recitation of the words “apply it” (or an equivalent), or to mere instructions to implement an abstract idea or other exception on a computer. See MPEP 2106.05(f).
Further, the claims do not include additional elements that are sufficient to amount to significantly more than the judicial exception because, when considered separately and in combination, the additional element recited above from claim 17 does not add significantly more (also known as an “inventive concept”) to the exception. Rather, the additional elements disclosed above perform well-understood, routine, and conventional computer functions.
Therefore, independent claims 1, 16, and 17 are directed towards an abstract idea without a practical application or significantly more.
Regarding claims 2 and 4, the limitations are merely directed towards abstract ideas as a mental process, as the human mind is capable of adding image information to arrange the defect information in the image and of giving unique identification information to the outer shape information of the image, without reciting a practical application or significantly more.
Regarding claims 3, 5-12, and 14-15, the limitations are merely directed towards insignificant pre/post-solution extra activity that nonetheless does not integrate the abstract idea recited in claim 1 into a practical application.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-12 and 16-17 are rejected under 35 U.S.C. 103 as being unpatentable over Kikuchi et al., WO 2019021719 ("Kikuchi"), in view of Horita et al., US 20230112828 ("Horita’828").
Regarding claim 1, Kikuchi teaches an information processing apparatus comprising:
a processor; and a memory, including instructions stored thereon, which when executed by the processor cause the apparatus to (see Kikuchi, Paragraphs [0047-0048], “CPU 30 is a CPU (central processing unit) 30 that executes programs and is composed of one or more CPUs. The storage unit 50 stores a program and information required to execute the program. The storage unit 50 is configured by a memory device.”):
input one or more images belonging to a predetermined group (see Kikuchi, Paragraph [0083], “The "input" of the inspection result information may be a "selective input" in which a selection is made from predetermined candidates,” the selection input is considered to be the input unit; predetermined candidates are considered to be belonging to a predetermined group);
detect defect information based on an image inputted by the input unit (see Kikuchi, Paragraph [0040], “The photographed images acquired by the photographing device 10 are input to the damage data editing device 20,” and “to detect damage appearing in the photographed image, and generates damage data based on the detection results (step S11: damage data generation step),” the damage data editing device 20 is considered to be the detection unit);
register, update, and store the defect information detected from the image inputted (see Kikuchi, Paragraph [0083], “The damage diagram data creation unit 42 adds the input inspection result information to the damage diagram data. For example, when the type of damage and the evaluation category of the degree of damage (also called "rank information") are input via the operation unit 28, the damage diagram data creation unit 42 adds them to the damage diagram data,” the type of damage and evaluation category (“rank information”) are input inspection result information and is considered to be registered, updated, and stored when added to the damage diagram data);
and image shape information representing an outline of the image from which the defect information was detected (see Kikuchi, Paragraph [0107], “when the crack image in the damage data DD6 is traced with a finger or a pen, the trace is added to the damage drawing data as damage data. Here, the shape and position of the crack image in the damage data DD6 in the CAD drawing 80 correspond to the actual shape and position of the crack when the bridge 1 is viewed from above (the opposite side from the photographed side), and therefore can be input easily and accurately,” the damage data is traced and the shape and position of the crack image correspond to the actual shape and position of the crack which is considered to be image shape information representing an outline of the image);
and output the detection result information (see Kikuchi, Paragraph [0085], “The output unit 44 outputs the mirrored and projectively transformed damage data. Furthermore, when the damage drawing data creation unit 42 creates the damage drawing data, the output unit 44 outputs the damage drawing data,” output unit 44 is considered to be an output unit).
Kikuchi does not expressly teach
set identification information that is unique within the predetermined group to the defect information detected by the detection unit;
generate detection result information including the defect information to which the identification information has been given
However, Horita’828, in a similar invention in the same field of endeavor, teaches
set identification information that is unique within the predetermined group to the defect information detected by the detection unit (see Horita’828, Paragraph [0071], ‘The captured image identification information 61 is a file name or a part of the file name (hereinafter referred to as “file name”) of the captured image 103 obtained by imaging the damage 77, and includes, for example, a number, a character or a symbol, or characters that are a combination thereof. The captured image identification information 61 includes captured image identification information 61A, 61B, 61C, 61D, 61E, and 61F. For example, the captured image identification information 61A includes a character string of “1510” and indicates a part of the file name “DSCF1510.jpg”. Similarly, the captured image identification information 61B, 61C, 61D, 61E, and 61F include “1511”, “1512”, “1513”, “1514”, and “1515”, respectively”);
generate detection result information including the defect information to which the identification information has been given (see Horita’828, Paragraph [0111], “In the confirmation of the inspection result and the creation of the report regarding the inspection result, it is premised that the damage identification information 60 and the captured image identification information 61 target the same damage 77”)
Kikuchi and Horita’828 are analogous art because they are both in the same field of endeavor of inspecting damage to a structure. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate, from the inspection support device of Horita’828 into the damage data editing device of Kikuchi: captured image identification information that includes a character string indicating part of the file name; a pattern applied to the damage identification information that is determined by a user; a display of the association between the damage identification information and the captured image; the specification of the relative position of the damage identification information with the structural drawing as a reference; and the use of the identification information with more specific data, so that the captured image and the damage identification information are automatically organized in association with each other (see Horita’828, Paragraph [0022]).
Regarding claim 2, Kikuchi in view of Horita’828 further teaches the information processing apparatus according to claim 1, wherein the instructions when executed by the processor, further cause the apparatus to:
add to the detection result image information for arranging the defect information in the image (see Kikuchi, Paragraph [0055], “the user uses a mouse (operation unit 28) attached to the computer to input commands regarding editing of the damage data. The editing unit 40 then edits the damage data based on the editing command. The editing unit 40 also accepts various editing operations such as addition, deletion, and modification.”).
The rationale of claim 1 has been applied herein.
Regarding claim 3, Kikuchi in view of Horita’828 further teaches the information processing apparatus according to claim 2,
wherein the image information is outer shape information of the image (see Kikuchi, Paragraph [0107], “when the crack image in the damage data DD6 is traced with a finger or a pen, the trace is added to the damage drawing data as damage data. Here, the shape and position of the crack image in the damage data DD6 in the CAD drawing 80 correspond to the actual shape and position of the crack when the bridge 1 is viewed from above (the opposite side from the photographed side), and therefore can be input easily and accurately”).
The rationale of claim 2 has been applied herein.
Regarding claim 4, Kikuchi in view of Horita’828 further teaches the information processing apparatus according to claim 3, wherein the instructions when executed by the processor, further cause the apparatus to:
give unique identification information to the outer shape information of the image (see Kikuchi, Paragraph [0106], “The items that are not automatically entered by the damage diagram data creation unit 42 can be input by the user via the operation unit 28 . For example, if water leakage and free lime are visible in the damage data DD6, the user can input the type of damage, "water leakage + free lime," via the operation unit 28”).
The rationale of claim 3 has been applied herein.
Regarding claim 5, Kikuchi in view of Horita’828 further teaches the information processing apparatus according to claim 1, wherein the instructions when executed by the processor, further cause the apparatus to:
transform first coordinate information representing a position of the defect information in the image to second coordinate information (see Kikuchi, Paragraph [0120], “Secondly, there is a mode in which the direction of the mirroring axis is determined based on the angle between the coordinate axes of a global coordinate system (the "first coordinate system") based on the building and the coordinate axes of a local coordinate system (the "second coordinate system") based on the imaging device 10 that captured the image”).
The rationale of claim 1 has been applied herein.
Regarding claim 6, Kikuchi in view of Horita’828 further teaches the information processing apparatus according to claim 5, wherein the instructions when executed by the processor, further cause the apparatus to:
offset coordinate information of defect information for each of the images in a predetermined pattern (see Kikuchi, Fig. 3, and Paragraph [0051], “FIG. 3 exemplifies damage patterns of cracking, peeling, reinforcing bar exposure, free lime, and water leakage”).
The rationale of claim 5 has been applied herein.
Regarding claim 7, Kikuchi in view of Horita’828 further teaches the information processing apparatus according to claim 6,
wherein a user is able to select the predetermined pattern (see Horita’828, Fig. 11 and Paragraph [0101], ‘The color, line type, and pattern applied to the damage identification information 60 are determined in advance by the user U”).
The rationale of claim 6 has been applied herein.
Regarding claim 8, Kikuchi in view of Horita’828 further teaches the information processing apparatus according to claim 5,
wherein the image is an image in which a structure to be inspected is captured, and the second coordinate information is coordinate information representing a position of the image in drawing information of the structure to be inspected (see Kikuchi, Fig. 13 and Paragraph [0104], “The alignment between the CAD drawing 80 and the damage data DD6 is performed using a known technique. For example, the CAD drawing 80 and the damage data DD6 are aligned based on the coordinate data of the CAD drawing 80 and the coordinate data of the damage data DD6”).
The rationale of claim 5 has been applied herein.
Regarding claim 9, Kikuchi in view of Horita’828 further teaches the information processing apparatus according to claim 1, wherein when defect information to which identification information has already been given is detected again, the instructions when executed by the processor, further cause the apparatus to:
give new identification information (see Horita’828, Fig. 12A-12B and Paragraph [0089], “Two captured image identification information 61 correspond to captured images 103 different from each other, respectively. For example, one captured image 103 is a close-up image that shows the state of the damage 77, and the other captured image 103 is a distant view image that shows the position of the damage 77”).
The rationale of claim 1 has been applied herein.
Regarding claim 10, Kikuchi in view of Horita’828 further teaches the information processing apparatus according to claim 1, wherein the instructions when executed by the processor, further cause the apparatus to:
give the same identification information to defect information considered to be the same as defect information to which identification information has already been given (see Kikuchi, Paragraph [0087], “damage data indicating damage such as cracks, water leakage, and free lime for each lattice, the name of the component (in this example, "deck slab"), the element number (Ds0101 to Ds0104, Ds0201 to Ds0204), the type of damage ("crack," "water leakage," "free lime," etc.),”).
The rationale of claim 1 has been applied herein.
Regarding claim 11, Kikuchi in view of Horita’828 further teaches the information processing apparatus according to claim 1, wherein the instructions when executed by the processor, further cause the apparatus to:
output a first detection result for each of the images, and a second detection result for which a detection result of each of the images has been merged for the predetermined group (see Horita’828, Figs.16A-16C and Paragraph [0135], “FIGS. 16A to 16C show the association between the damage identification information 60 and the captured image 103,” the damage type is considered to be the predetermined group; in Fig. 16C the images are merged based on the damage type).
The rationale of claim 1 has been applied herein.
Regarding claim 12, Kikuchi in view of Horita’828 further teaches the information processing apparatus according to claim 1,
wherein defect information of the first detection result is represented in coordinate information representing a position of the defect information in the image, and defect information of the second detection result is represented in coordinate information representing a position of the image in drawing information of a structure to be inspected (see Horita’828, Paragraph [0106], “the damage identification information recognition unit 52 recognizes the position of the damage identification information 60 on the image data 80. The damage identification information recognition unit 52 recognizes, for example, the relative position of the damage identification information 60 with respect to the structural drawing 42 included in the image data 80. The position of the damage identification information 60 can be specified with the structural drawing 42 as a reference”).
The rationale of claim 1 has been applied herein.
As per Claim 16, Claim 16 recites an information processing method comprising the same limitations as claimed in Claim 1. Therefore, the rejection and rationale are analogous to those made in Claim 1.
As per Claim 17, Claim 17 recites a non-transitory computer-readable storage medium storing a program that causes a computer to execute an information processing method comprising the same limitations as claimed in Claim 1. Therefore, the rejection and rationale are analogous to those made in Claim 1.
Kikuchi teaches in Paragraph [0048], “The storage unit 50 stores a program and information required to execute the program. The storage unit 50 is configured by a memory device.”
Claims 13-14 are rejected under 35 U.S.C. 103 as being unpatentable over Kikuchi et al., WO 2019021719 ("Kikuchi"), in view of Horita et al., US 20230112828 ("Horita’828"), and further in view of Horita et al., US 20220120696 ("Horita’696").
Regarding claim 13, Kikuchi in view of Horita’828 further teaches the information processing apparatus according to claim 1,
wherein the information processing apparatus includes a first viewer configured to display the first detection result to be superimposed on the image so as to be editable, (see Kikuchi, Paragraph [0104], “The damage drawing data DD6 generated by the damage drawing data generating unit 42 is superimposed on the CAD drawing 80 and displayed on the display unit 26. FIG. 16 shows an example in which the damage data DD6 shown in FIG. 15 is superimposed on the CAD drawing 80 shown in FIG”)
Kikuchi in view of Horita’828 does not expressly teach
and a second viewer configured to display the second detection result to be superimposed on drawing information of a structure to be inspected so as to be editable
However, Horita’696, in a similar invention in the same field of endeavor, teaches
and a second viewer configured to display the second detection result to be superimposed on drawing information of a structure to be inspected so as to be editable (see Horita’696, Fig. 20 and Paragraph [0049], “A user confirms the first and second detection results displayed on the display unit 3, performs an input of a correction and the like via the operation unit (first editing reception unit and second editing reception unit) 5, and performs a correction or a change of the first and second detection results” and Paragraph [0118], “Here, as the captured image P to be superimposed and displayed as described above, a single captured image may be displayed or the panorama composition image obtained by performing panorama composition of a plurality of the captured images may be displayed by superimposing the repair diagram or the damage diagram on the panorama composition image”).
Kikuchi, Horita’828, and Horita’696 are analogous art because they are all in the same field of endeavor of inspecting damage to a structure. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention for the second detection result to be superimposed on the damage diagram and for the first and second detection results to be displayed side by side at the same time, as in the repair diagram generation device of Horita’696, in the damage data editing device of Kikuchi in view of Horita’828, to accurately and efficiently generate a repair diagram showing a repair region to be repaired and a repair method from a captured image of a structure (see Horita’696, Paragraph [0007]).
Regarding claim 14, Kikuchi in view of Horita’828 in further view of Horita’696 further teaches the information processing apparatus according to claim 13,
wherein content edited in the first viewer is reflected in the second detection result and content edited in the second viewer is reflected in the first detection result (see Horita’696, Fig. 20 and Paragraph [0122]-[0123], “FIG. 20 is a diagram showing a case in which the first detection result and the second detection result are displayed side by side at the same time.”).
The rationale of claim 13 has been applied herein.
Claim 15 is rejected under 35 U.S.C. 103 as being unpatentable over Kikuchi et al., WO 2019021719 ("Kikuchi"), in view of Horita et al., US 20230112828 ("Horita’828"), and further in view of Dick et al., US 20200164904 ("Dick").
Regarding claim 15, Kikuchi in view of Horita’828 teaches the information processing apparatus according to claim 11, but does not expressly teach
wherein the image is a partial image in which a portion of a structure to be inspected is captured, the first detection result includes defect information detected for each partial image, the second detection result includes defect information for which the defect information detected for each partial image has been merged for an entirety of the structure to be inspected.
However, Dick, in a similar invention in the same field of endeavor, teaches
wherein the image is a partial image in which a portion of a structure to be inspected is captured, the first detection result includes defect information detected for each partial image, the second detection result includes defect information for which the defect information detected for each partial image has been merged for an entirety of the structure to be inspected (see Dick, Paragraph [0006], “the identified one or more defects in the portion of the surface of the first rail and the identified one or more defects in the portion of the surface of the second rail; and cause the electronic display device to display (i) at least a portion of the first image, (ii) at least a portion of the second image, (iii) the determined information associated with the identified one or more defects in the portion of the surface of the first rail, (iv) the determined information associated with the identified one or more defects in the portion of the surface of the second rail, or (v) any combination of (i), (ii), (iii), or (iv)”).
Kikuchi, Horita’828, and Dick are analogous art because they are all in the same field of endeavor of inspecting damage to a structure. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to identify one or more defects in at least a portion of the first image and a portion of the second image, as in the system for visualizing and quantifying surface damage of Dick, in the damage data editing device of Kikuchi in view of Horita’828, to identify any defects that exist within each elongated portion of the rail surface (see Dick, Abstract).
Conclusion
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to DOMINIQUE JAMES whose telephone number is (703)756-1655. The examiner can normally be reached 9:00 am - 6:00 pm EST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Emily Terrell can be reached at (571)270-3717. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/DOMINIQUE JAMES/Examiner, Art Unit 2666 /EMILY C TERRELL/Supervisory Patent Examiner, Art Unit 2666