DETAILED ACTION
Notice of Pre-AIA or AIA Status
1. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
2. This Office Action is in response to amendments and remarks filed on 01/12/2026. Claims 1-11 are currently pending.
Claim Rejections - 35 USC § 102
3. The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless -
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
4. Claims 1, 3, 4, 8, 9, and 11 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Simske et al. (US 2015/0097936 A1).
Regarding claims 1, 8, and 9, Simske et al. disclose an information processing apparatus comprising: at least one memory that is configured to store instructions (Fig. 2 and paragraph [0043], “a memory 167”); and
at least one first processor that is configured to execute the instructions ([0043], “Controller 150 may include a processor 165 for processing machine-readable instructions, such as processor-readable (e.g., computer-readable) instructions. These machine-readable instructions may be stored in a memory 167”);
to detect a target-to-target distance between an imaging target and a light irradiation unit (133, including beam 135) that irradiates light for scanning the imaging target (see Fig. 1 and [0041], “controller 150 may be configured to perform a focusing method, e.g., in response to determining that target region 122 is not in focus, to bring target region 122 into focus. Adjusting a distance d (FIG. 1) from afocal optical system 126 to target region 122”); to start irradiation with the light by the light irradiation unit ([0014] and [0019], “Afocal optical system 126 facilitates capturing a fingerprint from target region 122 when target region 122 is at a distance. For example, positioning a finger so that crossing point 142 lands on a predetermined location of target region. During operation, target region 122 reflects the light from beams 135 to afocal optical system 126”, showing that the beams 135 are triggered for fingerprint capture) in a case where the target-to-target distance is a desired distance (“crossing point 142”; [0019], “positioning a finger so that crossing point 142 lands on a predetermined location of target region 122”); and to output guide information (alignment beams 140, Fig. 1) at a desired position ([0019], “Beams 140 may cross each other at a crossing point 142 that is aligned with afocal optical system 126”, and [0039], “the feedback alignment method may be used in conjunction with positioning the finger so that crossing point 142 lands on a predetermined point of target region 122”) in the air (Fig. 1; since the finger is positioned at crossing point 142, and [0009] states the finger is in mid-air separated by an air gap 124, the crossing point 142 is located in the air), the guide information being for guiding the imaging target to the desired position in the air ([0039], “the feedback alignment method may be used in conjunction with positioning the finger so that crossing point 142 lands on a predetermined point of target region 122”; [0042], “the controller 150 may instruct…to move its finger closer to or further away from afocal optical system 126 until it determines that target region 122 is in focus”).
Regarding claim 3, Simske et al., as discussed in claim 1, disclose the at least one first processor (150, Fig. 2) being configured to execute the instructions to output guide information for guiding the imaging target to a desired position ([0036], “controller 150 may be configured to perform a feedback alignment method… that properly aligns target region 122 with afocal optical system 126”), and that the guide information includes at least one of visual information (“display 155”, [0038]), auditory information, and tactile information.
Regarding claim 4, Simske et al., as discussed in claim 3, disclose wherein the visual information is an image (the crossing point 142, Fig. 1) visibly shown (visible light beams that land on the finger; see Fig. 1 and [0017], “visible light”) at the desired position ([0019], “positioning a finger so that crossing point 142 lands on a predetermined location of target region 122, e.g., the center of target region 122, may properly align target region 122”).
Regarding claim 11, Simske et al., as discussed in claim 1, disclose wherein the at least one first processor is configured to execute the instructions to: output two guiding light beams to the desired position in the air (Fig. 1, [0009] and [0018], “light sources 130 may be configured to emit alignment beams 140” in a gap 124, e.g., of air), wherein
one point is formed by the two guiding light beams in a case where the imaging target is positioned at the desired position in the air ([0019], “Beams 140 may cross each other at a crossing point 142 that is aligned with afocal optical system 126”), and
two points are formed by the two guiding light beams in a case where the imaging target deviates from the desired position in the air (this limitation is inherently included; see [0039], “the feedback alignment method may be used in conjunction with positioning the finger so that crossing point 142 lands on a predetermined point of target region 122”, showing that if the finger is not at the predetermined point, i.e., not at the correct position, the beams hit the finger at two different spots, whereupon the user is instructed to “move its finger closer to or further away”, [0042]; Fig. 1 also shows that each beam hits a different spot).
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claim 2 is rejected under 35 U.S.C. 103 as being unpatentable over Simske et al. in view of Peel et al. (US 2021/0333699 A1).
Regarding claim 2, although Simske et al., as discussed in claim 1, disclose the at least one first processor configured to execute the instructions to move the afocal optical system 126 ([0041], “controller 150 may be configured to perform a focusing method…by moving afocal optical system 126”) to a position suitable for scanning of the imaging target ([0041], to bring target region 122 into focus) based on the target-to-target distance ([0041], “Adjusting a distance d (FIG. 1) from afocal optical system 126 to target region 122”), and the light irradiation unit is coupled to the afocal optical system 126, Simske et al. do not explicitly disclose moving the light irradiation unit as claimed. Peel et al. disclose moving the light irradiation unit to a position suitable for scanning of the imaging target ([0073], “movements of the light source relative to the lens to modify a perspective of the projected image”). Thus, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Simske et al. by utilizing the teaching of Peel et al. to better adjust the optical path, thereby obtaining higher-quality fingerprint captures.
Claims 5 and 6 are rejected under 35 U.S.C. 103 as being unpatentable over Simske et al. in view of Mimura et al. (JP2009251837A, cited in IDS).
Regarding claim 5, Simske et al., as discussed in claim 4, do not disclose the at least one first processor being configured to execute the instructions to change at least one of a color and a shape of the image in accordance with the target-to-target distance as claimed. Mimura et al. disclose a processor that executes instructions (steps 1001, 1006, Fig. 10) to change at least one of a color and a shape of the image (Fig. 2; pattern 201 and pattern 202 are displayed when the distance from the sensor is too far or too close, showing that the pattern changes depending on finger distance) in accordance with the target-to-target distance (the distance from the sensor 101 to the finger, Fig. 1). Thus, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Simske et al. by utilizing the teaching of Mimura et al. to improve imaging focus and finger capture quality.
Regarding claim 6, Simske et al., as discussed in claim 3, do not disclose the first output operation and the second output operation as claimed. Mimura et al. disclose at least one first processor being configured to execute the instructions to perform: a first output operation of outputting first guide information ([0018], “too right”/“too left” messages, or a red/blue pattern) for guiding the imaging target into a desired position range ([0018], “Step 1005: A message such as ‘It is too right’ is displayed, and the process returns to Step 1001. Step 1006: A message such as ‘Too left’ is displayed, and the process returns to Step 1001”), and a second output operation of outputting second guide information (201, Fig. 2) for guiding the imaging target to the desired position ([0011], “Reference numeral 201 denotes a pattern displayed on the finger when the finger is at an appropriate distance”), wherein the processor performs the first output operation at a start of an operation ([0017]-[0018], at step 1001, lights of two colors, red and blue, are projected from the light source 703) and switches from the first output operation to the second output operation in a case where the imaging target enters the desired position range ([0017], “Step 1003: Analyze the color distribution of the pattern image… If there is, the process branches to step 1004”). Thus, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Simske et al. by utilizing the teaching of Mimura et al. to improve imaging focus and finger capture quality.
Claim 7 is rejected under 35 U.S.C. 103 as being unpatentable over Simske et al. in view of Yamamoto (US 2023/0121799 A1).
Regarding claim 7, Simske et al., as discussed in claim 1, do not disclose an initial position of the light irradiation unit being determined on the basis of statistical information on a position of the imaging target at a start of an operation of detecting the target-to-target distance in a past, as claimed. Yamamoto discloses an initial position of the light irradiation unit being determined (Figs. 2 and 9, an initial position deciding unit 64 of the light source 20; and, as shown in Fig. 19, the initial position deciding unit 64 creates a histogram based on the acquired one or more in-focus positions) on the basis of statistical information (based on the in-focus position recording unit 63) on a position of the imaging target at a start of an operation of detecting the target-to-target distance in a past (see Fig. 18). Thus, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Simske et al. by utilizing the teaching of Yamamoto to reduce or minimize alignment issues during scanning.
Claim 10 is rejected under 35 U.S.C. 103 as being unpatentable over Simske et al. in view of Hasegawa (US 2014/0240582 A1).
Regarding claim 10, Simske et al., as discussed in claim 1, do not disclose the retroreflective plate as claimed. Hasegawa discloses a retroreflective plate ([0048]). In the combination, an image displayed by a display would be retroreflected by the retroreflective plate and would be formed at the desired position in the air for guiding the imaging target to the desired position in the air, as claimed. Thus, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Simske et al. by utilizing the teaching of Hasegawa for creating sharp images in a specific direction (Hasegawa, [0048]).
Response to Arguments
5. Applicant’s arguments with respect to the claim(s) have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.
Conclusion
6. Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any extension fee pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to MAI THI NGOC TRAN, whose telephone number is (571) 272-3456. The examiner can normally be reached Monday-Friday, 9:00 am-5:30 pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, GEORGIA EPPS, can be reached at (571) 272-2328. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/M.T.T./Examiner, Art Unit 2878
/THANH LUU/Primary Examiner, Art Unit 2878