DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Arguments
Applicant’s arguments have been fully considered; however, the Examiner finds that the arguments require importing limitations into the claims. The Examiner suggests amending the claims to include these additional limitations to distinguish over Prakash.
Applicant argues that “matching” is used in the sense of matching features in images as explained at paragraph [0062]. The Examiner does not find that the specification has defined or limited the term “matching” to mean “matching features of interest in images” as applicant intends. Paragraph [0062] provides examples of what processing of image data may comprise. Examples are not found to be definitions, and thus the claims are not limited to the examples. For instance, paragraph [0062] states:
Processing of the imaging data may comprise detecting features of interest in the respective data of the two or more imaging devices 6015, 6025. Processing of the imaging data may comprise matching features of interest, for example computed using a feature detector, for example using a Shi-Tomasi detector, between the data of a first imaging device and a second imaging device. Processing of the imaging data may comprise, using data of features that match between images, for example that correlate above a given threshold, estimating the relative position and orientation, for example using one or more three-dimensional computer vision methods, for example using epipolar geometry, of the first and the second imaging devices with respect to the measurement location 6500.
The passage may describe what “matching features” may comprise, but it is not found to redefine the term. The term “matching” is broader than “matching features of interest in an image.”
Nothing is found that “matching”:
is comparing intensity values of each sparkle or is matching locations of each sparkle,
means obtaining locations of sparkles and matching the locations of sparkles,
excludes the overlapping of images so that the sparkles of one image overlap sparkles of the other image,
requires the claim to use a feature detector,
defines a feature of interest to be “sparkle,”
requires the importation from the specification of “…may comprise, using data of features that match between images, for example that correlate above a given threshold, estimating the relative position and orientation, for example using one or more three-dimensional computer vision methods, for example using epipolar geometry, of the first and the second imaging devices with respect to the measurement location 6500.”
Applicant further emphasizes that “matching” is “using data of features that match between images.” This phrase is not imported into the claim, nor does it clarify how the matching data are used. The phrase describes that the data used are of features that match, not how the matching is achieved.
Save for Applicant’s argument, nothing is found in the record that requires “matching” to be matching features in images as intended by Applicant. As such, the Examiner is not persuaded that the claim excludes the overlapping of images as taught by Prakash, which would result in matching sparkles overlapping, i.e., matching with each other.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 38-41, 43-44, 48, 49, and 51-53 are rejected under 35 U.S.C. 103 as being unpatentable over Prakash (US 2016/0005187) in view of Fukamizu (US 7,667,856), Beymore (US 2014/0078293), and Kozko et al. (US 9,007,587).
With respect to claim 38, Prakash shows a process and device for matching color comprising:
a first imaging device (15) comprising an image sensor (inherent) and a lens (inherent) having an optical axis;
a first illumination source (12; Fig. 2) having an illumination axis intersecting the first imaging device lens' optical axis at an angle of 45 degrees, the optical axes of the first imaging device lens and the first illumination source defining a first measurement plane;
a second illumination source (12; Fig. 2) having an illumination axis intersecting the first imaging device lens' optical axis at the intersection of the first illumination source's optical axis and the first imaging device lens' optical axis, wherein the second illumination source's optical axis is disposed at an angle in a range of 15 degrees to 75 degrees with respect to the first imaging device lens' optical axis (see 75 degrees in Fig. 2 Prakash), and
a second imaging device (15), spaced from the first imaging device, comprising an image sensor and a lens having an optical axis (inherent) intersecting the first illumination source's optical axis at the intersection (on surface 11 at z axis) of the first illumination source's optical axis and the first imaging device lens' optical axis, wherein the first illumination source's optical axis is disposed at a specular angle ([0089]-[0090]) with respect to the second imaging device lens' optical axis;
a computer processor ([0023]);
a non-volatile memory ([0023],[0050]) comprising computer-readable instructions which, when executed by the computer processor, cause the mobile device to:
acquire data from the first and second imaging devices and derive reflectance information of the surface of interest; and
wherein the reflectance information comprises an aspecular color reflectance information ([0089]), specular reflectance information ([0089]), and sparkle reflectance information ([0056]);
wherein a first field of view corresponding to the first imaging device and a second field of view corresponding to the second imaging device form an overlapping region (See Figs. 1A and 1B showing the cameras viewing the same point and thus there would be overlap. See discussion of Fukamizu below also), and
wherein the mobile device is configured to derive the sparkle reflectance information by processing image data of the overlapping region to derive effect pigment reflectance information for the surface of interest, wherein processing the image data of the overlapping region comprises matching sparkles detected by the first imaging device with sparkles detected by the second imaging device (See para. [0096] “The measured sparkle values can be obtained at one or more sparkle illumination angles, one or more sparkle viewing angles, or a combination thereof. In one example, measured sparkle values can be obtained at about 15° sparkle illumination angle. In another example, measured sparkle values can be obtained at about 45° sparkle illumination angle. In yet another example, measured sparkle values can be obtained at about 75° sparkle illumination angle. In a further example, measured sparkle values can be obtained at a combination of one or more of about 15°, about 25°, about 45°, and about 75° sparkle illumination angles. In yet one example, the measured sparkle values can be obtained at a sparkle viewing angle selected from about 15°, about 25°, about 45°, about 75°, or a combination thereof, the sparkle viewing angle being an aspecular angle.” The sparkle values obtained from the combination of viewing angles are views of the same overlapping region).
Prakash does not show:
1) that the optical axes of the first illumination source and the second illumination source are at the intersection,
2) the second measurement plane is different from the first measurement plane, and
3) form a three-dimensional model of the surface of interest based on the reflectance information;
Obviousness of optical axis of the illumination sources at the intersection:
Fukamizu shows an optical characteristic measuring apparatus wherein the optical axis (L1) of an illumination source (11, 13, 13a) is aimed at an intersection (CT).
Before the effective filing date of the claimed invention, it would have been obvious to aim the optical axis of the illumination sources of Prakash at the intersection in order to optimize the illumination of the area of the object being measured. One of ordinary skill in the art would recognize this would improve the intensity of the illumination or allow a lower power source to be used and reduce stray light from illuminating the area at undesirable angles.
Obviousness of the two measurement planes being on different planes:
Beymore shows characterizing multi-angular color, opacity, pigment, and texture wherein a first measurement path and a second illumination path define a second measurement plane different from the first measurement plane. See Fig. 4, which shows two illumination paths at 170 deg. and 105 deg. and two rings of illumination paths, a first ring at 45 deg. elevation and a second ring at 30 deg. elevation.
[Image: media_image1.png (greyscale), reproducing Beymore Fig. 4]
Before the effective filing date of the claimed invention, it would have been obvious to add more cameras and light sources in a ring arrangement as shown by Beymore to the device of Prakash in order to determine additional properties such as opacity, pigment, and texture. In other words, the illumination sources and cameras would not all lie in the same plane.
Obviousness of forming a three-dimensional model of the surface of interest:
Kozko shows measuring color of a surface and teaches that measured color is often used to determine a shape of a surface (col. 1:36-45). Before the effective filing date of the claimed invention, it would have been obvious to use the measured reflectance data, i.e., color, to form a three-dimensional model, i.e., shape, of the surface of interest.
39. The mobile device of claim 38, wherein the optical axes of the second imaging device lens and the second illumination source define a third measurement plane different from the first and second measurement planes (para. [0167] Prakash; the illumination and viewing angles being different based on Prakash).
40. The mobile device of claim 38, wherein the second imaging device lens' optical axis is in the first measurement plane (Fig. 1A).
41. The mobile device of claim 38, wherein the first and second measurement planes are orthogonal to each other (90 degrees paras. [0083],[0097] Prakash).
43. The mobile device of claim 38, wherein the mobile device is configured to process image data of the overlapping region to form three-dimensional data of the surface of interest within the overlapping region (the cameras are shown to point at Z=0 and therefore the imaged regions overlap; “said specimen color data comprise specimen color data values measured at said three or more color viewing angles” para. [0006]).
44. The mobile device of claim 38, wherein the mobile device is configured to process image data of the overlapping regions to derive surface texture appearance information for the surface of interest (the cameras are shown to point at Z=0 and therefore the imaged regions overlap; “said specimen color data comprise specimen color data values measured at said three or more color viewing angles” para. [0006]; “appearance includes texture” para. [0045]).
48. The mobile device of claim 38, wherein the first and second illumination sources are mounted on a lighting accessory (6100, 6200) attached to the mobile device (inherent; the sources are attached to something), the accessory being attached to the measurement device (45, para. [0282]).
49. The mobile device of claim 48, wherein the lighting accessory includes a controller to illuminate the first and second illumination sources independently of each other (one of ordinary skill would understand that Prakash’s “varying illumination angles” refers to operating the light sources at different angles in a separate manner, and thus there would be a controller).
51. With respect to claim 51, Prakash shows the sources are disposed to provide a plurality of illumination angles comprised in a range from 10 degrees to 75 degrees with respect to the optical axis of one or more of the imaging devices but does not show that the first and second illumination sources are comprised in a plurality of collimated illumination sources.
Fukamizu shows illumination sources that are collimated. Before the effective filing date of the claimed invention, it would have been obvious to use collimated light sources in order to efficiently direct the light output from the source to the spot of measurement. One of ordinary skill in the art would recognize this would allow lower power use by the illumination source and reduce stray light illuminating the spot at undesirable angles.
52. The mobile device of claim 38, wherein the computer-readable instructions further comprise instructions which, when executed by the computer processor, cause the mobile device to acquire a burst of images from the first imaging device and to form an HDR image (see para. [0167]).
53. With respect to claim 53, Prakash does not explicitly state that the computer has instructions to cause the mobile device to acquire images from the first imaging device and the second imaging device simultaneously or separately. There being only two choices, both options would have been obvious as well as anticipated. Before the effective filing date of the claimed invention, it would have been obvious to make the computer programmed to obtain images simultaneously in order to quickly obtain image data.
Claims 46 and 47 are rejected under 35 U.S.C. 103 as being unpatentable over Prakash, Fukamizu, and Beymore as applied to claim 38 above, and further in view of Ehbets.
Prakash shows all the limitations as discussed for claim 38 above but does not explicitly show the use of white LEDs as a light source.
Ehbets shows a spectrophotometer that uses white LEDs (para. [0038]) as an illumination source. Before the effective filing date of the claimed invention, it would have been obvious to use a white LED for nothing more than the expected result of illuminating the specimen for color detection. As to claim 14, Ehbets states a white LED emits light “over the whole visible spectral range” and thus would have red, green, and/or blue. As to claim 36, one of ordinary skill in the art would understand that the output lens of LED 13 shown in 1 collimates the light emitted by the LED diode.
Claim 50 is rejected under 35 U.S.C. 103 as being unpatentable over Prakash, Fukamizu, and Beymore as applied to claim 38 above, and further in view of Ishimaru (US 2013/0215428).
Prakash shows all the limitations as discussed for claim 38 above but does not explicitly show an image sensor positioned in the imaging device’s Fourier transform plane.
Ishimaru shows a spectroscopic measurement device wherein the light receiving surface is placed in the optical Fourier-conversion plane of a cylindrical lens (para. [0094]).
Before the effective filing date of the claimed invention, it would have been obvious to use the spectroscopic measurement device of Ishimaru for its high accuracy and wide band sensitivity (paras. [0007]-[0012]).
Conclusion
All claims are identical to or patentably indistinct from, or have unity of invention with claims in the application prior to the entry of the submission under 37 CFR 1.114 (that is, restriction (including a lack of unity of invention) would not be proper) and all claims could have been finally rejected on the grounds and art of record in the next Office action if they had been entered in the application prior to entry under 37 CFR 1.114. Accordingly, THIS ACTION IS MADE FINAL even though it is a first action after the filing of a request for continued examination and the submission under 37 CFR 1.114. See MPEP § 706.07(b). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to Hwa Andrew S Lee whose telephone number is (571)272-2419. The examiner can normally be reached Mon-Fri 9am-5:30pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Uzma Alam can be reached at 571-272-3995. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/Hwa Andrew Lee/Primary Examiner, Art Unit 2877