DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Arguments
Applicant’s arguments with respect to claim(s) 1 have been considered but are moot because the new ground of rejection does not rely on any reference citations applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.
MEYNANTS in [0006] discloses that the light emission may instead be pulsed in such a manner that illumination is effected after a rolling reset at a moment when all pixels are sensitive. Therefore, MEYNANTS teaches the light projector unit is toggled into the activated pattern state when all pixel lines in the plurality of pixel lines are concurrently exposed such that occurrences of the activated pattern state coincide at least in part with specific time periods of the specific capture cycle during which the plurality of pixel lines are concurrently exposed.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.
Claims 1-2, 4-7, 19, 25, 38-40, 42-44, 65-68 are rejected under 35 U.S.C. 103 as being unpatentable over MEYNANTS et al. (US 20200292306 A1) in view of Saphier et al. (US 20190388193 A1).
Regarding claim 1. MEYNANTS discloses A scanner for generating 3D data relating to a surface of a target object (abstract, a rolling shutter of the image sensor is configured to expose areas of the image sensor in accordance with the activation of the light sources, so that the pixels in an exposed area are illuminated and the pixels that are outside the exposed area are shielded from illumination; [0001] capture of 3D image information), the scanner comprising:
a. a scanner frame structure on which is mounted a set of imaging modules ([0009] The 3D camera system; [0017] 2 synchronized CMOS image sensors; [0054] the two image sensors in the stereovision system) including:
i. a light projector unit for projecting a structured light pattern onto the surface of the target object ([0012] projection lens providing the predefined light pattern; figure 1, [0033] a light pattern that is projected by the array 1 of addressable light sources 3 on an object that is to be examined); and
ii. one or more rolling shutter cameras for capturing data conveying a set of images including reflections of the structured light pattern projected onto the surface of the target object ([0033] FIG. 2 is a top view of a rolling-shutter image sensor, which is employed to detect the image of a light pattern that is projected by the array 1 of addressable light sources 3 on an object that is to be examined; [0017] 2 synchronized CMOS image sensors; [0054] the two image sensors in the stereovision system), wherein the one or more rolling shutter cameras have sensor surfaces defining a plurality of pixel lines ([0009] an image sensor comprising a two-dimensional array of pixels, which are configured for the detection of a predefined light pattern, and a rolling shutter of the image sensor; [0017] 2 synchronized CMOS image sensors; [0054] the two image sensors in the stereovision system), wherein different pixel lines in the plurality of pixel lines sequentially begin to be exposed over a specific capture cycle (figure 2, [0034] The rolling shutter 4 allows to expose a selected area of the array of pixels to the light pattern that is projected. The exposed area 4* may especially include a group of successive pixel rows; [0035] the application of the read pointer 5 and the reset pointer 6 to control the time interval when the pixels are activated is schematically indicated by arrows; [0036] The window formed by the rolling shutter is scrolled over the array of pixels, thus changing the area of the image sensor that is exposed to incident light; figure 6, [0042] the time intervals of illumination (represented by the bars designated FLASH) and readout for different rows of pixels with illumination are synchronized to exposure for each row); and
b. one or more processors in communication with the set of imaging modules for receiving and processing the data conveying the set of images ([0042] the pixels of that row are read out and the pixel data is transferred to an image processor), wherein the one or more processors are further configured to send control signals to the light projector unit to cause the light projector unit to intermittently project the structured light pattern in accordance with a specific sequence by causing the light projector unit to toggle between an activated pattern state and a deactivated pattern state during the specific capture cycle ([0040] The operation of the array 1 of addressable light sources 3 and the array of pixels in the image sensor 2 is to be synchronized, so that each of the exposed pixels is illuminated, while pixels that are not exposed are not necessarily illuminated), wherein:
i. during the activated pattern state, the light projector unit projects the structured light pattern ([0040] The operation of the array 1 of addressable light sources 3 and the array of pixels in the image sensor 2 is to be synchronized, so that each of the exposed pixels is illuminated; [0033] a light pattern that is projected by the array 1 of addressable light sources 3 on an object that is to be examined); and
ii. during the deactivated pattern state, the light projector unit:
1. omits to project the structured light pattern ([0040] The operation of the array 1 of addressable light sources 3 and the array of pixels in the image sensor 2 is to be synchronized, so that pixels that are not exposed are not necessarily illuminated); or
2. projects a substantially attenuated version of the structured light pattern;
iii. wherein the one or more processors are configured to send control signals to cause the light projector unit to toggle into the activated pattern state when all pixel lines in the plurality of pixel lines are concurrently exposed ([0006] The light emission may instead be pulsed in such a manner that illumination is effected after a global or rolling reset at a moment when all pixels are sensitive) such that occurrences of the activated pattern state coincide at least in part with specific time periods of the specific capture cycle during which the plurality of pixel lines are concurrently exposed ([0006] The light emission may instead be pulsed in such a manner that illumination is effected after a global or rolling reset at a moment when all pixels are sensitive; [0040] The operation of the array 1 of addressable light sources 3 and the array of pixels in the image sensor 2 is to be synchronized, so that each of the exposed pixels is illuminated); and
iv. wherein occurrences of the deactivated pattern state coincide at least in part with other time periods of the specific capture cycle distinct from said specific time periods, wherein during the other time periods:
specific subsets of pixel lines in the plurality of pixel lines are not exposed ([0040] The operation of the array 1 of addressable light sources 3 and the array of pixels in the image sensor 2 is to be synchronized, so that pixels that are not exposed are not necessarily illuminated); and
at least one pixel line in the plurality of pixel lines omitted from the specific subsets is exposed ([0040] The operation of the array 1 of addressable light sources 3 and the array of pixels in the image sensor 2 is to be synchronized, so that each of the exposed pixels is illuminated (by other part of the array 1 of addressable light sources 3 than the deactivated part of the array 1)).
However, MEYNANTS does not explicitly disclose
a set of cameras positioned alongside the light projector unit.
Saphier discloses
a set of cameras positioned alongside light projector units (figure 1, [0243] a handheld wand with a plurality of structured light projectors and cameras disposed within a probe at a distal end of the handheld wand; [0271] A plurality of structured light projectors 22 and a plurality of cameras 24 are coupled to a rigid structure 26 disposed within a probe 28 at a distal end 30 of the handheld wand).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the inventions of MEYNANTS and Saphier by positioning the set of cameras alongside the light projector unit, in order to better scan the object.
Regarding claim 2. MEYNANTS discloses The scanner as defined in claim 1, wherein the specific capture cycle comprises:
a. the specific time periods during which the pixel lines in the plurality of pixel lines are concurrently exposed ([0006] The light emission may instead be pulsed in such a manner that illumination is effected after a global or rolling reset at a moment when all pixels are sensitive; figure 6, the time intervals of illumination and readout for different rows of pixels with illumination synchronized to exposure for each row; figure 4, the reset pointer 6 already starts exposing the next frame; [0040] The time interval during which the pixels are exposed is controlled by the time when the corresponding rows are reset and the time when the corresponding rows are read out); and
b. the other time periods, distinct from the specific time periods, during which specific subsets of pixel lines in the plurality of pixel lines are not exposed (figure 6, the time intervals of illumination and readout for different rows of pixels with illumination synchronized to exposure for each row; figure 4, the reset pointer 6 already starts exposing the next frame; [0040] The time interval during which the pixels are exposed is controlled by the time when the corresponding rows are reset and the time when the corresponding rows are read out).
Regarding claim 4. MEYNANTS discloses The scanner as defined in claim 1, wherein the specific capture cycle comprises one of a plurality of specific capture cycles of the one or more rolling shutter cameras ([0040] The time interval during which the pixels are exposed is controlled by the time when the corresponding rows are reset and the time when the corresponding rows are read out).
Regarding claim 5. MEYNANTS discloses The scanner as defined in claim 4, wherein the one or more processors are further configured to:
a. send a reset signal to the one or more rolling shutter cameras to start a new specific capture cycle of the plurality of specific capture cycles during which the different pixel lines in the plurality of pixel lines are sequentially exposed ([0040] The time interval during which the pixels are exposed is controlled by the time when the corresponding rows are reset and the time when the corresponding rows are read out (10 rows in the example given above));
b. following a first delay period after the sending of the reset signal, send an activation control signal to the light projector unit to cause the light projector unit to toggle into the activated pattern state during the new specific capture cycle ([0040] The operation of the array 1 of addressable light sources 3 and the array of pixels in the image sensor 2 is to be synchronized, so that each of the exposed pixels is illuminated. At each reset and start of exposure of the next row(s) of pixels, a new group of light sources 3 may be activated, so that the group of emitting light sources 3* may change from row to row, or from each set of rows to each set of rows); and
c. following a second delay period after the sending of the activation control signal to the light projector unit, send a deactivation control signal to the light projector unit to cause the light projector unit to toggle into the deactivated pattern state during the new specific capture cycle ([0040] The operation of the array 1 of addressable light sources 3 and the array of pixels in the image sensor 2 is to be synchronized, so that pixels that are not exposed are not necessarily illuminated).
Regarding claim 6. MEYNANTS discloses The scanner as defined in claim 4, wherein the one or more processors are further configured to:
a. following a cycle delay period after an end of the specific capture cycle, send a reset signal to the one or more rolling shutter cameras to start a new specific capture cycle ([0040] The time interval during which the pixels are exposed is controlled by the time when the corresponding rows are reset and the time when the corresponding rows are read out (10 rows in the example given above)).
Regarding claim 7. MEYNANTS discloses The scanner as defined in claim 1, wherein the light projector unit includes a light source configured for emitting light with wavelengths in a specific wavelength range ([0033] The dimensions of the pixels may be typically about 2.5 μm for the wavelength of 940 nm).
Regarding claim 19. MEYNANTS in view of Saphier discloses The scanner as defined in claim 1, wherein the one or more rolling shutter cameras include a rolling shutter color camera for generating image data to derive texture information associated with the surface of the target object (Saphier [0039] at least one of the cameras is configured to capture two-dimensional color images of the object using illumination from the uniform light projector; MEYNANTS abstract, an image sensor comprising pixels, which are configured for the detection of a predefined light pattern, and a rolling shutter of the image sensor).
The same motivation has been stated in claim 1.
Regarding claim 25. MEYNANTS in view of Saphier discloses The scanner as defined in claim 19, wherein the light projector unit is a first light projector unit projecting light of a first type including the structured light pattern, and wherein the scanner comprises a second light projector unit including a second projector light source configured for projecting light of a second type onto the surface of the target object (MEYNANTS [0012] projection lens providing the predefined light pattern; figure 1, [0033] a light pattern that is projected by the array 1 of addressable light sources 3 on an object that is to be examined; Saphier [0009] projecting a “coded” light pattern and imaging the illuminated scene from one or more points of view; [0013] Each of the structured light projectors transmits light using a light source, such as a laser diode. Each light projector may be configured to project a pattern of light defined by a plurality of projector rays when the light source is activated; [0012] The non-coded structured light patterns may include uniform patterns of spots; [0039] the apparatus further includes at least one uniform light projector, configured to project white light onto an object being scanned, and at least one of the cameras is configured to capture two-dimensional color images of the object using illumination from the uniform light projector).
The same motivation has been stated in claim 1.
Regarding claim 38. The same analysis has been stated in claim 1.
Furthermore, MEYNANTS in view of Saphier discloses A scanning system for generating 3D data relating to a surface of a target object (MEYNANTS abstract, a rolling shutter of the image sensor is configured to expose areas of the image sensor in accordance with the activation of the light sources, so that the pixels in an exposed area are illuminated and the pixels that are outside the exposed area are shielded from illumination; [0001] capture of 3D image information), the scanning system comprising:
a. the scanner as defined in claim 1 (see the rejection for claim 1); and
b. a computing system in communication with said scanner (MEYNANTS [0042] the pixels of that row are read out and the pixel data is transferred to an image processor; Saphier [0137] using the processor to run a surface reconstruction algorithm), the computing system being configured for:
i. performing a 3D reconstruction process of the surface of the target object using the data conveying the set of images including the reflections of the structured light pattern captured by the scanner (MEYNANTS [0042] the pixels of that row are read out and the pixel data is transferred to an image processor; Saphier [0019] an image of the three-dimensional intraoral surface is constructed; Saphier [0137] using the processor to run a surface reconstruction algorithm that combines at least one image captured using illumination from the structured light projectors with a plurality of images captured using illumination from the uniform light projector to generate a three-dimensional image of the intraoral three-dimensional surface); and
ii. rendering, on a graphical user interface displayed on a display device, a visual representation of at least a portion of the surface of the target object resulting from the 3D reconstruction process (Saphier [0019] The resultant image, while generally displayed on a two-dimensional screen, contains data relating to the three-dimensional structure of the scanned object, and thus may typically be manipulated so as to show the scanned object from different views and perspectives).
The same motivation has been stated in claim 1.
Regarding claim 39. The same analysis has been stated in claim 1.
Regarding claim 40. The same analysis has been stated in claim 2.
Regarding claim 42. The same analysis has been stated in claim 4.
Regarding claim 43. The same analysis has been stated in claim 5.
Regarding claim 44. The same analysis has been stated in claim 6.
Regarding claim 65. (New) MEYNANTS discloses The scanner as defined in claim 1, wherein the one or more processors are configured to send control signals to cause the light projector unit to toggle into the activated pattern state in response to all pixel lines in the plurality of pixel lines being concurrently exposed ([0006] The light emission may instead be pulsed in such a manner that illumination is effected after a global or rolling reset at a moment when all pixels are sensitive).
Regarding claim 66. (New) MEYNANTS discloses The scanner as defined in claim 1, wherein the one or more processors are configured to send control signals to cause the light projector unit to toggle into the deactivated pattern state when one or more pixel lines of the plurality of pixel lines cease being exposed ([0006] The light emission may instead be pulsed in such a manner that illumination is effected after a global or rolling reset at a moment when all pixels are sensitive).
Regarding claim 67. The same analysis has been stated in claim 65.
Regarding claim 68. The same analysis has been stated in claim 66.
Claims 8, 53-60, 63-64, 69-70 are rejected under 35 U.S.C. 103 as being unpatentable over MEYNANTS et al. (US 20200292306 A1) in view of Saphier et al. (US 20190388193 A1) as applied above in claims 1 and 7, and further in view of KUSAFUKA et al. (US 20230099211 A1).
Regarding claim 8. MEYNANTS in view of KUSAFUKA discloses The scanner as defined in claim 7, wherein the one or more rolling shutter cameras include at least one rolling shutter geometric camera (MEYNANTS [0017] 2 synchronized CMOS image sensors; [0054] the two image sensors in the stereovision system; [0054] The distance is calculated from the relative position of the two image sensors in the stereovision system), said at least one rolling shutter geometric camera being configured for:
a. allowing light with wavelengths in the specific wavelength range to pass through onto the sensor surfaces (MEYNANTS [0033] The dimensions of the pixels may be typically about 2.5 μm for the wavelength of 940 nm; KUSAFUKA [0063] For the camera 11 that is an infrared camera, an infrared filter may be installed on an optical path from the camera 11. The infrared filter blocks light in the visible light region and transmits light in the infrared region); and
b. substantially attenuating light with wavelengths outside the specific wavelength range (MEYNANTS [0033] The dimensions of the pixels may be typically about 2.5 μm for the wavelength of 940 nm; KUSAFUKA [0063] For the camera 11 that is an infrared camera, an infrared filter may be installed on an optical path from the camera 11. The infrared filter blocks light in the visible light region and transmits light in the infrared region).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the inventions of MEYNANTS and Saphier with the invention of KUSAFUKA, to use a filter to only allow the light with wavelengths in the specific wavelength range to pass through onto the sensor surfaces, in order to better scan the object.
Regarding claim 53. The same analysis has been stated in claims 1, 7-8.
Regarding claim 54. The same analysis has been stated in claims 1, 2.
Regarding claim 55. The same analysis has been stated in claim 5.
Regarding claim 56. The same analysis has been stated in claim 5.
Regarding claim 57. MEYNANTS discloses The scanner as defined in claim 53, wherein the specific wavelength range is an infrared wavelength range, a white light wavelength range, or a blue light wavelength range ([0033] The dimensions of the pixels may be typically about 2.5 μm for the wavelength of 940 nm (infrared)).
Regarding claim 58. MEYNANTS discloses The scanner as defined in claim 53, wherein the at least one rolling shutter geometric camera comprises at least two rolling shutter geometric cameras ([0017] 2 synchronized CMOS image sensors; [0054] the two image sensors in the stereovision system).
Regarding claim 59. MEYNANTS in view of KUSAFUKA discloses the at least one rolling shutter geometric camera includes a near infrared camera (MEYNANTS [0033] The dimensions of the pixels may be typically about 2.5 μm for the wavelength of 940 nm (infrared)), the near infrared camera including an infrared filter configured to let infrared light pass and to substantially attenuate light in spectrums outside infrared (KUSAFUKA [0063] For the camera 11 that is an infrared camera, an infrared filter may be installed on an optical path from the camera 11. The infrared filter blocks light in the visible light region and transmits light in the infrared region).
Regarding claim 60. The same analysis has been stated in claim 1.
Regarding claim 63. MEYNANTS in view of Saphier discloses The scanner as defined in claim 53, further comprising one or more processors in communication with the set of imaging modules and configured to:
a. receive and process the data conveying the set of images including the reflections of the structured light pattern to perform a 3D reconstruction process of the surface of the target object (MEYNANTS [0042] the pixels of that row are read out and the pixel data is transferred to an image processor; Saphier [0019] an image of the three-dimensional intraoral surface is constructed; Saphier [0137] using the processor to run a surface reconstruction algorithm that combines at least one image captured using illumination from the structured light projectors with a plurality of images captured using illumination from the uniform light projector to generate a three-dimensional image of the intraoral three-dimensional surface); or
b. transmit the data conveying the set of images including the reflections of the structured light pattern to a remote computing system distinct from the scanner, the remote computing system being configured for performing the 3D reconstruction process of the surface of the target object using the data conveying the set of images including the reflections of the structured light pattern (Saphier figure 1, figure 28A, [0368] Applications of the invention described herein can take the form of a computer program product accessible from a computer-usable or computer-readable medium (e.g., a non-transitory computer-readable medium) providing program code for use by or in connection with a computer or any instruction execution system, such as processor 96 or processor 1058).
The same motivation has been stated in claim 1.
Regarding claim 64. Saphier discloses The scanner as defined in claim 53, wherein the scanner is a handheld scanner (abstract, An apparatus for intraoral scanning includes an elongate handheld wand that has a probe).
The same motivation has been stated in claim 1.
Regarding claim 69. The same analysis has been stated in claim 65.
Regarding claim 70. The same analysis has been stated in claim 66.
Claims 20-22, 24, 45-46 are rejected under 35 U.S.C. 103 as being unpatentable over MEYNANTS et al. (US 20200292306 A1) in view of Saphier et al. (US 20190388193 A1) as applied above in claims 1 and 19, and further in view of May et al. (US 20130258462 A1).
Regarding claim 20. May discloses a rolling shutter color camera comprising a liquid crystal device (LCD) shutter (figure 11, [0076] The shutter 96 may be located between the lens 98 and the camera, the shutter may either be mechanical or electronic, such as an LCD shutter printed on a surface of lens 98).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the inventions of MEYNANTS and Saphier with the invention of May, to use a liquid crystal device (LCD) shutter for the rolling shutter color camera, in order to better scan the object.
Regarding claim 21. May discloses The scanner as defined in claim 20, wherein the rolling shutter color camera comprises:
a. a sensor (figure 11, camera); and
b. a lens, wherein the liquid crystal device (LCD) shutter is positioned between the sensor and the lens (figure 11, [0076] The shutter 96 may be located between the lens 98 and the camera, the shutter may either be mechanical or electronic, such as an LCD shutter printed on a surface of lens 98).
The same motivation has been stated in claim 20.
Regarding claim 22. May discloses The scanner as defined in claim 20, wherein the one or more processors are further configured to send control signals to the LCD shutter for toggling the LCD shutter between an open state and a closed state, wherein in the open state the LCD shutter is translucent and wherein in the closed state the LCD shutter is at least partially opaque ([0076] a shutter 96 is provided which is arranged to alternately occlude the light exiting from the left and right regions of the ocular 94 preferably at a rapid rate such as 60 times per second or higher (for video), under the control of a signal from video processing circuitry).
The same motivation has been stated in claim 20.
Regarding claim 24. MEYNANTS in view of Saphier and May discloses The scanner as defined in claim 22, wherein toggling the LCD shutter between the open state and the closed state at least partially coincides with toggling the light projector unit between the activated pattern state and the deactivated pattern state so that:
a. the LCD shutter is in the open state at least partially concurrently while the light projector unit is in the activated pattern state; and
b. the LCD shutter is in the closed state at least partially concurrently while the light projector unit is in the deactivated pattern state (MEYNANTS [0040] The operation of the array 1 of addressable light sources 3 and the array of pixels in the image sensor 2 is to be synchronized, so that each of the exposed pixels is illuminated, while pixels that are not exposed are not necessarily illuminated; Saphier [0039] at least one of the cameras is configured to capture two-dimensional color images of the object using illumination from the uniform light projector; May [0076] a shutter 96 is provided which is arranged to alternately occlude the light exiting from the left and right regions of the ocular 94 preferably at a rapid rate such as 60 times per second or higher (for video), under the control of a signal from video processing circuitry).
The same motivation has been stated in claim 20.
Regarding claim 45. The same analysis has been stated in claims 19, 20 and 22.
Regarding claim 46. The same analysis has been stated in claim 24.
Claim 28 is rejected under 35 U.S.C. 103 as being unpatentable over MEYNANTS et al. (US 20200292306 A1) in view of Saphier et al. (US 20190388193 A1) as applied above in claims 1, 19 and 25, and further in view of Alvarez et al. (US 20230174970 A1).
Regarding claim 28. Alvarez discloses a filter for blocking at least in part wavelengths of light corresponding to wavelengths of light sources ([0277] using a proper selection of multiple band pass filters for excitation and emission, e.g., for the light sources and for the cameras, a microscope can capture the desired fluorescence signals, e.g., the desired colors or wavelength ranges, with other wavelengths effectively blocked out).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the inventions of MEYNANTS and Saphier with the invention of Alvarez, such that the rolling shutter color camera comprises a filter for blocking at least in part wavelengths of light corresponding to the wavelength of light projected by the first light projector unit, in order to better scan the object when using the rolling shutter color camera.
Claims 61-62 are rejected under 35 U.S.C. 103 as being unpatentable over MEYNANTS et al. (US 20200292306 A1) in view of Saphier et al. (US 20190388193 A1) as applied above in claims 1 and 7, and further in view of KUSAFUKA et al. (US 20230099211 A1) and May et al. (US 20130258462 A1).
Regarding claim 61. The same analysis has been stated in claims 19-22.
Regarding claim 62. The same analysis has been stated in claim 24.
Conclusion
THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to XIAOLAN XU whose telephone number is (571) 270-7580. The examiner can normally be reached Mon. to Fri. 9am-5pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, SATH V. PERUNGAVOOR can be reached at (571) 272-7455. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/XIAOLAN XU/ Primary Examiner, Art Unit 2488