Notice of Pre-AIA or AIA Status
1. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
DETAILED ACTION
The United States Patent & Trademark Office acknowledges the response to the current application submitted on 12/14/2023. The Office has reviewed the submitted documents and provides the comments below.
Amendment
Applicant submitted amendments on 1/21/2026. The Examiner acknowledges the amendment and has reviewed the claims accordingly.
Applicant Arguments:
Regarding Argument 1, Applicant states that Zhang, Border, and Morioka, either alone or in combination, do not describe or suggest the amended claim language (Remarks, page 11, paragraph 2).
Regarding Argument 2, Applicant states that Border, either alone or in combination with Zhang, does not describe or suggest using a first method to obtain first information that is information pertaining to a position of the subject from a measurement unit associated with the subject (Remarks, page 12, paragraph 3).
Regarding Argument 3, Applicant states that Border, either alone or in combination with Zhang, also does not describe using a second method, different from the first method, to obtain second information that is information pertaining to the position of the subject determined from the captured image of the subject (Remarks, page 12, paragraph 3).
Regarding Argument 4, Applicant states that Border does not describe or suggest to a person of ordinary skill in the art receiving GPS or similar tracking information from a measurement unit associated with the subject and determining the positional relationship of the subject with respect to the camera as a first method for tracking (Remarks, page 12, paragraph 3).
Regarding Argument 5, Applicant states that Border does not describe or suggest to a person of ordinary skill in the art utilizing a second method, based on recognition of the subject in the captured images, as a different method for tracking the subject captured by the camera (Remarks, page 13, paragraph 1).
Regarding Argument 6, Applicant states that Border involves a totally different use of the cameras, for a very different purpose (i.e., capturing the eye glint of the eye of a VR or AR headgear wearer), and that a person of ordinary skill in the art would not understand the need or purpose of obtaining subject information (Remarks, page 13, paragraph 1).
Regarding Argument 7, Applicant states that even if combined with Zhang and Border, Morioka does not describe or suggest performing at least one of changing the angle of view in a first direction or changing a magnitude of the angle of view, based on the first information (Remarks, page 13, paragraph 1).
Regarding Argument 8, Applicant states that even if combined with Zhang and Border, Morioka does not describe or suggest performing at least one of changing the angle of view in a second direction different from the first direction or changing a magnitude of the angle of view, based on the second information (Remarks, page 14, paragraph 2).
Regarding Argument 9, Applicant states that even if combined with Zhang and Border, Morioka does not describe or suggest first information that is information pertaining to a position of the subject from a measurement unit associated with the subject and second information that is information pertaining to the position of the subject determined from the captured image of the subject (Remarks, page 14, paragraph 2); thus, Applicant asserts that the rejection under 35 U.S.C. 103 should be withdrawn.
Regarding Argument 10, Applicant states that Besley does not resolve the aforementioned deficiencies of Zhang, Border, and Morioka with respect to the amended claim language; thus, Applicant asserts that the rejection under 35 U.S.C. 103 should be withdrawn (Remarks, page 15, paragraph 1).
Examiner’s Responses:
Regarding Arguments 1-10, Applicant’s arguments and amendments, see Remarks, filed 1/21/2026, with respect to the rejections of claims 1-17 under 35 U.S.C. 103 have been fully considered and are persuasive. Therefore, the rejection has been withdrawn. However, upon further consideration of the amendment, a new ground(s) of rejection is made in view of Kato in view of Wakamatsu and further in view of Higuchi.
With regard to the new rejection of Claim 1 and the other independent claims, the Examiner would like to highlight that Kato et al. (US Patent Pub No. US 20160358019A1, hereafter referred to as Kato) teaches an image capture apparatus that has a GPS measurement unit as a first information obtainment unit and a direction obtaining unit, by way of the image capturing unit, as a second information obtainment unit. The Examiner combines Kato with Wakamatsu (US Patent Pub No. US 20170163894A1, hereafter referred to as Wakamatsu). Wakamatsu teaches a blur prevention characteristics changing unit that changes the state of the angle of view several times through extracting, zooming, panning, and tilting, based on the measurement data and position data collected. When a camera is operated to capture a subject within an angle of view and the subject moves out of the angle of view, it may be difficult for a photographer to perform fine adjustment to capture the subject within the angle of view or to position the subject at the central position of a photographed image. When a subject tracking computation amount is combined with a blur computation amount, subject tracking is controlled while correcting an image blur, ensuring that the subject is captured and can be identified within the image using location, position, and angle capturing information (paragraph 4-6, Wakamatsu). A POSITA would combine Kato and Wakamatsu since there is a need to detect and capture a subject in a photo as accurately as possible, which would mean removing the presence of image blur.
The details of the rejection are below.
Claim Rejections - 35 USC § 103
16. In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
17. Claims 1-4 and 7-17 are rejected under 35 U.S.C. 103 as being unpatentable over Kato et al. (US Patent Pub No. US 20160358019A1, hereafter referred to as Kato) in view of Wakamatsu (US Patent Pub No. US 20170163894A1, hereafter referred to as Wakamatsu).
18. Regarding Claim 1, Kato teaches an image capturing apparatus comprising: an image capturing device that captures an image of a subject (Fig. 2, paragraph 46-50 and 148, Kato teaches the photography of a subject using an optical lens unit configured by a focus lens and a zoom lens, where an optoelectronic conversion device configured by a complementary metal oxide semiconductor, CMOS, forms an image with light incident through the optical lens unit.),
and at least one processor or circuit (paragraph 48 and 157, Kato teaches a plurality of circuits, including peripheral circuits included in the optical lens unit, to adjust setting parameters such as focus, exposure, and white balance.) configured to function as: a first obtainment unit that (paragraph 71, Kato teaches a location obtainment unit as a first information obtaining unit.), using a first method, obtains first information that is information pertaining to a position of the subject (paragraph 71, Kato teaches latitude, longitude, and altitude location information obtained from the first information obtaining unit.), received from a measurement unit associated with the subject (paragraph 71-74, Kato teaches a location obtaining unit that obtains the location of the image capture apparatus by the GPS unit.), and a second obtainment unit that (paragraph 72, Kato teaches a direction obtaining unit as a second information obtaining unit.), using a second method different from the first method, obtains second information that is information pertaining to the position of the subject determined from the captured image of the subject (paragraph 72-74, Kato teaches a direction obtaining unit obtaining the direction of the image capture apparatus based on an orientation of a geomagnetic field detected by the geomagnetic sensor unit.).
Kato does not teach a changing unit that changes an angle of view of the image capturing device, wherein the changing unit performs at least one of changing the angle of view in a first direction or changing a magnitude of the angle of view, based on the first information, and then performs at least one of changing the angle of view in a second direction different from the first direction or changing a magnitude of the angle of view, based on the second information.
Wakamatsu is in the same field of art of image capturing control for object tracking. Further, Wakamatsu teaches a changing unit that changes an angle of view of the image capturing device (Fig. 8B and paragraph 106-114, Wakamatsu teaches a blur prevention characteristics changing unit that changes the blur prevention characteristics according to the determination result on the subject detection state obtained by the detection state determination unit, which tracks and changes the device capturing angle to ensure the subject is placed back in the frame when outside the frame.).
wherein the changing unit performs at least one of (Under BRI, the Examiner interprets the claims to require only one since they state “at least one of”) changing the angle of view in a first direction or changing a magnitude of the angle of view, based on the first information (paragraph 36-39, 81, 106-108, 114, Wakamatsu teaches the blur prevention characteristics changing unit using panning determination data of the panning determination unit, which is information about angular velocity, blur correction angles, and tracking data, obtained from the tracking control unit, which is configured to perform tracking control of the subject based on the position information of the subject acquired by the subject detection unit, to determine if there is a need for the imaging apparatus to pan. The Examiner interprets the blur prevention characteristics changing unit determining that there is a need for the imaging apparatus to pan, and then doing so, as changing the angle of view in a first direction or changing a magnitude of the angle of view based on the first information.) and then performs at least one of (Under BRI, the Examiner interprets the claims to require only one since they state “at least one of”) changing the angle of view in a second direction different from the first direction or changing a magnitude of the angle of view, based on the second information (paragraph 66, Wakamatsu teaches the second state or second performance of the image blur correction effect, in which the effect in the second state is stronger than in the first state, including the panning determination, which would be stronger than/different from the first, an example of changing the angle of view.).
Therefore, it would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to modify the invention of Kato by incorporating the method of changing the angle of view and its magnitude based on two different pieces of information, using a changing unit as taught by Wakamatsu, to make an invention that can track the subject in an image frame based on the global position and object recognition features. One of ordinary skill in the art would be motivated to combine the references since there is a need to suppress and correct image blur of high magnitude, such as that caused by a handshake, and to perform face detection or human detection when the subject is a known person (paragraph 4, Wakamatsu).
Thus, the claimed subject matter would have been obvious to a person having ordinary skill in the art before the effective filing date.
19. Regarding Claim 2, Kato in view of Wakamatsu teaches wherein the first obtainment unit obtains the first information by communicating with a measurement device with which the subject is provided (paragraph 71, Kato teaches detection by the GPS unit, where a GPS unit is a measurement device and detection by a GPS unit is an example of communicating with a measurement device.).
20. Regarding Claim 3, Kato in view of Wakamatsu teaches wherein the first obtainment unit obtains the first information using a global positioning system (GPS) (paragraph 41, 44, 53, and 71, Kato teaches the location obtaining unit as a first information obtaining unit that obtains a location of the image capture apparatus detected by the GPS unit.).
21. Regarding Claim 4, Kato in view of Wakamatsu teaches wherein the second obtainment unit obtains the second information from an image captured by the image capturing device (paragraph 66-67, 72, 122, 133, Kato teaches the direction obtaining unit as the second information obtaining unit that obtains the direction of the image captured by the image capturing unit based on an orientation of the geomagnetic field detected by the geomagnetic sensor unit.).
22. Regarding Claim 7, Kato in view of Wakamatsu teaches wherein the changing unit drives a device that changes the angle of view of the image capturing device mechanically (paragraph 88-89, 96-98, Wakamatsu teaches a mechanism for rotating a lens barrel for the performance of the image blur correction and automatic subject tracking, which is a mechanical process, in the imaging apparatus.).
23. Regarding Claim 8, Kato in view of Wakamatsu teaches wherein the changing unit changes the angle of view of the image capturing device by cropping a part of the image (Fig. 4, paragraph 46, 50-53, 84, 111-113, Wakamatsu teaches the optical blur prevention control to determine the position and size of a partial area, zooming as changing the size of the extracted area compared to the full sensor image, and pan/tilting as moving the coordinates of the extraction area within the larger image frames.).
24. Regarding Claim 9, Kato in view of Wakamatsu teaches wherein the changing unit further changes the angle of view based on the second information after changing the angle of view based on the first information (paragraph 66, Wakamatsu teaches the second state or second performance of the image blur correction effect, in which the effect in the second state is stronger than in the first state, including the panning determination, which would be stronger than/different from the first, an example of changing the angle of view.).
25. Regarding Claim 10, Kato in view of Wakamatsu teaches wherein when the position of the subject based on the first information is a position outside the angle of view of the image capturing device (paragraph 29, 126, 135, Wakamatsu teaches the subject of the imaging apparatus moving, which causes the camera to refocus based on adjusting the position of the subject image and the angle of view, which is performed by the tracking control unit, even when the subject moves away from the center of the image, the initial angle of view.), the changing unit changes the angle of view of the image capturing device such that the subject is at a position within the angle of view of the image capturing device (paragraph 126-127, 132-133, 145, Wakamatsu teaches how, when the subject moves away from the center of the image, known as the angle of view, the tracking control is performed based on the tracking correction amount calculated by the tracking amount calculation unit such that the subject returns to the center of the image.).
26. Regarding Claim 11, Kato in view of Wakamatsu teaches wherein when a distance from the image capturing device to the subject based on the first information exceeds a predetermined distance, the changing unit changes the magnitude of the angle of view (paragraph 38-39, 127, Wakamatsu teaches the adjustment to the tracking correction amount when the subject's movement exceeds a correctable amount of the correction lens of the blur prevention characteristics changing unit, and how the thrust toward the center of the control range may be changed gradually according to the magnitude of panning. The Examiner interprets this as the magnitude of the angle of view changing when a predetermined distance has been exceeded.).
27. Regarding Claim 12, Kato in view of Wakamatsu teaches an attitude detection device that detects an attitude of the image capturing device (paragraph 73, Kato teaches an attitude obtaining unit that obtains the attitude of the image capture apparatus based on three-axis acceleration in the image capture apparatus detected by the acceleration sensor unit, where an elevation angle or depression angle of the image capture device is detected based on a horizontal direction.), wherein the changing unit changes the angle of view based on the attitude detected by the attitude detection device (paragraph 89-90, Wakamatsu teaches the attitude of the lens barrel, an example of an attitude detection device, and an aspect of the blur prevention characteristics changing unit being controlled in the pitch direction and the yaw direction by the imaging mechanism. The Examiner interprets this as the changing unit changing the angle of view based on the attitude detected by the attitude detection device.).
28. Regarding Claim 15, Kato in view of Wakamatsu teaches wherein the first obtainment unit obtains information pertaining to a position of the image capturing apparatus (paragraph 71, Kato teaches a location obtaining unit that obtains the location of the image capture apparatus by the GPS unit.), and wherein the changing unit performs the at least one of (Under BRI, the Examiner interprets the claims to require only one since they state “at least one of”) changing the angle of view in a first direction or changing a magnitude of the angle of view (paragraph 36-39, 81, 106-108, 114, Wakamatsu teaches the blur prevention characteristics changing unit using panning determination data of the panning determination unit, which is information about angular velocity, blur correction angles, and tracking data, obtained from the tracking control unit, which is configured to perform tracking control of the subject based on the position information of the subject acquired by the subject detection unit, to determine if there is a need for the imaging apparatus to pan. The Examiner interprets the blur prevention characteristics changing unit determining that there is a need for the imaging apparatus to pan, and then doing so, as changing the angle of view in a first direction or changing a magnitude of the angle of view based on the first information.), based on the first information in relation to said obtained information pertaining to the position of the image capturing apparatus (paragraph 36-39, 81, 106-108, 114, Wakamatsu teaches the blur prevention characteristics changing unit using panning determination data of the panning determination unit, which is information about angular velocity, blur correction angles, and tracking data, obtained from the tracking control unit, which is configured to perform tracking control of the subject based on the position information of the subject acquired by the subject detection unit.).
29. Regarding Claim 13, Kato teaches a method of controlling an image capturing apparatus, the image capturing apparatus including an image capturing device that captures an image of a subject (Fig. 2, paragraph 46-50 and 148, Kato teaches the photography of a subject using an optical lens unit configured by a focus lens and a zoom lens, where an optoelectronic conversion device configured by a complementary metal oxide semiconductor, CMOS, forms an image with light incident through the optical lens unit.), and the method comprising: obtaining, using a first method, first information that is information pertaining to a position of the subject (paragraph 71, Kato teaches latitude, longitude, and altitude location information obtained from the first information obtaining unit.), received from a measurement unit associated with the subject (paragraph 71-74, Kato teaches a location obtaining unit that obtains location of the image capture apparatus by the GPS unit.); obtaining, using a second method different from the first method (paragraph 72, Kato teaches a direction obtaining unit as a second information obtaining unit, different from the first location obtaining unit), second information that is information pertaining to the position of the subject determined from the captured image of the subject (paragraph 72-74, Kato teaches a direction obtaining unit obtaining the direction of the image capture apparatus based on an orientation of a geomagnetic field detected by the geomagnetic sensor unit.).
Kato does not teach changing an angle of view of the image capturing device, wherein the changing includes performing at least one of changing the angle of view in a first direction or changing a magnitude of the angle of view, based on the first information, and then performing at least one of changing the angle of view in a second direction different from the first direction or changing a magnitude of the angle of view, based on the second information.
Wakamatsu is in the same field of art of image capturing control for object tracking. Further, Wakamatsu teaches changing an angle of view of the image capturing device (Fig. 8B and paragraph 106-114, Wakamatsu teaches a blur prevention characteristics changing unit that changes the blur prevention characteristics according to the determination result on the subject detection state obtained by the detection state determination unit, which tracks and changes the device capturing angle to ensure the subject is placed back in the frame when outside the frame.), wherein the changing includes performing at least one of (Under BRI, the Examiner interprets the claims to require only one since they state “at least one of”) changing the angle of view in a first direction or changing a magnitude of the angle of view, based on the first information (paragraph 36-39, 81, 106-108, 114, Wakamatsu teaches the blur prevention characteristics changing unit using panning determination data of the panning determination unit, which is information about angular velocity, blur correction angles, and tracking data, obtained from the tracking control unit, which is configured to perform tracking control of the subject based on the position information of the subject acquired by the subject detection unit, to determine if there is a need for the imaging apparatus to pan. The Examiner interprets the blur prevention characteristics changing unit determining that there is a need for the imaging apparatus to pan, and then doing so, as changing the angle of view in a first direction or changing a magnitude of the angle of view based on the first information.), and then performing at least one of (Under BRI, the Examiner interprets the claims to require only one since they state “at least one of”) changing the angle of view in a second direction different from the first direction or changing a magnitude of the angle of view, based on the second information (paragraph 66, Wakamatsu teaches the second state or second performance of the image blur correction effect, in which the effect in the second state is stronger than in the first state, including the panning determination, which would be stronger than/different from the first, an example of changing the angle of view.).
Therefore, it would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to modify the invention of Kato by incorporating the method of changing the angle of view of the imaging device with a changing unit as taught by Wakamatsu, to make an invention that can track the subject in an image frame based on object recognition features, including global positioning. One of ordinary skill in the art would be motivated to combine the references since there is a need to suppress and correct image blur of high magnitude, such as that caused by a handshake, and to perform face detection or human detection when the subject is a known person (paragraph 4, Wakamatsu).
Thus, the claimed subject matter would have been obvious to a person having ordinary skill in the art before the effective filing date.
30. Regarding Claim 16, Kato in view of Wakamatsu teaches obtaining information pertaining to a position of the image capturing apparatus (paragraph 71, Kato teaches a location obtaining unit that obtains the location of the image capture apparatus by the GPS unit.), and wherein the changing includes performing at least one of (Under BRI, the Examiner interprets the claims to require only one since they state “at least one of”) changing the angle of view in a first direction or changing a magnitude of the angle of view (paragraph 36-39, 81, 106-108, 114, Wakamatsu teaches the blur prevention characteristics changing unit using panning determination data of the panning determination unit, which is information about angular velocity, blur correction angles, and tracking data, obtained from the tracking control unit, which is configured to perform tracking control of the subject based on the position information of the subject acquired by the subject detection unit, to determine if there is a need for the imaging apparatus to pan. The Examiner interprets the blur prevention characteristics changing unit determining that there is a need for the imaging apparatus to pan, and then doing so, as changing the angle of view in a first direction or changing a magnitude of the angle of view based on the first information.), based on the first information in relation to said obtained information pertaining to the position of the image capturing apparatus (paragraph 36-39, 81, 106-108, 114, Wakamatsu teaches the blur prevention characteristics changing unit using panning determination data of the panning determination unit, which is information about angular velocity, blur correction angles, and tracking data, obtained from the tracking control unit, which is configured to perform tracking control of the subject based on the position information of the subject acquired by the subject detection unit.).
31. Regarding Claim 14, Kato teaches a non-transitory computer-readable storage medium storing a program for causing a computer to execute each of the steps of a method of controlling an image capturing apparatus (paragraph 19, Kato teaches a non-transitory computer readable storage medium for storing a program that causes a computer to implement an image capture control function, where the program is executed in the image capture apparatus having an image capture unit.), the image capturing apparatus including an image capturing device that captures an image of a subject (Fig. 2, paragraph 46-50 and 148, Kato teaches the photography of a subject using an optical lens unit configured by a focus lens and a zoom lens, where an optoelectronic conversion device configured by a complementary metal oxide semiconductor, CMOS, forms an image with light incident through the optical lens unit.), and the method comprising: obtaining, using a first method (paragraph 71, Kato teaches a location obtainment unit as a first information obtaining unit.), first information that is information pertaining to a position of the subject (paragraph 71, Kato teaches latitude, longitude, and altitude location information obtained from the first information obtaining unit.), received from a measurement unit associated with the subject (paragraph 71-74, Kato teaches a location obtaining unit that obtains the location of the image capture apparatus by the GPS unit.), and obtaining, using a second method different from the first method (paragraph 72, Kato teaches a direction obtaining unit as a second information obtaining unit, different from the first location obtaining unit.), second information that is information pertaining to the position of the subject determined from the captured image of the subject (paragraph 72-74, Kato teaches a direction obtaining unit obtaining the direction of the image capture apparatus based on an orientation of a geomagnetic field detected by the geomagnetic sensor unit.).
Kato does not teach changing an angle of view of the image capturing device, wherein the changing includes performing at least one of changing the angle of view in a first direction or changing a magnitude of the angle of view, based on the first information, and then performing at least one of changing the angle of view in a second direction different from the first direction or changing a magnitude of the angle of view, based on the second information.
Wakamatsu is in the same field of art of image capturing control for object tracking. Further, Wakamatsu teaches changing an angle of view of the image capturing device (Fig. 8B and paragraph 106-114, Wakamatsu teaches a blur prevention characteristics changing unit that changes the blur prevention characteristics according to the determination result on the subject detection state obtained by the detection state determination unit, which tracks and changes the device capturing angle to ensure the subject is placed back in the frame when outside the frame.), wherein the changing includes performing at least one of (Under BRI, the Examiner interprets the claims to require only one since they state “at least one of”) changing the angle of view in a first direction or changing a magnitude of the angle of view, based on the first information (paragraph 36-39, 81, 106-108, 114, Wakamatsu teaches the blur prevention characteristics changing unit using panning determination data of the panning determination unit, which is information about angular velocity, blur correction angles, and tracking data, obtained from the tracking control unit, which is configured to perform tracking control of the subject based on the position information of the subject acquired by the subject detection unit, to determine if there is a need for the imaging apparatus to pan. The Examiner interprets the blur prevention characteristics changing unit determining that there is a need for the imaging apparatus to pan, and then doing so, as changing the angle of view in a first direction or changing a magnitude of the angle of view based on the first information.), and then performing at least one of (Under BRI, the Examiner interprets the claims to require only one since they state “at least one of”) changing the angle of view in a second direction different from the first direction or changing a magnitude of the angle of view, based on the second information (paragraph 66, Wakamatsu teaches the second state or second performance of the image blur correction effect, in which the effect in the second state is stronger than in the first state, including the panning determination, which would be stronger than/different from the first, an example of changing the angle of view.).
Therefore, it would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to modify the invention of Kato by incorporating the system of changing the angle of view based on two different pieces of information, using a changing unit as taught by Wakamatsu, to make an invention that can track the subject in an image frame based on the global position and object recognition features. One of ordinary skill in the art would be motivated to combine the references since there is a need to suppress and correct image blur of high magnitude, such as that caused by a handshake, and to perform face detection or human detection when the subject is a known person (paragraph 4, Wakamatsu).
Thus, the claimed subject matter would have been obvious to a person having ordinary skill in the art before the effective filing date.
32. In regards to Claim 17, Kato in view of Wakamatsu teaches wherein the computer program executes an additional step of obtaining information pertaining to a position of the image capturing apparatus (paragraph 19, Kato teaches a non-transitory computer readable storage medium storing a program that causes a computer to implement an image capture control function, where the program is executed in the image capture apparatus having an image capture unit.), and wherein the changing includes performing at least one of (Under BRI, the Examiner interprets the claim to require only one since it recites "at least one of") changing the angle of view in a first direction or changing a magnitude of the angle of view (paragraphs 36-39, 81, 106-108, and 114, Wakamatsu teaches the blur prevention characteristics changing unit using panning determination data of the panning determination unit, which is information about angular velocity, blur correction angles, and tracking data obtained from the tracking control unit, which is configured to perform tracking control of the subject based on the position information of the subject acquired by the subject detection unit, to determine whether there is a need for the imaging apparatus to pan. 
The Examiner interprets the blur prevention characteristics changing unit determining whether there is a need for the imaging apparatus to pan, and then doing so, as changing the angle of view in a first direction or changing a magnitude of the angle of view based on the first information.), based on the first information in relation to said obtained information pertaining to the position of the image capturing apparatus (paragraphs 36-39, 81, 106-108, and 114, Wakamatsu teaches the blur prevention characteristics changing unit using panning determination data of the panning determination unit, which is information about angular velocity, blur correction angles, and tracking data obtained from the tracking control unit, which is configured to perform tracking control of the subject based on the position information of the subject acquired by the subject detection unit.).
33. Claims 5-6 are rejected under 35 U.S.C. 103 as being unpatentable over Kato et al. (US Patent Pub. No. 2016/0358019 A1, hereafter referred to as Kato) in view of Wakamatsu (US Patent Pub. No. 2017/0163894 A1, hereafter referred to as Wakamatsu) in further view of Higuchi (US Patent Pub. No. 2013/0083169 A1, hereafter referred to as Higuchi).
34. Regarding Claim 5, Kato in view of Wakamatsu teaches the method of Claim 1 for capturing an image of a subject using two obtainment units and a changing unit that changes the angle of view.
Kato in view of Wakamatsu does not teach wherein the first direction is a horizontal direction.
Higuchi is in the same field of art of image capturing control for object tracking. Further, Higuchi teaches wherein the first direction is a horizontal direction (Fig. 4 and paragraphs 38-41, Higuchi teaches first determining whether the parallax direction is the horizontal direction only. The Examiner interprets this as the first direction being a horizontal direction.).
[media_image4.png: greyscale image, 473 × 610]
Therefore, it would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to modify the invention of Kato and Wakamatsu by incorporating the method of determining whether the parallax direction is only the horizontal direction and whether a subsequent identification of the vertical direction is necessary, as taught by Higuchi, to make an invention that can obtain multi-viewpoint images in at least two dimensions. One of ordinary skill in the art would be motivated to combine the references since there is a need to detect a position of a point of view to generate images of left and right points of view (paragraphs 4-5, Higuchi).
Thus, the claimed subject matter would have been obvious to a person having ordinary skill in the art before the effective filing date.
35. Regarding Claim 6, Kato in view of Wakamatsu teaches the method of Claim 1 for capturing an image of a subject using two obtainment units and a changing unit that changes the angle of view.
Kato in view of Wakamatsu does not teach wherein the second direction is a vertical direction.
Higuchi is in the same field of art of image capturing control for object tracking. Further, Higuchi teaches wherein the second direction is a vertical direction (Fig. 4 and paragraphs 38-41, Higuchi teaches selecting an image capturing unit in a vertical direction if it is determined that the parallax direction is the horizontal direction only. The Examiner interprets this as the second direction being a vertical direction.).
Therefore, it would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to modify the invention of Kato and Wakamatsu by incorporating the method of determining whether a subsequent identification of the vertical direction is necessary, as taught by Higuchi, to make an invention that can obtain multi-viewpoint images in at least two dimensions. One of ordinary skill in the art would be motivated to combine the references since there is a need to detect a position of a point of view to generate images of left and right points of view (paragraphs 4-5, Higuchi).
Thus, the claimed subject matter would have been obvious to a person having ordinary skill in the art before the effective filing date.
Conclusion
36. Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
37. Any inquiry concerning this communication or earlier communications from the examiner should be directed to LOUIS NWUHA whose telephone number is (571) 272-0219. The examiner can normally be reached Monday through Friday, 8 am to 5 pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Oneal Mistry, can be reached at (313) 446-4912. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/LOUIS NWUHA/Examiner, Art Unit 2674
/ONEAL R MISTRY/Supervisory Patent Examiner, Art Unit 2674