DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Arguments
Applicant's arguments with respect to claims 1-7 and 9-14 have been considered but are moot in view of the new ground(s) of rejection.
It is noted that the amendments change the scope of the claim.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-3 and 9-14 are rejected under 35 U.S.C. 103 as being unpatentable over Watanabe (U.S. PG Publication No. 2012/0268608) in view of Ishii et al. (“Ishii”) (U.S. PG Publication No. 2019/0191098).
In regards to claim 1, Watanabe describes an imaging system, as described in ¶0030-0034 in view of FIGS. 3 and 4, with two modes of operation: a normal mode in which a user may use a control device with a control interface to pan or tilt a camera platform, and a tracking mode in which an object is tracked and the target is set to the center coordinates of imaging. Watanabe, however, shows this occurring with only one target at a time. In a similar endeavor, Ishii teaches, as seen in ¶0003, an imaging device which may track a target object from images captured by a camera, while ¶0055-0057 teaches that a plurality of objects may indeed be tracked, and that in order to do so a determination must be made by creating an order of priority, as further described in at least ¶0064 and 0130. As such, the system must select any one of the objects for tracking. Furthermore, in order to track an object within a three-dimensional space, the three-dimensional position of the object must be stored and calculated, as described in at least ¶0038 and 0064, with ¶0215-0216 specifying that memory is used to store the object position information along with the priority information and other information. It would have been obvious to one of ordinary skill in the art to incorporate the teachings of Ishii into Watanabe because doing so allows for the tracking of more than one object, such that the system may choose to track any one of a plurality of objects, with position information stored for reference during said tracking, thereby improving the efficiency of tracking.
Therefore, together Watanabe and Ishii teach a control apparatus (See ¶0024 in view of FIG. 4 of Watanabe) for a system configured to transition between a first state for changing an angle of view of an image pickup apparatus according to an operation of a user and a second state for changing the angle of view so that any of objects is displayed at a target position on an image (See ¶0030 and 0033-0034 in view of FIG. 3 of Watanabe, wherein the system may operate in a normal mode by which a user may control the panning or tilting of the camera, and a tracking mode, the switch between modes being executed through the use of a mode switching button; this is taken in view of ¶0003 and 0055-0057 of Ishii), the control apparatus comprising:
a memory storing instructions and recording positions of a plurality of objects (See ¶0025 in view of FIG. 1 of Watanabe; this is taken in view of ¶0038, 0064 and 0215-0216 of Ishii); and
a processor configured to execute the instructions (See ¶0025 in view of FIG. 1 of Watanabe) to:
acquire a position of the object on the image (See at least ¶0027 and 0033-0034 of Watanabe), and
determine the target position based on the position of the object on the image in a case where the system transitions from the first state to the second state (See ¶0033-0034 in view of FIG. 1-3 of Watanabe).
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teaching of Ishii into Watanabe because it allows for the potential tracking of multiple objects, with their corresponding positions stored in memory as described in at least ¶0038, 0064 and 0215-0216 of Ishii, thus providing an improvement in tracking since positional information may be used in the tracking.
In regards to claim 2, Watanabe teaches the control apparatus according to claim 1, wherein the processor is configured to determine as the target position the position of the object on the image when the system transitions from the first state to the second state (See ¶0033-0034 in view of FIG. 1-3 wherein part of the transition from the normal mode to the tracking mode is the setup of the output location for the object on the image).
In regards to claim 3, Watanabe teaches the control apparatus according to claim 1, wherein the processor is configured to determine a center position of the image as the target position in a case where the object is moving when the system transitions from the first state to the second state, and to determine as the target position the position of the object on the image when the system transitions from the first state to the second state in a case where the object is stationary (See ¶0033-0034 in view of FIG. 1-3, wherein the system determines the target position via user input, which thus may include a center position, regardless of whether the object is stationary or moving, as the system is enabled to do either regardless of object status; Applicant is informed that such a rejection may be made because the claim recites “comprising” and thus allows for such an interpretation).
In regards to claim 9, Watanabe teaches the control apparatus of claim 1, wherein the processor is configured to acquire the image (See ¶0025 in view of FIG. 1).
In regards to claim 10, Watanabe teaches the control apparatus according to claim 1, wherein the processor is configured to detect the position of the object using the image (See at least ¶0027 in view of FIG. 1).
In regards to claim 11, the claim is rejected under the same basis as claim 1 by Watanabe in view of Ishii.
In regards to claim 12, Watanabe and Ishii teach a system comprising:
an image pickup apparatus including an imaging unit configured to capture an object (See ¶0033-0034 in view of FIG. 1-3 of Watanabe);
an operation apparatus configured to operate the image pickup apparatus (See ¶0025 in view of FIG. 1 of Watanabe); and
the control apparatus according to claim 1 (See the rejection of claim 1).
In regards to claim 13, the claim is rejected under the same basis as claim 1 by Watanabe in view of Ishii.
In regards to claim 14, the claim is rejected under the same basis as claim 13 by Watanabe in view of Ishii wherein the computer-readable medium is taught as seen in ¶0025-0027, 0047-0049 and FIG. 1.
Claim 4 is rejected under 35 U.S.C. 103 as being unpatentable over Watanabe (U.S. PG Publication No. 2012/0268608) in view of Ishii et al. (“Ishii”) (U.S. PG Publication No. 2019/0191098) and Kinoshita (U.S. PG Publication No. 2015/0373414).
In regards to claim 4, Watanabe fails to teach the control apparatus according to claim 1, wherein in the second state, the angle of view is changed so that a size of the object in the image approaches a target size, wherein the processor is configured to acquire the size of the object in the image, and wherein the processor is configured to determine as the target size the size of the object in the image when the system transitions from the first state to the second state.
In a similar endeavor, Kinoshita teaches wherein in the second state, the angle of view is changed so that a size of the object in the image approaches a target size, wherein the processor is configured to acquire the size of the object in the image, and wherein the processor is configured to determine as the target size the size of the object in the image when the system transitions from the first state to the second state (See ¶0325, wherein when the system is placed into a tracking process [thus transitioning into this process as incorporated by Watanabe’s teaching], the system may perform automatic zooming such that the object is within a predetermined size, and such zooming may also require a corresponding adjustment of the angle of view based on the change in distance).
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teaching of Kinoshita into Watanabe because it allows for the proper adjustment of zoom and angle of view in order to appropriately track a selected object, as seen in ¶0325, thus improving the accuracy of the system.
Claims 5, 6 and 8 are rejected under 35 U.S.C. 103 as being unpatentable over Watanabe (U.S. PG Publication No. 2012/0268608) in view of Ishii et al. (“Ishii”) (U.S. PG Publication No. 2019/0191098) and Fukuda et al. (“Fukuda”) (U.S. PG Publication No. 2021/0370178).
In regards to claim 5, Watanabe fails to teach the control apparatus according to claim 1, wherein the processor is configured to determine as the target position one of a plurality of positions of the object acquired and recorded before the system transitions from the first state to the second state.
In a similar endeavor, Fukuda teaches wherein the processor is configured to determine as the target position one of a plurality of positions of the object acquired and recorded before the system transitions from the first state to the second state (See ¶0014 and 0138, wherein the position and orientation of an object are tracked through time [up to a predetermined amount of time from the current time]; this is taken in view of Watanabe’s teachings in ¶0033-0034 with regard to the position of the object at the current time upon a switch of states).
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teaching of Fukuda into Watanabe because it allows for continual updates of the arrangement information of objects, as described in ¶0138, at a desired timing, thus allowing for continual recording of important tracking information as needed.
In regards to claim 6, Watanabe fails to teach the control apparatus according to claim 5, wherein the processor is configured to determine as the target position the position of the object that was recorded a predetermined time before the system transitions from the first state to the second state.
In a similar endeavor, Fukuda teaches wherein the processor is configured to determine as the target position the position of the object that was recorded a predetermined time before the system transitions from the first state to the second state (See ¶0014 and 0138, wherein the position and orientation of an object are tracked through time [up to a predetermined amount of time from the current time]; this is taken in view of Watanabe’s teachings in ¶0033-0034 with regard to the position of the object at the current time upon a switch of states).
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teaching of Fukuda into Watanabe because it allows for continual updates of the arrangement information of objects, as described in ¶0138, at a desired timing, thus allowing for continual recording of important tracking information as needed.
In regards to claim 8, Watanabe fails to teach the control apparatus according to claim 5, wherein the memory records positions of the plurality of objects.
In a similar endeavor, Fukuda teaches wherein the memory records positions of the plurality of objects (See ¶0138).
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teaching of Fukuda into Watanabe because it allows for continual updates of the arrangement information of objects, as described in ¶0138, at a desired timing, thus allowing for continual recording of important tracking information as needed.
Claim 7 is rejected under 35 U.S.C. 103 as being unpatentable over Watanabe (U.S. PG Publication No. 2012/0268608) in view of Ishii et al. (“Ishii”) (U.S. PG Publication No. 2019/0191098) and Fukuda et al. (“Fukuda”) (U.S. PG Publication No. 2021/0370178), in further view of Suzuki (U.S. PG Publication No. 2021/0400196).
In regards to claim 7, Watanabe fails to teach the control apparatus according to claim 6, wherein the predetermined time is based on a difference between a time when the system is instructed to transition from the first state to the second state, and a time when the system transitions from the first state to the second state.
In a similar endeavor, Suzuki teaches wherein the predetermined time is based on a difference between a time when the system is instructed to transition from the first state to the second state, and a time when the system transitions from the first state to the second state (See ¶0041 in view of FIG. 3-5, wherein during the automatic tracking mode transition function, detection frames are generated based on the detected moving objects, and among the superimposed detection frames, a detection frame specified by an operation may be set as the tracking frame for the tracking target; thus the selection is recorded in memory even while the transition to another state may take additional time; this is taken in view of Watanabe’s teachings).
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teaching of Suzuki into Watanabe because it allows for the identification of the specific frame that was selected, rather than the frame at which the system is finally enabled to switch due to processing time, thus enabling a more accurate system.
Conclusion
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to EDEMIO NAVAS JR whose telephone number is (571) 270-1067. The examiner can normally be reached Monday through Friday, 9 AM to 6 PM.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Joseph Ustaris, can be reached at 571-272-7383. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
EDEMIO NAVAS JR
Primary Examiner
Art Unit 2483
/EDEMIO NAVAS JR/Primary Examiner, Art Unit 2483