Prosecution Insights
Last updated: April 19, 2026
Application No. 18/180,606

POSITION SPECIFICATION DEVICE, POSITION SPECIFICATION METHOD, PROGRAM, AND POSITION SPECIFICATION SYSTEM

Final Rejection — §102, §103
Filed
Mar 08, 2023
Examiner
WAMBST, DAVID ALEXANDER
Art Unit
2663
Tech Center
2600 — Communications
Assignee
Fujifilm Corporation
OA Round
2 (Final)
Grant Probability: 67% (Favorable)
OA Rounds: 3-4
To Grant: 2y 11m
With Interview: 99%

Examiner Intelligence

Career Allow Rate: 67% (above average; 18 granted / 27 resolved; +4.7% vs TC avg)
Interview Lift: +47.4% (strong; resolved cases with interview)
Avg Prosecution: 2y 11m (typical timeline; 25 currently pending)
Total Applications: 52 (across all art units)
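The card's headline numbers are mutually consistent; a minimal sketch of the arithmetic, assuming (our reading, not something the card states) that the +47.4% interview lift is a relative multiplier on the career allow rate:

```python
# Sketch of how the examiner-card figures plausibly relate.
# Assumption: the interview lift is multiplicative, not additive.

granted, resolved = 18, 27          # from the card: 18 granted / 27 resolved
allow_rate = granted / resolved     # career allow rate
interview_lift = 0.474              # +47.4% interview lift

with_interview = min(allow_rate * (1 + interview_lift), 1.0)

print(f"Career allow rate: {allow_rate:.0%}")      # 67%
print(f"With interview:    {with_interview:.0%}")  # ~98-99%, matching the card after rounding
```

Applying the lift to the rounded 67% rather than the raw 18/27 gives 98.8%, which is presumably where the card's 99% figure comes from.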

Statute-Specific Performance

§101: 4.5% (-35.5% vs TC avg)
§103: 56.6% (+16.6% vs TC avg)
§102: 21.5% (-18.5% vs TC avg)
§112: 16.1% (-23.9% vs TC avg)
Tech Center averages are estimates. Based on career data from 27 resolved cases.
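The per-statute deltas can be checked against the examiner rates; a quick sketch (values copied from the card above) that recovers the Tech Center baseline implied by each delta:

```python
# The "vs TC avg" deltas imply the baseline used for each statute:
# baseline = examiner rate - delta.

rates  = {"§101": 4.5,   "§103": 56.6, "§102": 21.5,  "§112": 16.1}
deltas = {"§101": -35.5, "§103": 16.6, "§102": -18.5, "§112": -23.9}

for statute, rate in rates.items():
    baseline = rate - deltas[statute]
    print(f"{statute}: examiner {rate:.1f}% vs TC estimate {baseline:.1f}%")

# Every statute resolves to the same 40.0% baseline, i.e. the card appears
# to benchmark against a single flat Tech Center estimate rather than
# per-statute averages.
```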

Office Action

§102 §103
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Amendment

The Amendment filed September 5, 2025 has been entered and considered. Claims 1, 7, and 10 have been amended. Claims 5 and 6 have been cancelled. New claims 12-17 have been added. In light of the amendment, the prior art rejections of claims 1 and 10 are withdrawn as moot. The new grounds of rejection set forth in the present action were necessitated by Applicant’s claim amendments. The amendment to claim 7, which has been rewritten in independent form, does not overcome the rejection under 35 U.S.C. 102 previously set forth. This rejection is maintained and restated below; accordingly, this action is made final.

Response to Arguments

Applicant’s arguments with respect to claims 1 and 10 have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.

Applicant’s arguments filed September 5, 2025 with regard to claim 7 have been fully considered but they are not persuasive. Applicant argues that the prior art does not disclose the newly added limitations of the newly independent claim 7 (Remarks of 9/5/25 at Pg. 8). Applicant argues (Pg. 8): “For example, Applicant submits that Sasaki does not teach or suggest an ability to move the position reference moving object to a position within an angle of view of the camera in a case in which the position reference moving object is not present within the angle of view of the camera.”

[Image: media_image1.png]

Examiner responds: Sasaki teaches that the ground control point UAV (position reference moving object) moves to the photographing area of the photographing UAV (Pg. 6). This indicates that the ground control point UAV is not already present within an angle of view of the camera.
They further disclose that when the ground control point UAV is not within a photographing area of the photographing UAV, but there is still a subject to photograph, the ground control point UAV will move to the next shooting range (Fig. 6, reprinted below, machine translated; the ground control point UAV is explicitly stated as being moved to a shooting range (S107), which necessarily indicates that the ground control point UAV is not already within the shooting range (or angle of view) of the camera until it is moved).

Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

Claims 7-8 and 12-17 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Sasaki (previously cited).

Regarding claim 7, Sasaki teaches a position specification device comprising: a memory that stores a command to be executed by a processor; and the processor that executes the command stored in the memory (Pg. 2, “The UAV for photography 100 includes a camera 101, a GNSS position specifying device (GNSS receiver) 102 using GNSS, an IMU (inertial measurement device) 103, an altimeter 104, a control device 105, a storage device 106, and a communication device 107.”; Pg. 4, “Each functional unit of the control device 105 shown in FIG. 4 includes, for example, a CPU”), wherein the processor acquires an image of a ground surface (Pg. 2, “The camera 101 performs aerial photography during flight. In this example, the camera 101 performs shooting on the ground.”) including a position reference moving object including a visual identifier (Pg. 4, “The ground control point UAV 200 is used as a ground control point when the UAV 100 for photographing is photographed… The control point display 201 includes a display that can be distinguished from the control point display of the other control point UAV 200. The control point display 201 is arranged above the control point UAV 200 so that it can be easily seen when looking down the control point UAV 200 from above.”), the image being captured by a camera provided in an imaging flying object that flies over the sky (Pg. 2, “The camera 101 performs aerial photography during flight. In this example, the camera 101 performs shooting on the ground.”), detects the identifier from the image, acquires position information of the position reference moving object during capturing of the image (Pg. 4, “By identifying a plurality of control point displays 201 appearing in the image, information on the control points and their positions are linked, and the relationship between the plurality of control points in the image and their positions in the map coordinate system is specified.”), and specifies a position of the ground surface in the image from the detected identifier and the position information (Pg. 4, “By identifying a plurality of control point displays 201 appearing in the image, information on the control points and their positions are linked, and the relationship between the plurality of control points in the image and their positions in the map coordinate system is specified. This is the same as the case of using an orientation target (anti-air sign) installed on a normal ground.”; the positions are determined as being the same as if the target were installed on normal ground, necessitating that the ground position is known), wherein the processor moves the position reference moving object to a position within an angle of view of the camera in a case in which the position reference moving object is not present within the angle of view of the camera (Fig. 6, reprinted below, machine translated, showcases that the control point UAV is moved into a shooting range (S107), necessarily indicating that the control point UAV is not already within the shooting range (or angle of view) of the camera until it is moved; this step is continuously repeated, further indicating that the control point UAV is not always within the angle of view of the camera and needs to be moved to a shooting range in order to be photographed).

[Image: media_image1.png (Fig. 6 of Sasaki, machine translated)]

Regarding claim 8, Sasaki teaches all of the elements of claim 7, as stated above, as well as wherein the processor moves the position reference moving object (Pg. 10, “Next, it is determined whether or not the setting as a control point installation machine is given in each control point UAV 200 (step S408). Then, the orientation point UAV 200 to which the setting as the orientation point installation machine is moved to the imaging range where the orientation point is installed (step S409).”; a UAV is moved into the imaging range), which has a smallest number of times the position information is acquired among a plurality of the position reference moving objects, to the position within the angle of view of the camera (Pg. 11, “Note that when the photographing end signal is received, the setting as the orientation point installation body is canceled, and the orientation point installation body can be used in other imaging ranges (step S412). The control point UAV200 whose setting as the control point installation machine body is canceled in step S412, and the control point UAV200 that is not selected as the control point installation machine body in step S407, that is, the setting as the control point installation machine body is given. If there is a next shooting range, the unfixed control point UAV 200 repeats the processes in and after step S407, and if there is no next shooting range, the process ends (step S413).”; once a UAV is designated as the “control point”, it is moved into the imaging range. This process is repeated, setting UAVs which have not been used as the control point to be moved for imaging, necessarily indicating a “smallest number” of times the position is acquired among a plurality of position reference moving objects, as the role is transferred to a UAV which has 0 acquired images).

Regarding claim 12, Sasaki teaches all of the elements of claim 7, as stated above, as well as wherein the identifier includes a color defined for each position reference moving object (Pg. 4, “As the orientation point display 201, a color code target that displays a code with a combination of colors and figures can be used.”).

Regarding claim 13, Sasaki teaches all of the elements of claim 7, as stated above, as well as wherein the identifier includes a figure defined for each position reference moving object (Pg. 4, “In addition, numbers, characters, two-dimensional barcodes, patterns, combinations of various figures, colors, and the like can be used as the orientation point display 201.”).

Regarding claim 14, Sasaki teaches all of the elements of claim 7, as stated above, as well as wherein the identifier includes a two-dimensional barcode in which the position information is encoded (Pg. 4, “In addition, numbers, characters, two-dimensional barcodes, patterns, combinations of various figures, colors, and the like can be used as the orientation point display 201.”).

Regarding claim 15, Sasaki teaches all of the elements of claim 7, as stated above, as well as wherein the position reference moving object is a flying object that flies at an altitude lower than an altitude of the imaging flying object (Pg. 2, “In this example, the photography UAV 100 flies further above where the ground control point UAV 200 flies, and photographs an area including the ground control point UAV 200 from above.”), and the position information includes altitude information (Pg. 3, “The altimeter 104 measures the atmospheric pressure and measures the altitude of the UAV 100 for photographing.”).

Regarding claim 16, the recited method performs virtually the same function as that of claim 7. It is rejected under the same analysis.

Regarding claim 17, the recited non-transitory, computer-readable tangible recording medium (Pg. 2, “The UAV for photography 100 includes… a control device 105, a storage device 106.”; Pg. 4, “Each functional unit of the control device 105 shown in FIG. 4 includes, for example, a CPU”; a CPU and storage device are disclosed, which is taken to mean a computer-readable medium as memory) performs virtually the same function as that of claim 7. It is rejected under the same analysis.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1-4 and 9-11 are rejected under 35 U.S.C. 103 as being unpatentable over Sasaki in view of Jinsen et al. (Japanese Patent Publication No. 2005207862A, foreign translation provided).
Regarding claim 1, Sasaki teaches a position specification device comprising: a memory that stores a command to be executed by a processor; and the processor that executes the command stored in the memory (Pg. 2, “The UAV for photography 100 includes a camera 101, a GNSS position specifying device (GNSS receiver) 102 using GNSS, an IMU (inertial measurement device) 103, an altimeter 104, a control device 105, a storage device 106, and a communication device 107.”; Pg. 4, “Each functional unit of the control device 105 shown in FIG. 4 includes, for example, a CPU”), wherein the processor acquires an image of a ground surface (Pg. 2, “The camera 101 performs aerial photography during flight. In this example, the camera 101 performs shooting on the ground.”) including a position reference moving object including a visual identifier (Pg. 4, “The ground control point UAV 200 is used as a ground control point when the UAV 100 for photographing is photographed… The control point display 201 includes a display that can be distinguished from the control point display of the other control point UAV 200. The control point display 201 is arranged above the control point UAV 200 so that it can be easily seen when looking down the control point UAV 200 from above.”), the image being captured by a camera provided in an imaging flying object that flies over the sky (Pg. 2, “The camera 101 performs aerial photography during flight. In this example, the camera 101 performs shooting on the ground.”), detects the identifier from the image, acquires position information of the position reference moving object during capturing of the image (Pg. 4, “By identifying a plurality of control point displays 201 appearing in the image, information on the control points and their positions are linked, and the relationship between the plurality of control points in the image and their positions in the map coordinate system is specified.”), and specifies a position of the ground surface in the image from the detected identifier and the position information (Pg. 4, “By identifying a plurality of control point displays 201 appearing in the image, information on the control points and their positions are linked, and the relationship between the plurality of control points in the image and their positions in the map coordinate system is specified. This is the same as the case of using an orientation target (anti-air sign) installed on a normal ground.”; the positions are determined as being the same as if the target were installed on normal ground, necessitating that the ground position is known), wherein the position information includes altitude information (Pg. 4, “The control point UAV 200 includes… an altimeter 204”), and the processor acquires elevation angle information of the camera during capturing of the image (Pg. 2, “The UAV for photography 100 includes… an IMU (inertial measurement device) 103”; Pg. 3, “Further, information regarding the attitude of the UAV 100 for taking a picture during flight can be obtained from the output of the IMU 103.”; the pitch component of the attitude information provides the elevation angle of the camera during image capture), and specifies a position of the ground surface immediately below the position reference moving object in the image (Pg. 4, “By identifying a plurality of control point displays 201 appearing in the image, information on the control points and their positions are linked, and the relationship between the plurality of control points in the image and their positions in the map coordinate system is specified.”).

Sasaki does not explicitly disclose specifying a position of the ground surface immediately below the position reference moving object in the image based on the altitude information and the elevation angle information. However, they do specify a position of the ground surface immediately below the position reference moving object (Pg. 4). Jinsen teaches specifying a position of the ground surface of the position reference moving object in the image based on the altitude information and the elevation angle information (Pg. 7, “Thereafter, the control device 24 of the ground device 20 transmits the altitude information and position information of the drone 10 transmitted from the drone 10, the visual axis direction information of the image sensor 13, and the altitude of the terrain acquired in the terrain information acquisition step. Based on the information, position information of the vehicle A is calculated (target position information calculation step). Through the above process group, position information (latitude, longitude, altitude) of the vehicle A can be acquired.”).

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Sasaki to incorporate the teachings of Jinsen to include specifying a position of the ground surface immediately below the position reference moving object in the image based on the altitude information and the elevation angle information. Sasaki teaches acquiring altitude information and elevation angle information, as well as specifying a position of the ground surface immediately below the position reference moving object, but does not explicitly disclose performing this calculation based on the altitude information and the elevation angle. Jinsen provides a method for calculating the position of a vehicle that is being captured by a drone utilizing the visual axis direction information (elevation angle) of the image sensor and the altitude of the terrain on which the vehicle sits. One of ordinary skill in the art would recognize that substituting the altitude information of the terrain as used in Jinsen with the known altitude information of the control point UAV from Sasaki would be a routine swap of known elements. Including the method of Jinsen in the system of Sasaki allows the positioning of the control point UAV to be validated, providing critical redundancy and improving the overall system robustness and reliability.

Regarding claim 2, Sasaki as modified above teaches all of the elements of claim 1, as stated above, as well as wherein the identifier includes a color defined for each position reference moving object (Pg. 4, “As the orientation point display 201, a color code target that displays a code with a combination of colors and figures can be used.”).

Regarding claim 3, Sasaki as modified above teaches all of the elements of claim 1, as stated above, as well as wherein the identifier includes a figure defined for each position reference moving object (Pg. 4, “In addition, numbers, characters, two-dimensional barcodes, patterns, combinations of various figures, colors, and the like can be used as the orientation point display 201.”).

Regarding claim 4, Sasaki as modified above teaches all of the elements of claim 1, as stated above, as well as wherein the identifier includes a two-dimensional barcode in which the position information is encoded (Pg. 4, “In addition, numbers, characters, two-dimensional barcodes, patterns, combinations of various figures, colors, and the like can be used as the orientation point display 201.”).

Regarding claim 9, Sasaki as modified above teaches all of the elements of claim 1, as stated above, as well as the position reference moving object (Pg. 4, “The ground control point UAV 200 is used as a ground control point when the UAV 100 for photographing is photographed.”); and the imaging flying object including the camera (Pg. 2, “The photography UAV 100 includes a camera 101”).

Regarding claim 10, the recited method performs virtually the same function as the device of claim 1. It is rejected under the same analysis.

Regarding claim 11, the recited non-transitory computer-readable tangible recording medium (Pg. 2, “The UAV for photography 100 includes… a control device 105, a storage device 106.”; Pg. 4, “Each functional unit of the control device 105 shown in FIG. 4 includes, for example, a CPU”; a CPU and storage device are disclosed, which is taken to mean a computer-readable medium as memory) performs virtually the same function as that of claim 1. It is rejected under the same analysis.

Conclusion

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to DAVID A WAMBST whose telephone number is (703)756-1750.
The examiner can normally be reached M-F 9-6:30 EST. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Gregory Morse, can be reached at (571)272-3838. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/DAVID ALEXANDER WAMBST/
Examiner, Art Unit 2663

/GREGORY A MORSE/
Supervisory Patent Examiner, Art Unit 2698
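The §103 combination turns on a simple ranging geometry: given the camera's altitude, the target's altitude, and the camera's depression angle, the ground point directly below the target can be located by trigonometry. A hedged sketch of that relation (the function name and the flat-ground, straight-line-of-sight assumptions are ours; neither Sasaki nor Jinsen states this exact formula):

```python
import math

def ground_offset(camera_altitude_m: float, target_altitude_m: float,
                  depression_angle_deg: float) -> float:
    """Horizontal distance from the point under the camera to the point
    directly below the target, assuming flat ground and an unobstructed
    straight line of sight (a simplification for illustration only)."""
    height = camera_altitude_m - target_altitude_m
    return height / math.tan(math.radians(depression_angle_deg))

# Illustrative numbers (not from either reference): imaging UAV at 120 m,
# control-point UAV at 30 m, 45-degree depression angle -> the ground point
# below the control-point UAV is about 90 m away horizontally.
print(ground_offset(120.0, 30.0, 45.0))
```

This is the sense in which substituting Sasaki's known control-point-UAV altitude for Jinsen's terrain altitude is a like-for-like swap: either value fixes the height term in the same triangle.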

Prosecution Timeline

Mar 08, 2023
Application Filed
May 30, 2025
Non-Final Rejection — §102, §103
Sep 05, 2025
Response Filed
Nov 24, 2025
Final Rejection — §102, §103 (current)
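The reply deadlines in the final action follow mechanically from the Nov 24, 2025 mailing date: a three-month shortened statutory period, extendable under 37 CFR 1.136(a) up to the six-month statutory maximum. A small sketch of that date arithmetic (the helper name is ours):

```python
from datetime import date

def add_months(d: date, months: int) -> date:
    """Same day-of-month `months` later (sufficient for the 24th; a general
    implementation would clamp to the target month's length)."""
    m = d.month - 1 + months
    return date(d.year + m // 12, m % 12 + 1, d.day)

mailed = date(2025, 11, 24)        # Final Rejection mailing date
shortened = add_months(mailed, 3)  # 3-month shortened statutory period
statutory = add_months(mailed, 6)  # absolute 6-month statutory cutoff

print(shortened, statutory)        # 2026-02-24 2026-05-24
```

A reply filed within two months of mailing (by Jan 24, 2026) preserves the advisory-action safe harbor described in the action's conclusion.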

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12597278
IMAGE AUTHENTICITY DETECTION METHOD AND DEVICE, COMPUTER DEVICE, AND STORAGE MEDIUM
Granted Apr 07, 2026 (2y 5m to grant)
Patent 12524892
SYSTEMS AND METHODS FOR IMAGE REGISTRATION
Granted Jan 13, 2026 (2y 5m to grant)
Patent 12437437
DIFFUSION MODELS HAVING CONTINUOUS SCALING THROUGH PATCH-WISE IMAGE GENERATION
Granted Oct 07, 2025 (2y 5m to grant)
Patent 12423783
DIFFERENTLY CORRECTING IMAGES FOR DIFFERENT EYES
Granted Sep 23, 2025 (2y 5m to grant)
Patent 12380566
METHOD OF SEPARATING TERRAIN MODEL AND OBJECT MODEL FROM THREE-DIMENSIONAL INTEGRATED MODEL AND APPARATUS FOR PERFORMING THE SAME
Granted Aug 05, 2025 (2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.

Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 67%
With Interview (+47.4%): 99%
Median Time to Grant: 2y 11m
PTA Risk: Moderate
Based on 27 resolved cases by this examiner. Grant probability derived from career allow rate.
