Prosecution Insights
Last updated: April 19, 2026
Application No. 18/609,323

INFORMATION PROCESSING APPARATUS AND INFORMATION PROCESSING METHOD

Non-Final OA §103
Filed: Mar 19, 2024
Examiner: BARNES JR, CARL E
Art Unit: 2178
Tech Center: 2100 — Computer Architecture & Software
Assignee: Ricoh Company Ltd.
OA Round: 1 (Non-Final)
Grant Probability: 32% (At Risk)
Expected OA Rounds: 1-2
Time to Grant: 4y 4m
With Interview: 57%

Examiner Intelligence

Grants only 32% of cases.
Career Allow Rate: 32% (65 granted / 202 resolved; -22.8% vs TC avg)
Interview Lift: +25.2% on resolved cases with interview (strong lift)
Typical Timeline: 4y 4m average prosecution; 32 applications currently pending
Career History: 234 total applications across all art units
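The headline figures above are simple ratios of the examiner's career data. A minimal sketch of how they may be derived, assuming (this is my assumption, not a published methodology) that the with-interview figure is the career allow rate plus the interview lift in percentage points, rounded to the nearest whole percent:

```python
# Hypothetical reconstruction of the dashboard's headline figures.
# The additive +25.2 pp interview lift is an assumption about how the
# "With Interview" number is computed, not a stated formula.

def allow_rate_pct(granted: int, resolved: int) -> float:
    """Career allow rate as a percentage of resolved cases."""
    return 100.0 * granted / resolved

base = allow_rate_pct(65, 202)   # 65 granted of 202 resolved cases
with_interview = base + 25.2     # interview lift, in percentage points

print(f"Career allow rate: {round(base)}%")            # 32%
print(f"With interview:    {round(with_interview)}%")  # 57%
```

This reproduces the 32% and 57% shown on the page, which suggests the dashboard is applying the lift additively rather than as a multiplier.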

Statute-Specific Performance

§101: 14.3% (-25.7% vs TC avg)
§103: 62.6% (+22.6% vs TC avg)
§102: 9.0% (-31.0% vs TC avg)
§112: 8.7% (-31.3% vs TC avg)
Black line = Tech Center average estimate • Based on career data from 202 resolved cases
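A quick cross-check on the chart data above: subtracting each "vs TC avg" delta from the examiner's rate should recover the Tech Center average estimate. The reading of the delta as (examiner rate minus TC average) is my assumption about what the chart measures:

```python
# Cross-check of the statute-specific figures: recover the implied Tech
# Center average from each examiner rate and its delta. Assumes the
# "vs TC avg" delta means (examiner rate - TC average), in percentage points.

rates = {
    "101": (14.3, -25.7),
    "103": (62.6, +22.6),
    "102": (9.0, -31.0),
    "112": (8.7, -31.3),
}

for statute, (examiner_pct, delta_pp) in rates.items():
    tc_avg = examiner_pct - delta_pp
    print(f"§{statute}: implied TC average = {tc_avg:.1f}%")
```

Every row implies the same ~40% baseline, consistent with the four deltas being measured against a single Tech Center average estimate (the black line).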

Office Action

§103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Priority

Acknowledgment is made of applicant’s claim for foreign priority under 35 U.S.C. 119(a)-(d). The certified copy has been filed.

Information Disclosure Statement

The information disclosure statements (IDS) submitted on 03/19/2024 and 01/08/2025 were filed. The submissions are in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statements are being considered by the examiner.

Drawings

The drawings are objected to as failing to comply with 37 CFR 1.84(p)(5) because they include the following reference characters not mentioned in the description: Fig. 26 elements 10a, 10b, 10x, 900a, 900b, 900x, 700a, 700b, 700x, 800a, 800b, 800x are not recited in the disclosure filed on 03/19/2024. Corrected drawing sheets in compliance with 37 CFR 1.121(d), or amendment to the specification to add the reference characters to the description in compliance with 37 CFR 1.121(b), are required in reply to the Office action to avoid abandonment of the application. Any amended replacement drawing sheet should include all of the figures appearing on the immediate prior version of the sheet, even if only one figure is being amended. Each drawing sheet submitted after the filing date of an application must be labeled in the top margin as either “Replacement Sheet” or “New Sheet” pursuant to 37 CFR 1.121(d). If the changes are not accepted by the examiner, the applicant will be notified and informed of any required corrective action in the next Office action. The objection to the drawings will not be held in abeyance.

Specification

The title of the invention is not descriptive. A new title is required that is clearly indicative of the invention to which the claims are directed.
Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

This application currently names joint inventors. In considering patentability of the claims, the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.

Claims 1-7 are rejected under 35 U.S.C. 103 as being unpatentable over OKAMOTO (WO 2022180453 A1, filed Jan. 13, 2022) in view of Cosby (US 20220318723 A1, filed Aug. 21, 2019).

Regarding independent claim 1, OKAMOTO teaches:

An information processing method for acquiring location information of an object to be tracked, performed by an information processing apparatus, the method comprising: (OKAMOTO − [pdf page 8, 0030, 0192] The on-premises server 50 according to the present embodiment serves as an information processing apparatus…, process the information about the position of the objects 30 carried by the forklifts 10.)

acquiring location information of the object based on location management information in which identification information of the object, the location information of the object, (OKAMOTO − [pdf page 20, 0147, 0155-0158] Fig. 13A and 13B, obtaining the information of the positions of the objects 30; [0156] The location map 61 as illustrated in Fig. 13A may be said to indicate the information about the positions of the object 30.)

and time information indicating a time at which the location information of the object is acquired are associated with each other; (OKAMOTO − [pdf page 20, 0147] Fig. 13A and 13B, obtaining the information of the positions of the objects 30; [0155-0158] A position table 63 as illustrated in FIG. 13B is a table including the three-dimensional coordinates of a point group indicating the positions of the objects 30 and a time stamp 64 indicating the times at which the three-dimensional coordinates are obtained.)

acquiring the time information associated with the acquired location information of the object as time information indicating a time [before movement] of the object starts, based on the location management information; (OKAMOTO − [pdf page 10, 0052] The operation to start obtaining the information about the positions of the objects 30 is performed by a user such as an administrator, using, for example, the pointing device 512 as illustrated in FIG. 3. [pdf page 28, 0212] In FIG. 23, a fast-forward key 165 and a scroll bar 166 are illustrated. A function to keep track of the location may be provided.)

tracking, by image processing (fixed camera 60), the object in a captured moving image of surroundings of the object based on the location information of the object, (OKAMOTO − [pdf page 26, 0198] The on-ceiling object position acquisition unit 501 according to the present embodiment obtains the information about the positions of the objects 30 conveyed by means other than the forklift 10, based on the images captured by the fixed camera 60, and sends the information about the position of the objects to the on-premises server 50a through the transmitter 502. The information about the positions of the objects 30 that is obtained by the on-ceiling object position acquisition unit 501 based on the images captured by the fixed camera 60 is an example of the second position information.)

wherein the tracking includes tracking the object in a part of the captured moving image starting from a moving image time corresponding to the acquired time information; (OKAMOTO – [pdf pages 10-11, 0056-0058] the time acquisition unit 55 obtains the data indicating the time at which the receiver 51 received the spherical image and identification data, and outputs the data indicating the time to the object position acquisition unit 56; [pdf page 26, 0198] The on-ceiling object position acquisition unit 501 according to the present embodiment obtains the information about the positions of the objects 30 conveyed by means other than the forklift 10, based on the images captured by the fixed camera 60, and sends the information about the position of the objects to the on-premises server 50a through the transmitter 502. The information about the positions of the objects 30 that is obtained by the on-ceiling object position acquisition unit 501 based on the images captured by the fixed camera 60 is an example of the second position information.)

and acquiring current location information of the object indicating a location of the object at an end of the movement based on a result of the tracking. (OKAMOTO – [pdf page 28] In FIG. 23, a fast-forward key 165 and a scroll bar 166 are illustrated. A function to keep track of the location may be provided. The object 30 of a particular ID number to be searched for was lastly placed at a particular location.)
OKAMOTO does not explicitly recite: a time before movement of the object starts. However, Cosby teaches: acquiring the time information associated with the acquired location information of the object as time information indicating a time before movement of the object starts, (Cosby – [0022] the first loading event may be identified as the point in time when the inventory item initially begins to move as a unit with the movement of the forklift.)

Accordingly, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have combined the teachings of OKAMOTO and Cosby, as each invention relates to using imaging devices for tracking objects. One of ordinary skill in the art would have been motivated to make the modification to reduce labor-intensive manual inventory processing and to track positions automatically.

Regarding independent claim 2, OKAMOTO teaches:

An information processing method for acquiring location information of an object to be tracked, performed by an information processing apparatus, the method comprising: (OKAMOTO − [pdf page 8, 0030, 0192] The on-premises server 50 according to the present embodiment serves as an information processing apparatus…, process the information about the position of the objects 30 carried by the forklifts 10.)

acquiring location information indicating a pre-movement position where the object has been located before movement of the object starts, based on location management information in which identification information of the object, location information of the object, first time information indicating a time before the movement of the object starts, (OKAMOTO – [0192, pdf page 25] In addition to the forklift 10, a pallet carrying machine called a pallet jack or pallet truck that can be easily handled by a person may be used in warehouses. In the present embodiment, even when the positions of the objects 30 are changed by a pallet jack with which the spherical camera 20 is not provided, the positions of the objects 30 can be recognized and tracked based on the images captured by the fixed camera 60. [pdf page 0155-0158] Fig. 13A and 13B, obtaining the information of the positions of the objects 30; [0156] The location map 61 as illustrated in Fig. 13A may be said to indicate the information about the positions of the object 30.) Examiner note: pre-movement is manual pallet jack before forklift movement.

and second time information indicating a time at which the movement of the object ends are associated with each other; (OKAMOTO − [pdf page 10, 0052] The operation to start obtaining the information about the positions of the objects 30 is performed by a user such as an administrator, using, for example, the pointing device 512 as illustrated in FIG. 3. [pdf page 28, 0212] In FIG. 23, a fast-forward key 165 and a scroll bar 166 are illustrated. A function to keep track of the location may be provided.)

acquiring the first time information based on the location management information; (OKAMOTO − [pdf page 10, 0052] The operation to start obtaining the information about the positions of the objects 30 is performed by a user such as an administrator, using, for example, the pointing device 512 as illustrated in FIG. 3. [pdf page 28, 0212] In FIG. 23, a fast-forward key 165 and a scroll bar 166 are illustrated. A function to keep track of the location may be provided.)
tracking, by image processing, the object in a captured moving image of surroundings of the object based on the location information of the object, (OKAMOTO − [pdf page 26, 0198] The on-ceiling object position acquisition unit 501 according to the present embodiment obtains the information about the positions of the objects 30 conveyed by means other than the forklift 10, based on the images captured by the fixed camera 60, and sends the information about the position of the objects to the on-premises server 50a through the transmitter 502. The information about the positions of the objects 30 that is obtained by the on-ceiling object position acquisition unit 501 based on the images captured by the fixed camera 60 is an example of the second position information.)

wherein the tracking includes tracking the object in a part of the captured moving image starting from a moving image time corresponding to the first time information; (OKAMOTO – [pdf pages 10-11, 0056-0058] the time acquisition unit 55 obtains the data indicating the time at which the receiver 51 received the spherical image and identification data, and outputs the data indicating the time to the object position acquisition unit 56; [pdf page 26, 0198] The on-ceiling object position acquisition unit 501 according to the present embodiment obtains the information about the positions of the objects 30 conveyed by means other than the forklift 10, based on the images captured by the fixed camera 60, and sends the information about the position of the objects to the on-premises server 50a through the transmitter 502. The information about the positions of the objects 30 that is obtained by the on-ceiling object position acquisition unit 501 based on the images captured by the fixed camera 60 is an example of the second position information.)

and acquiring current location information of the object indicating a location of the object at an end of the movement of the object based on a result of the tracking. (OKAMOTO – [pdf page 28] In FIG. 23, a fast-forward key 165 and a scroll bar 166 are illustrated. A function to keep track of the location may be provided. The object 30 of a particular ID number to be searched for was lastly placed at a particular location.)

OKAMOTO does not explicitly recite: a time before movement of the object starts. However, Cosby teaches: first time information indicating a time [before the movement] of the object starts, (Cosby – [0022] the first loading event may be identified as the point in time when the inventory item initially begins to move as a unit with the movement of the forklift.)

Accordingly, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have combined the teachings of OKAMOTO and Cosby, as each invention relates to using imaging devices for tracking objects. One of ordinary skill in the art would have been motivated to make the modification to reduce labor-intensive manual inventory processing and to track positions automatically.

Regarding dependent claim 3, which depends on claim 2, OKAMOTO teaches: further comprising: acquiring the second time information based on the location management information; tracking, by image processing, the object in a partial moving image based on the location information of the object, the partial moving image from the moving image time corresponding to the first time information to another moving image time corresponding to the second time information in the captured moving image; and acquiring current location information of the object indicating a location of the object at an end of the movement based on a result of the tracking. (OKAMOTO − [pdf page 10, 0052] The operation to start obtaining the information about the positions of the objects 30 is performed by a user such as an administrator, using, for example, the pointing device 512 as illustrated in FIG. 3. [pdf page 28, 0212] In FIG. 23, a fast-forward key 165 and a scroll bar 166 are illustrated. A function to keep track of the location may be provided.)

Regarding dependent claim 4, which depends on claim 2, OKAMOTO teaches: further comprising updating the location management information based on the current location information of the object. (OKAMOTO − [pdf page 10, 0052] The operation to start obtaining the information about the positions of the objects 30 is performed by a user such as an administrator, using, for example, the pointing device 512 as illustrated in FIG. 3. [pdf page 28, 0212] In FIG. 23, a fast-forward key 165 and a scroll bar 166 are illustrated. A function to keep track of the location may be provided.)

Regarding dependent claim 5, which depends on claim 1, OKAMOTO teaches: wherein the captured moving image includes a plurality of captured moving images captured by a plurality of image capturing devices whose relative positions with each other are determined in advance, and the information processing method further comprises, in a case where a first moving image of the plurality of captured moving images does not include the object after the object is moved, identifying a second moving image among the plurality of captured moving images to track the object, the second moving image including the object after the object is moved. (OKAMOTO – [pdf page 0031] information about the position of the object 30 can be obtained using the spherical images captured by the multiple spherical cameras 20. [pdf page 26, 0193] In the present embodiment, it is assumed that a plurality of fixed cameras 60 are arranged, and the multiple fixed cameras 60 may collectively be referred to as the fixed camera 60. [pdf page 27, 0210] the positions of the multiple fixed cameras 60 are indicated on the camera location map screen 163.)
Regarding dependent claim 6, which depends on claim 1, OKAMOTO teaches: further comprising calculating location information of the object before the movement of the object starts, based on a relative position between an image capturing device that generates the captured moving image of the surroundings of the object and a location of a marker on the object. (OKAMOTO – [0192, pdf page 25] In addition to the forklift 10, a pallet carrying machine called a pallet jack or pallet truck that can be easily handled by a person may be used in warehouses. In the present embodiment, even when the positions of the objects 30 are changed by a pallet jack with which the spherical camera 20 is not provided, the positions of the objects 30 can be recognized and tracked based on the images captured by the fixed camera 60. [pdf page 0155-0158] Fig. 13A and 13B, obtaining the information of the positions of the objects 30; [0156] The location map 61 as illustrated in Fig. 13A may be said to indicate the information about the positions of the object 30.) Examiner note: pre-movement is manual pallet jack before forklift movement.

Regarding independent claim 7: claim 7 is directed to an apparatus. Claim 7 has similar/same technical features/limitations as claim 2, and the claims are rejected under the same rationale.

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Nomasa (US 20200074676 A1) calculates a position of a mobile object using an imaging device.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to CARL E BARNES JR, whose telephone number is (571) 270-3395. The examiner can normally be reached Monday-Friday, 9am-6pm. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool.
To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Stephen Hong, can be reached at (571) 272-4124. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/CARL E BARNES JR/
Examiner, Art Unit 2178

/STEPHEN S HONG/
Supervisory Patent Examiner, Art Unit 2178

Prosecution Timeline

Mar 19, 2024
Application Filed
Jan 22, 2026
Non-Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12584932
SLIDE IMAGING APPARATUS AND A METHOD FOR IMAGING A SLIDE
2y 5m to grant • Granted Mar 24, 2026
Patent 12541640
COMPUTING DEVICE FOR MULTIPLE CELL LINKING
2y 5m to grant • Granted Feb 03, 2026
Patent 12536464
SYSTEM FOR CONSTRUCTING EFFECTIVE MACHINE-LEARNING PIPELINES WITH OPTIMIZED OUTCOMES
2y 5m to grant • Granted Jan 27, 2026
Patent 12530765
SYSTEMS AND METHODS FOR CALCIUM-FREE COMPUTED TOMOGRAPHY ANGIOGRAPHY
2y 5m to grant • Granted Jan 20, 2026
Patent 12530523
METHOD, APPARATUS, SYSTEM, AND COMPUTER PROGRAM FOR CORRECTING TABLE COORDINATE INFORMATION
2y 5m to grant • Granted Jan 20, 2026
Study what changed to get past this examiner, based on the 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 32%
With Interview: 57% (+25.2%)
Median Time to Grant: 4y 4m
PTA Risk: Low
Based on 202 resolved cases by this examiner. Grant probability derived from career allow rate.
