DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Double Patenting
The nonstatutory double patenting rejection is based on a judicially created doctrine grounded in public policy (a policy reflected in the statute) so as to prevent the unjustified or improper timewise extension of the “right to exclude” granted by a patent and to prevent possible harassment by multiple assignees. A nonstatutory double patenting rejection is appropriate where the conflicting claims are not identical, but at least one examined application claim is not patentably distinct from the reference claim(s) because the examined application claim is either anticipated by, or would have been obvious over, the reference claim(s). See, e.g., In re Berg, 140 F.3d 1428, 46 USPQ2d 1226 (Fed. Cir. 1998); In re Goodman, 11 F.3d 1046, 29 USPQ2d 2010 (Fed. Cir. 1993); In re Longi, 759 F.2d 887, 225 USPQ 645 (Fed. Cir. 1985); In re Van Ornum, 686 F.2d 937, 214 USPQ 761 (CCPA 1982); In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970); In re Thorington, 418 F.2d 528, 163 USPQ 644 (CCPA 1969).
A timely filed terminal disclaimer in compliance with 37 CFR 1.321(c) or 1.321(d) may be used to overcome an actual or provisional rejection based on nonstatutory double patenting provided the reference application or patent either is shown to be commonly owned with the examined application, or claims an invention made as a result of activities undertaken within the scope of a joint research agreement. See MPEP § 717.02 for applications subject to examination under the first inventor to file provisions of the AIA as explained in MPEP § 2159. See MPEP § 2146 et seq. for applications not subject to examination under the first inventor to file provisions of the AIA. A terminal disclaimer must be signed in compliance with 37 CFR 1.321(b).
The filing of a terminal disclaimer by itself is not a complete reply to a nonstatutory double patenting (NSDP) rejection. A complete reply requires that the terminal disclaimer be accompanied by a reply requesting reconsideration of the prior Office action. Even where the NSDP rejection is provisional, the reply must be complete. See MPEP § 804, subsection I.B.1. For a reply to a non-final Office action, see 37 CFR 1.111(a). For a reply to a final Office action, see 37 CFR 1.113(c). A request for reconsideration, while not provided for in 37 CFR 1.113(c), may be filed after final for consideration. See MPEP §§ 706.07(e) and 714.13.
The USPTO Internet website contains terminal disclaimer forms which may be used. Please visit www.uspto.gov/patent/patents-forms. The actual filing date of the application in which the form is filed determines what form (e.g., PTO/SB/25, PTO/SB/26, PTO/AIA/25, or PTO/AIA/26) should be used. A web-based eTerminal Disclaimer may be filled out completely online using web-screens. An eTerminal Disclaimer that meets all requirements is auto-processed and approved immediately upon submission. For more information about eTerminal Disclaimers, refer to www.uspto.gov/patents/apply/applying-online/eterminal-disclaimer.
Claims 1-13 are rejected on the ground of nonstatutory double patenting as being unpatentable over claims 1-17 of U.S. Patent No. 12,260,586. Although the claims at issue are not identical, they are not patentably distinct from each other because claims 1-17 of Patent No. 12,260,586 contain every element of claims 1-13 of the instant application and thus anticipate the claims of the instant application. Claims 1-13 of the instant application are therefore not patentably distinct from the earlier patent claims and, as such, are unpatentable over those claims under nonstatutory double patenting. A later application claim is not patentably distinct from an earlier claim if the later claim is anticipated by the earlier claim.
Regarding claim 1, the limitations of the instant application (left) are met by claim 1 of U.S. Patent No. 12,260,586 (right):

Instant Application | U.S. Patent No. 12,260,586, Claim 1
An information processing method, comprising: | An information processing method comprising:
setting, by a data processing unit of an information processing device, a real object in an image as a marker, | setting, by a data processing unit of an information processing device, a real object in an image as a marker,
wherein the image is captured by a camera of the information processing device; | wherein the image is captured by a camera of the information processing device;
generating, by the data processing unit, marker data associated with the marker; and | generating, by the data processing unit, marker reference coordinates in a marker reference coordinate system in which a configuration point of the marker is an origin;
storing, by the data processing unit, the marker data in a storage unit of the information processing device, | storing, by the data processing unit, marker data associated with the marker in a storage unit of the information processing device,
wherein the marker data associates with a position and a posture of the camera at a marker registration time point of the marker, and | wherein the marker data includes a position and posture of the camera at a marker registration time point of the marker, and
the position and the posture of the camera are in a coordinate system. | the position and the posture of the camera are in the marker reference coordinate system; and
(no corresponding limitation) | transmitting, by the data processing unit, based on the stored marker data, first position data in the marker reference coordinate system to a mobile device.
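For illustration only, the claimed marker-registration flow can be summarized in a short sketch. All class and function names below are hypothetical and are not drawn from the application or the patent; the sketch merely restates the claimed steps (register a real object as a marker, store the camera pose at registration time in a coordinate system whose origin is a configuration point of the marker, and provide position data in that system to another device).

```python
from dataclasses import dataclass

@dataclass
class Pose:
    position: tuple  # (x, y, z) in the marker reference coordinate system
    posture: tuple   # orientation, e.g., (roll, pitch, yaw)

@dataclass
class MarkerData:
    marker_id: str
    camera_pose_at_registration: Pose  # pose at the marker registration time point

class InformationProcessingDevice:
    """Hypothetical sketch of the claimed data processing unit and storage unit."""

    def __init__(self):
        self.storage = {}  # stands in for the claimed "storage unit"

    def register_marker(self, marker_id, camera_pose):
        # The configuration point of the marker is taken as the origin
        # (0, 0, 0) of the marker reference coordinate system; the camera
        # pose at registration time is stored in that system.
        self.storage[marker_id] = MarkerData(marker_id, camera_pose)

    def first_position_data(self, marker_id):
        # Position data in the marker reference coordinate system, which
        # the claim recites transmitting to a mobile device.
        return self.storage[marker_id].camera_pose_at_registration.position

device = InformationProcessingDevice()
device.register_marker("marker-1", Pose((0.5, 0.0, 1.2), (0.0, 0.0, 0.0)))
print(device.first_position_data("marker-1"))  # (0.5, 0.0, 1.2)
```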
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
High et al. (U.S. Patent Application Publication 2017/0132566) discloses an information processing method executed in an information processing device, the information processing method comprising: setting, by a data processing unit, a marker included in a camera-captured image of the information processing device, and generating marker reference coordinates with a configuration point of the set marker as an origin (Fig. 6; paragraph [0054] – in step 604, global location information is received, from a portable user interface unit 110 associated with the customer, of a current location of the user interface unit and that designates a delivery location where the customer would like the product delivered - typically, the delivery control circuit receives this information and confirms the viability of the requested delivery location - further, the delivery control circuit can determine a flight path from a selected launch location - the launch location may be from a distribution center, a shopping facility (e.g., retail sales facility, or any other type of facility in which products are displayed and/or sold, etc.), a selected location based on one or more deliveries within a given geographic region or area, and the like; paragraph [0055] – in step 606, a delivery is initiated by an unmanned delivery aircraft of the product to the delivery location defined by the global location information received from the user interface unit - the delivery control system may evaluate the delivery location based on information provided by the delivery aircraft, and can validate the delivery location and communicate an authorization to deliver the product once validated; paragraph [0056] - in some embodiments, one or more images and/or video are received at the control circuit 202 of the delivery control system 102 that were captured by a camera of the user interface unit 110 – further, in some instances, the one or more images and/or video is captured at approximately the time the global location information is identified by the user interface unit – the one or more images and/or video are evaluated, and a viability of the delivery location is confirmed for a delivery by the delivery aircraft – the delivery can be initiated and/or authorization to complete the delivery can be communicated when the delivery location is confirmed as viable - typically, the evaluation of the one or more images further includes confirming there is a vertical clearance of at least a threshold diameter extending above the delivery location - some embodiments evaluate more than one image - as such, two or more images may be received that are directed away from each other such that a first image is captured of a ground at the delivery location and a second image is captured in a direction that is substantially 180 degrees from the direction captured by the first image and generally directed toward the sky - the confirmation of the viability of the delivery location can include confirming the ground has a delivery area that has at least a delivery area threshold upon which the product can be deposited – the global location information and the one or more images may be received from the user interface unit in response to a selection by the customer of a single option presented to the user through a software application implemented on the user interface unit that causes the user interface unit to identify the global location information and activate the camera to capture the image; the delivery location would be the origin); and transmitting, by the data processing unit, position data on the marker reference coordinates to another device by using the marker reference coordinates as coordinates shared with the another device (Fig. 6; paragraphs [0054]-[0056], as quoted above; the delivery location would be the origin).
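The viability evaluation described in High (paragraphs [0054]-[0056]) amounts to a two-part threshold test: the ground image must show a delivery area of at least a delivery-area threshold, and the sky image must confirm vertical clearance of at least a threshold diameter. The sketch below is illustrative only; the threshold values and function names are hypothetical and do not appear in the reference.

```python
# Hypothetical sketch of the delivery-location viability test described in
# High: both the ground-area threshold and the vertical-clearance threshold
# must be satisfied before delivery is authorized. Values are illustrative.

DELIVERY_AREA_THRESHOLD_M2 = 1.0      # minimum ground area for depositing the product
CLEARANCE_DIAMETER_THRESHOLD_M = 2.0  # minimum clear diameter above the delivery site

def location_is_viable(ground_area_m2, clearance_diameter_m):
    """Return True only when both image-derived measurements meet their thresholds."""
    return (ground_area_m2 >= DELIVERY_AREA_THRESHOLD_M2
            and clearance_diameter_m >= CLEARANCE_DIAMETER_THRESHOLD_M)

print(location_is_viable(1.5, 3.0))  # True: both thresholds met
print(location_is_viable(1.5, 1.0))  # False: insufficient vertical clearance
```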
Jones et al. (U.S. Patent Application Publication 2019/0213438) discloses an information processing method comprising: setting, by a data processing unit, a real object included in a camera-captured image of the information processing device as a marker (Fig. 12; paragraph [0172] – in some implementations, a marker can be made small and placed at an inconspicuous location – for example, a QR code 1202 can be placed on a fire detector 1204 that is mounted on the ceiling – the robot 102 is provided with a high resolution camera or a zoom lens that enables the robot 102 to detect the marker on or near the ceiling – as the robot 102 moves in the home 300, the simultaneous localization and mapping (SLAM) sensors will track the locations of the objects on or near the ceiling, including the markers (e.g., the QR code 1202); paragraph [0173] – for example, when the augmented reality module 140 is used to determine coordinates of the robot 102 and the objects, the robot management program 142 prompts the user 10 to scan the markers, such as the QR code 1202 on the ceiling – the augmented reality module 140 determines the coordinates of the markers on the ceiling and uses that information to assist in sharing the virtual space coordinate system with the robot 102 – this way, when the user 10 identifies an object in the virtual space, and the augmented reality module 140 determines the coordinates of the object in the virtual space, the robot 102 can determine which object is being identified by the user 10; paragraph [0174] - for example, using the augmented reality module 140, the user 10 can walk around the home 300, point the camera 132 of the mobile computing device 104 at various objects, and the images of the objects appear on the touch screen display of the mobile computing device 104 - the user 10 taps on an object in the image, such as a chair, and provides the label “Chair” through the user interface 136 – the augmented reality module 140 determines the
coordinates of the chair in the virtual space - the mobile computing device 104 sends the virtual space coordinates of the chair and the label “Chair” to the robot 102 - using coordinate transformation or triangulation, the robot 102 determines the robot space coordinates of the object being labeled as “chair” - the next time the robot 102 navigates near the chair, the robot 102 knows that the object is associated with the label “chair” provided by the user 10).
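The coordinate sharing described in Jones (paragraphs [0172]-[0174]) relies on a coordinate transformation: once both the augmented reality module and the robot have located the shared markers, a rigid transform estimated from those shared points maps virtual-space coordinates of a labeled object into the robot's own coordinate system. The sketch below illustrates such a rigid 2D transform; the function name, the example transform parameters, and the chair coordinates are hypothetical and do not come from the reference.

```python
import math

# Hypothetical sketch of mapping a virtual-space point (e.g., a labeled
# "chair") into a robot's coordinate frame via a rigid 2D transform
# (rotation by theta, then translation), as in the coordinate
# transformation described in Jones.

def to_robot_frame(point, theta, translation):
    """Rotate `point` by `theta` radians about the origin, then translate."""
    x, y = point
    tx, ty = translation
    xr = x * math.cos(theta) - y * math.sin(theta) + tx
    yr = x * math.sin(theta) + y * math.cos(theta) + ty
    return (xr, yr)

# Example: the two frames differ by a 90-degree rotation and a (1, 1) offset.
chair_virtual = (2.0, 0.0)                                     # coordinates in the AR virtual space
chair_robot = to_robot_frame(chair_virtual, math.pi / 2, (1.0, 1.0))
print(chair_robot)  # approximately (1.0, 3.0)
```

In practice the transform parameters would be estimated from the marker locations observed in both frames (e.g., by point-set registration), rather than known in advance.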
Any inquiry concerning this communication or earlier communications from the examiner should be directed to HEATHER R JONES whose telephone number is (571)272-7368. The examiner can normally be reached Mon. - Fri.: 9:00am - 5:00pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, William Vaughn, can be reached at (571)272-3922. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/HEATHER R JONES/Primary Examiner, Art Unit 2481
March 3, 2026