DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Double Patenting
Claims 3 and 20 of this application are patentably indistinct from claims 5 and 20 of Application No. 18/414,318. Pursuant to 37 CFR 1.78(f), when two or more applications filed by the same applicant or assignee contain patentably indistinct claims, elimination of such claims from all but one application may be required in the absence of good and sufficient reason for their retention during pendency in more than one application. Applicant is required to either cancel the patentably indistinct claims from all but one application or maintain a clear line of demarcation between the applications. See MPEP § 822.
It is noted that the currently filed application differs in that it comprises a network interface controller configured to provide access to a network, and obtains status data corresponding to the object and generates an overlay based on the status data, whereas Application No. 18/414,318 obtains control data corresponding to the object and generates an augmented reality image by superimposing the control data over the image. However, as noted in claim 3 of the currently filed application, the overlay data corresponds to an augmented overlay configured to be superimposed on the image, and claim 5 of Application No. 18/414,318 discloses receiving the control data from the one or more controllers in response to the one or more status requests; therefore, the copending applications are not patentably distinct.
Application No. 18/627,282:
1. A device, comprising: a processor; a network interface controller configured to provide access to a network; and a memory communicatively coupled to the processor, wherein the memory comprises an augmented reality management logic that is configured to: receive an image and position data; identify one or more objects visible in the image based on the position data; obtain status data corresponding to the one or more objects; and generate, based on the status data, overlay data associated with the image.
3. The device of claim 1, wherein the overlay data corresponds to an augmented overlay configured to be superimposed on the image.
20. A method comprising: receiving an image and position data; identifying one or more objects visible in the image based on the position data; obtaining status data corresponding to the one or more objects; and generating, based on the status data, overlay data associated with the image.
Application No. 18/414,318:
1. A device, comprising: a processor; a memory communicatively coupled to the processor; and an augmented reality logic, configured to: receive an image; determine device position data; identify one or more objects visible in the image based on the device position data; obtain control data corresponding to the one or more objects; and generate an augmented image by superimposing the control data on the image.
5. The device of claim 4, wherein the augmented reality logic is further configured to: identify one or more controllers associated with the one or more objects; transmit one or more status requests to the one or more controllers; and receive the control data from the one or more controllers in response to the one or more status requests.
19. A method comprising: receiving an image; determining device position data indicative of position and orientation of a device; identifying one or more objects visible in the image based on the device position data; obtaining control data corresponding to the one or more objects; generating an augmented image by superimposing the control data on the image; and displaying the augmented image.
20. The method of claim 19, further comprising: identifying one or more controllers associated with the one or more objects; transmitting one or more status requests to the one or more controllers; and receiving the control data from the one or more controllers in response to the one or more status requests.
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claims 1-20 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Townend et al., U.S. Patent No. RE50653 E.
Regarding claim 1, Townend discloses a device, comprising: a processor; a network interface controller configured to provide access to a network; and a memory communicatively coupled to the processor, wherein the memory comprises an augmented reality management logic (col. 12, lines 5-15) that is configured to: receive an image and position data (col. 2, lines 52-55, receiving image data including image data relating to a network asset, the image data also including information regarding a plurality of identifiers associated with the network asset and positioned at a predetermined location relative to the network asset); identify one or more objects visible in the image based on the position data (col. 2, lines 56-59, determining an identity of the network asset based at least in part on recognition of the plurality of identifiers and a relative position); obtain status data corresponding to the one or more objects (col. 11, lines 33-36, communicate the identification and location of the network asset to the technician, or to communicate various other information to the technician as required for interaction with the asset (e.g., warranty, error, status, or instruction information)); and generate, based on the status data, overlay data associated with the image (col. 2, lines 60-62, generating overlay information identifying a location and an identity of the network asset in the image data).
Regarding claim 2, Townend discloses wherein the augmented reality management logic is further configured to transmit the overlay data to an augmented reality display (col. 11, lines 30-32, overlay information is then generated and provided to the display of the mobile device).
Regarding claim 3, Townend discloses wherein the overlay data corresponds to an augmented overlay configured to be superimposed on the image (FIG. 7; which Examiner interprets as augmented overlay configured as superimposed on the image).
Regarding claim 4, Townend discloses wherein the overlay data is configured to facilitate a generation of an augmented overlay to be superimposed on the image (col. 9, lines 10-20, the display presents to the user an interface including overlay information that includes labels for the various type of identified equipment; an overlay label can be included, either upon selection of the device type button, or in response to tapping on the display).
Regarding claim 5, Townend discloses wherein identifying the one or more objects visible in the image comprises: accessing an object identification database (col. 5, lines 15-17, network asset tracking engine includes an asset identifier database); transmitting an identification request to indicate the device position data to the object identification database (col. 5, lines 23-25, records are illustrated in the asset identifier database; specifically, each asset can have associated with it a particular location and properties); and receiving identification data from the object identification database in response to the identification request, wherein the one or more objects are identified based on the identification data (col. 5, lines 42-45, identification of the asset can correspond to an image recognition process relating to both the asset and to a location identifier).
Regarding claim 6, Townend discloses wherein the augmented reality management logic is further configured to: receive periodic status data of a plurality of objects (col. 11, lines 50-52, provides for continual (or periodic) updating of overlay data as image data is received from the mobile device); and store the periodic status data of the plurality of objects (col. 5, lines 1-4, management tracking engine can be stored at one or both locations (i.e., either at a server or on a mobile device)).
Regarding claim 7, Townend discloses wherein obtaining the status data comprises retrieving, from the stored periodic status data, the status data corresponding to the one or more objects (col. 9, lines 29-31, various misconfigured or error-state systems can be highlighted by accessing error reports stored at the server when generating the overlay information).
Regarding claim 8, Townend discloses wherein obtaining the status data comprises: identifying at least one controller associated with the one or more objects (col. 9, lines 42-46, a user can tap on a particular asset that is a device to display full warranty information associated with the device, or to get further information about the panel or blade, or port of such a device); transmitting a status request to the at least one controller (col. 10, lines 40-42, captured image data is then transferred to a network asset tracking engine, either at the mobile device such as device 102, or a server at a remote location); and receiving the status data corresponding to the one or more objects from the at least one controller in response to the status request (col. 47-50, supplemental information can be presented to the user as well, such as instructional videos or diagrams that illustrate how to accomplish various routing and/or maintenance tasks).
Regarding claim 9, Townend discloses wherein the status data comprises power consumption data or version data of at least one of the one or more objects (col. 6, lines 26-30, additionally, the overlay information can include, for example, additional details regarding operation of the particular asset; such as a rack system or switch, information such as a traffic load, power consumption; col. 9, lines 17-18, show the identity of the device, including a device type, software version).
Regarding claim 10, Townend discloses wherein the status data comprises communication port statistics data or client statistics data of at least one of the one or more objects (col. 6, lines 31-35, a display of memory usage, available services at the device (or a particular port of the device); additionally a number of ports or jacks actively in use, or system utilization, could be used).
Regarding claim 11, Townend discloses wherein the status data comprises operational status data of at least one of the one or more objects (col. 6, lines 26-28, overlay information can include, for example, additional details regarding operation of the particular asset).
Regarding claim 12, Townend discloses wherein the operational status data comprises at least one of: a link status, a speed status, a fault status, or a data transfer status of at least one of the one or more objects (col. 9, lines 23-31, can display information about routing from a panel, such as an identifier of another panel to which the current panel is connected; additionally, various misconfigured or error-state systems).
Regarding claim 13, Townend discloses wherein the augmented reality management logic is further configured to: generate at least one control signal corresponding to the one or more objects based on the status data; and transmit the at least one control signal to the one or more objects (col. 9, lines 60-62, options can exist for a user to select and reset one or more settings of a particular device via remote control through server at the network management facility).
Regarding claim 14, Townend discloses a device, comprising: a processor; a network interface controller configured to provide access to a network; and a memory communicatively coupled to the processor, wherein the memory comprises an augmented reality management logic (col. 12, lines 5-15) that is configured to: receive an image and position data (col. 2, lines 52-55, receiving image data including image data relating to a network asset, the image data also including information regarding a plurality of identifiers associated with the network asset and positioned at a predetermined location relative to the network asset); identify one or more objects visible in the image based on the position data (col. 2, lines 56-59, determining an identity of the network asset based at least in part on recognition of the plurality of identifiers and a relative position); obtain status data corresponding to the one or more objects (col. 11, lines 34-36, communicate various other information to the technician as required for interaction with the asset (e.g., warranty, error, status, or instructional information)); translate the status data into one or more visual indicators (col. 10, lines 8-11, different images can be used on each panel (or different images can be used on opposite sides of the same panel)); and generate, based on the one or more visual indicators, overlay data associated with the image (col. 10, lines 10-15, overlay information to that illustrated could be generated associated with the arrangement, using the images within the arrangement).
Regarding claim 15, Townend discloses wherein the overlay data corresponds to an augmented overlay configured to be superimposed on the image (col. 5, lines 11-14, generate an overlay indicating such information to a user by displaying combined image data and overlay information on the display of the mobile device).
Regarding claim 16, Townend discloses wherein the augmented overlay comprises the one or more visual indicators (col. 9, lines 7-8, overlay information and image data can be still image data and static overlay information; Figures 4-8).
Regarding claim 17, Townend discloses wherein at least one visual indicator of the one or more visual indicators is a graphical element (col. 9, lines 32-33, additional information associated with the asset in the detailed information area can include a textual or graphical description of a full routing arrangement including the particular asset; Figure 8; col. 10, line 3, associated set of identifiers; lines 8-10, different images can be used on each panel).
Regarding claim 18, Townend discloses wherein the graphical element is configured to be at least partially superimposed on at least one of the one or more objects visible in the image (Figure 6 and 7).
Regarding claim 19, Townend discloses wherein the augmented reality management logic is further configured to: access a mapping database (col. 5, line 50, accesses information in the asset identifier database); and identify the one or more visual indicators corresponding to the status data in the mapping database, wherein the status data is translated in response to identification of the one or more visual indicators in the mapping database (col. 5, lines 57-67, a specific telecommunications jack on a particular panel may require service; this related asset to the asset directly identified by the asset identifiers can be detected by comparison to the asset identifiers, and based on a known relative location of the asset to the asset identifiers as included in the asset identifier database; for example, if three or more identifiers are viewable within the image, the image analysis module could apply one or more triangulation algorithms).
Regarding claim 20, Townend discloses a method comprising: receiving an image and position data (col. 2, lines 52-55, receiving image data including image data relating to a network asset, the image data also including information regarding a plurality of identifiers associated with the network asset and positioned at a predetermined location relative to the network asset); identifying one or more objects visible in the image based on the position data (col. 2, lines 56-59, determining an identity of the network asset based at least in part on recognition of the plurality of identifiers and a relative position); obtaining status data corresponding to the one or more objects (col. 11, lines 34-36, communicate various other information to the technician as required for interaction with the asset (e.g., warranty, error, status, or instructional information)); and generating, based on the status data, overlay data associated with the image (col. 2, lines 60-62, generating overlay information identifying a location and an identity of the network asset in the image data).
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
Lairsey et al., U.S. Patent Number 10,972,361 B2
Lairsey discloses 102, processor; 180, network interface; 120, memory; col. 9, lines 17-19, obtain still and motion-based images from the surroundings of the mobile service device; col. 9, lines 36-39, accelerometer module also includes an ability to locate the mobile device; accelerometer module may include a Global Positioning System (GPS) functionality to determine the location; col. 10, lines 66-67, identify the datacenter equipment within server rack; figure 3; col. 7, lines 7-10, may include information related to the health of the switch in terms of physical operational status and in terms of logical operation status such as error and alert status information; col. 11, lines 26-27, creating augmented reality overlays for display over image information.
Michalscheck et al., U.S. Patent Number 10,950,051 B2
Michalscheck discloses col. 15, line 26, receive captured image data and/or audio data; col. 15, lines 38-40, based on the image data and/or audio data, the cloud-based computing system may determine the identity of the industrial automation equipment; col. 14, lines 11-20, after determining the identity of the industrial automation equipment, the cloud-based computing system may determine relevant information; relevant information may include various types of information related to the industrial automation system, such as operational parameter information, status information, procedure instructions and/or assessments information; col. 20, lines 31-35, facilitate the determination of the relevant information by displaying visual representation (e.g., real objects) captured by the image data superimposed with virtual objects indicating other relevant information (e.g., operational parameters); col. 28, lines 25-28, may update procedure instructions periodically and/or in response to receiving a request to update procedure instructions; col. 28, line 34, storing procedure instructions.
Brun et al., U.S. Patent Number 11,295,135 B2
Brun discloses col. 21, line 15, receive an image; col. 21, lines 20-23, image may be captured by the camera of the machine vision system and may include data indicating a position and orientation of the machine vision system when the image was captured; col. 21, lines 54-56, in response to receiving the image, the process may proceed and identify any asset identifier and network assets that are recognized in the image; col. 30, lines 27-28, process may cause an indication of the verification status; col. 30, lines 28-30, highlighting the search matrix location in green if the verification is a match or red if the verification is not a match.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to Motilewa Good-Johnson whose telephone number is (571)272-7658. The examiner can normally be reached Monday - Friday 6am-2:30pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Jason Chan can be reached at 571-272-3022. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
MOTILEWA GOOD-JOHNSON
Primary Examiner
Art Unit 2616
/MOTILEWA GOOD-JOHNSON/Primary Examiner, Art Unit 2619