Prosecution Insights
Last updated: April 19, 2026
Application No. 18/414,318

Augmented Reality System for Network Management

Status: Final Rejection — §103
Filed: Jan 16, 2024
Examiner: GOOD JOHNSON, MOTILEWA
Art Unit: 2619
Tech Center: 2600 — Communications
Assignee: Cisco Technology Inc.
OA Round: 2 (Final)
Grant Probability: 73% (Favorable)
Expected OA Rounds: 3-4
Time to Grant: 3y 5m
Grant Probability with Interview: 87%

Examiner Intelligence

Career Allow Rate: 73% (608 granted / 831 resolved), +11.2% vs TC avg — above average
Interview Lift: +14.1% (moderate lift, measured on resolved cases with interview)
Typical Timeline: 3y 5m average prosecution; 35 applications currently pending
Career History: 866 total applications across all art units

Statute-Specific Performance

§101: 8.9% (-31.1% vs TC avg)
§103: 48.8% (+8.8% vs TC avg)
§102: 24.4% (-15.6% vs TC avg)
§112: 11.0% (-29.0% vs TC avg)

Baseline: Tech Center average estimate • Based on career data from 831 resolved cases

Office Action — §103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Rejections - 35 U.S.C. § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:

1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claim(s) 1-10 and 14-20 is/are rejected under 35 U.S.C. 103 as being unpatentable over Mick et al., U.S. Patent Publication Number 2013/0031202 A1, in view of Kin et al., U.S. Patent Number 10,739,861 B2.
Regarding claim 1, Mick discloses a device (210, portable device), comprising: a processor; a memory communicatively coupled to the processor; and an augmented reality logic, configured to: receive an image (paragraph 0013, receiving an image); determine device position data (paragraph 0004, location information regarding a location of the portable device in proximity to the equipment; paragraph 0013, smart phone or other computing device may include a GPS sensor that can provide accurate geographic location information); identify one or more objects visible in the image based on the device position data (paragraph 0005, identification information received from the mobile device; paragraph 0004, communicate identification information; see also figure 3); obtain control data corresponding to the one or more objects (paragraph 0004, maintains information regarding controllable systems; overlay information for the controllable system can be received); and generate an augmented image by superimposing the control data on the image (paragraph 0005, can access selected information of the controllable system; send at least some of the information to mobile device; this information may be presented on display of the mobile device as an overlay to an image of the controllable system; paragraph 0015, specific information regarding equipment present in the rack or other enclosure can be communicated and displayed directly on the smart phone or tablet, overlaid to an image of the equipment).

However, it is noted that Mick fails to disclose a position and orientation of the device; and using one or more computer vision techniques. Kin discloses a position and orientation of the device (col. 7, lines 22-29, generates IMU tracking data indicating an estimated position of the NED; position sensors include multiple accelerometers to measure translational motion; and multiple gyroscopes to measure rotational motion); and using one or more computer vision techniques (col. 8, line 7, use of computer vision algorithms).

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to include in the augmented reality interface as disclosed by Mick, indicating a position and orientation of the device, to determine position and orientation to display the overlaid controls in the image. It further would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to include the use of computer vision as disclosed by Kin, to capture more depth in the environment and identify various gestures of the user.

Regarding claim 2, Mick discloses wherein the augmented reality logic is further configured to: access an object identification database (paragraph 0005, manager can access selected information of the controllable system from the infrastructure system responsive to identification information); transmit an identification request indicative of the device position data to the object identification database (paragraph 0005, different servers or other systems that can store and access the information about the systems; paragraph 0014, collected information, e.g., image, geographic information and user information can then be communicated to the augmented reality manager as a query); receive identification data from the object identification database in response to the identification request (paragraph 0014, obtain information about the components in the visual proximity of the computing device); and identify the one or more objects visible in the image based on the identification data (paragraph 0005, responsive to identification information received from mobile device; paragraph 0014, populate information on the computing device’s display with data obtained).
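The claim 1 limitations mapped above describe a generic AR overlay pipeline: capture an image, determine device pose, identify visible objects, fetch control data for them, and superimpose that data. As a point of reference only, here is a minimal Python sketch of that flow; every name (`DevicePose`, `generate_augmented_image`, and the injected callables) is a hypothetical illustration and comes from neither Mick, Kin, nor the application itself:

```python
from dataclasses import dataclass

@dataclass
class DevicePose:
    """Hypothetical pose record: 3-D position plus orientation angles."""
    x: float = 0.0
    y: float = 0.0
    z: float = 0.0
    pitch: float = 0.0
    yaw: float = 0.0
    roll: float = 0.0

def generate_augmented_image(image, pose, identify_objects,
                             fetch_control_data, overlay):
    """Mirror the claim 1 steps: identify objects visible in the image
    using the device pose, obtain control data for each object, then
    superimpose that data on the image."""
    objects = identify_objects(image, pose)          # e.g., a computer-vision lookup
    controls = {obj: fetch_control_data(obj) for obj in objects}
    return overlay(image, controls)                  # the augmented image
```

In a real system, `identify_objects` would combine the pose with computer-vision matching (the gap the examiner fills with Kin), and `fetch_control_data` would query the equipment controllers Mick describes.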
Regarding claim 3, Mick discloses wherein the augmented reality logic is further configured to: receive biometric data (paragraph 0014, identity of the user may be communicated to the augmented reality manager, Examiner interprets identifying of the user as biometric data, in that it is specific to a user); authenticate a user of the device based on the biometric data (paragraph 0014, communication may also include, or be preceded by, a login/authentication protocol to ensure the user is authorized to access the augmented reality manager); and determine a user identifier corresponding to the user (paragraph 0024, enable an identification of individuals that have accessed a secure location; imaging the badge reader or other controlled access point; identify the individuals who have gained access to the secure location).

Regarding claim 4, Mick discloses wherein the augmented reality logic is further configured to: access an administrative database (paragraph 0014, image may be communicated to an augmented reality manager); transmit an access control request indicative of the user identifier to the administrative database (paragraph 0014, identity of the user may be communicated to the augmented reality manager); and receive access control data from the administrative database in response to the access control request (paragraph 0014, by providing this information, targeted information for use by the type of individual can be accessed and sent to the user; paragraph 0020, application can be downloaded, e.g., from a storage of the augmented reality system by an authorized user to enable the user to access information while present at a particular location).
Regarding claim 5, Mick discloses wherein the augmented reality logic is further configured to: identify one or more controllers associated with the one or more objects (paragraph 0021, remote control, diagnostics and/or system debugging tools can be accessed by a user via augmented reality manager); transmit one or more status requests to the one or more controllers (paragraph 0016, the information can be of many different types, including status of a given server); and receive the control data from the one or more controllers in response to the one or more status requests (paragraph 0015, specific information regarding equipment present in the rack or other enclosure can be communicated and displayed directly on the smart phone or tablet, overlaid to an image of the equipment itself).

Regarding claim 6, Mick discloses wherein the one or more status requests are indicative of: the identification data corresponding to the one or more objects (paragraph 0013, can have a machine-readable identifier); and the access control data corresponding to the user (paragraph 0012, augmented reality tool to provide relevant information to a user; a financial analyst or other financial personnel of the data center can use the same augmented reality system to obtain different information tailored to his or her needs; a maintenance worker of the data center facility may be able to obtain limited information).

Regarding claim 7, Mick discloses wherein the identification data is indicative of one or more of: three-dimensional positional coordinates corresponding to the one or more objects; object identifiers corresponding to the one or more objects; or controllers associated with the one or more objects (paragraph 0016, information, such as an identification of cabling or other connections between a given server and other equipment in the rack, can also be provided).
Regarding claim 8, Mick discloses wherein the augmented reality logic is further configured to generate one or more control signals corresponding to the one or more objects based on the control data (paragraph 0018, user may take remote control actions using the smart phone, tablet or other computing device; user can be provided with a menu of options such as an option to reboot the machine, or to perform a hard reset or another type of control operation).

Regarding claim 9, Mick discloses wherein the augmented reality logic is further configured to: receive an input from the user (paragraph 0018, user may take remote control actions); and generate the one or more control signals based on the control data and the input (paragraph 0018, perform a hard reset or another type of control operation).

Regarding claim 10, Mick discloses wherein the augmented reality logic is further configured to transmit the one or more control signals to the one or more objects (paragraph 0018, user may take remote control actions using the smart phone, tablet or other computing device; user can be provided with a menu of options such as an option to reboot the machine, or to perform a hard reset or another type of control operation).

Regarding claim 14, Mick discloses wherein the augmented reality logic is further configured to display the augmented image on a display (paragraph 0019, overlay information can be dynamically displayed to correspond to the view of the equipment present on the display of the computing device).

Regarding claim 15, Mick discloses wherein the one or more objects are one or more electronic devices (paragraph 0010, equipment present in a data center can include various server systems such as rack-based server systems; may include various storages such as storage attached networks, switching equipment, firewalls, load balancers and many other types of equipment).

Regarding claims 16-18, they are rejected based upon similar rationale as above claims 1, 10 and 1.
Mick further discloses device position data indicative of position and orientation of the device (paragraph 0014).

Regarding claims 19 and 20, they are rejected based upon similar rationale as above claims 1 and 5. Mick further discloses a method (paragraph 0003).

Claim(s) 11-13 is/are rejected under 35 U.S.C. 103 as being unpatentable over Mick et al., in view of Kin et al., as applied to claim 1 above, and further in view of Saurabh et al., U.S. Patent Publication Number 2019/0129607 A1.

Regarding claim 11, it is noted that Mick discloses device location and GPS, but fails to specifically disclose wherein the device position data includes: three-dimensional positional coordinates indicative of a position of the device; and three-dimensional angular coordinates indicative of an orientation of the device. Kin discloses (col. 7, lines 21-28) generating IMU tracking data indicating an estimated position of the NED 305 relative to an initial position of the NED 305; for example, the position sensors 335 include multiple accelerometers to measure translational motion (forward/back, up/down, left/right) and multiple gyroscopes to measure rotational motion (e.g., pitch, yaw, and roll). Saurabh discloses wherein the device position data includes: three-dimensional positional coordinates indicative of a position of the device; and three-dimensional angular coordinates indicative of an orientation of the device (paragraph 0231, IPS will find the exact spatial coordinates of the AR device; Inertial measurements, Angle of arrival). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to include in the device position as disclosed by Mick, the angular coordinates as disclosed by Saurabh, to provide accurate positioning of the overlays of the captured devices with the control information on the augmented reality device.
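Claim 11's split between positional and angular coordinates tracks the two sensor families quoted from Kin: accelerometers for translation and gyroscopes for rotation (pitch, yaw, roll). As an illustration only, here is a naive Python sketch of turning gyroscope angular rates into an orientation estimate; the function name and the Euler-integration approach are assumptions for this sketch, not anything disclosed in the cited references:

```python
def integrate_gyro(orientation, gyro_rates, dt):
    """Dead-reckon orientation (pitch, yaw, roll in radians) from
    angular-rate samples, the way a simple IMU tracker might.
    This is naive Euler integration: a real tracker would fuse
    accelerometer or vision data to bound the drift it accumulates."""
    return tuple(angle + rate * dt
                 for angle, rate in zip(orientation, gyro_rates))

# A 0.1 rad/s yaw rate sustained for 1 s advances yaw by 0.1 rad.
pose = (0.0, 0.0, 0.0)
pose = integrate_gyro(pose, (0.0, 0.1, 0.0), 1.0)
```

The accumulating drift of pure gyroscope integration is one reason an AR tracker also needs the positional fixes (GPS, RF triangulation, computer vision) discussed elsewhere in this action.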
Regarding claim 12, Saurabh discloses wherein the augmented reality logic is further configured to receive the three-dimensional angular coordinates from an Inertial Measurement Unit (IMU) (paragraph 0231, positioning unit uses Indoor Positioning System (IPS) to place multimedia content in the real scene being previewed on the AR enabled display; IPS will find the exact spatial coordinates of the AR device; IPS enables identifying exact spatial coordinates of an image being captured and processed by the imaging unit; if the spatial coordinates coincide with the 360 degree parameters stored in the placement database, then the corresponding multimedia content of the connected device is placed at that exact spatial position in the AR; IPS technologies may include, but are not limited to: Inertial measurements, Angle of arrival).

Regarding claim 13, Mick discloses wherein the augmented reality logic is further configured to determine the three-dimensional positional coordinates based on one or more Radio Frequency (RF) signals received by the device (paragraph 0013, the rack may have radio frequency; paragraph 0014, GPS sensor that can provide relatively accurate geographic location information; or a location can be determined by using triangulation information, e.g., accessed from multiple wireless access points in proximity to the smart phone).

Response to Arguments

Applicant's arguments, see pages 7-8, filed 10/24/2025, with respect to the rejection(s) of claim(s) 1-20 under 102 have been fully considered and are persuasive. Therefore, the rejection has been withdrawn. However, upon further consideration, a new ground(s) of rejection is made in view of 103, Mick in view of Kin. Applicant argues the prior art cited Mick fails to disclose position and orientation of the device and using one or more computer vision techniques. Examiner responds that Kin discloses (col. 7, lines 22-29) generating IMU tracking data indicating an estimated position of the NED, position sensors including multiple accelerometers to measure translational motion and multiple gyroscopes to measure rotational motion, and (col. 8, line 7) use of computer vision algorithms.

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure: 20150070347 A1, Hoffman et al. Hoffman discloses a computer-vision based augmented reality system, and further discloses a customizable graphical user interface, and the graphical user interface is displayable in perspective with objects in augmented reality through the use of computer vision techniques.

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to Motilewa Good-Johnson whose telephone number is (571)272-7658. The examiner can normally be reached Monday - Friday 6am-2:30pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Jason Chan, can be reached at 571-272-3022. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

MOTILEWA GOOD JOHNSON
Primary Examiner
Art Unit 2616

/MOTILEWA GOOD-JOHNSON/
Primary Examiner, Art Unit 2619

Prosecution Timeline

Jan 16, 2024 — Application Filed
Jul 23, 2025 — Non-Final Rejection (§103)
Aug 11, 2025 — Applicant Interview (Telephonic)
Aug 11, 2025 — Examiner Interview Summary
Oct 24, 2025 — Response Filed
Jan 09, 2026 — Final Rejection (§103)
Apr 13, 2026 — Request for Continued Examination
Apr 15, 2026 — Response after Non-Final Action

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602107 — SYSTEM AND METHOD FOR DETERMINING USER INTERACTIONS WITH VISUAL CONTENT PRESENTED IN A MIXED REALITY ENVIRONMENT — 2y 5m to grant; granted Apr 14, 2026
Patent 12602884 — DISPLAY SYSTEM AND DISPLAY METHOD FOR AUGMENTED REALITY — 2y 5m to grant; granted Apr 14, 2026
Patent 12597218 — EXTENDED REALITY (XR) MODELING OF NETWORK USER DEVICES VIA PEER DEVICES — 2y 5m to grant; granted Apr 07, 2026
Patent 12592047 — Method and Apparatus for Interaction in Three-Dimensional Space, Storage Medium, and Electronic Apparatus — 2y 5m to grant; granted Mar 31, 2026
Patent 12573100 — USER-DEFINED CONTEXTUAL SPACES — 2y 5m to grant; granted Mar 10, 2026

Study what changed to get past this examiner. Based on the 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 73% (87% with interview, a +14.1% lift)
Median Time to Grant: 3y 5m
PTA Risk: Moderate

Based on 831 resolved cases by this examiner. Grant probability derived from career allow rate.
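The headline projection figures are simple arithmetic on the examiner's career data; a quick consistency check, assuming (as the footnote implies) that the interview lift is an additive percentage-point adjustment to the base allow rate:

```python
granted, resolved = 608, 831
allow_rate = granted / resolved          # ~0.732, i.e. the 73% grant probability
with_interview = allow_rate + 0.141      # adding the +14.1% interview lift

assert round(allow_rate * 100) == 73     # matches the dashboard's 73%
assert round(with_interview * 100) == 87 # matches the 87% with-interview figure
```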
