Prosecution Insights
Last updated: April 19, 2026
Application No. 17/694,745

METHOD AND SYSTEM FOR DISPLAYING AND MANAGING A SITUATION IN THE ENVIRONMENT OF AN AIRCRAFT

Final Rejection §103

Filed: Mar 15, 2022
Examiner: HE, WEIMING
Art Unit: 2611
Tech Center: 2600 — Communications
Assignee: Airbus Helicopters
OA Round: 5 (Final)

Grant Probability: 46% (Moderate)
Expected OA Rounds: 6-7
Expected Time to Grant: 3y 4m
Grant Probability with Interview: 60%

Examiner Intelligence

Career Allow Rate: 46% of resolved cases (190 granted / 410 resolved; -15.7% vs TC avg)
Interview Lift: +13.8% (moderate, roughly +14%, for resolved cases with interview vs without)
Avg Prosecution: 3y 4m typical timeline (40 currently pending)
Career History: 450 total applications across all art units
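The headline figures above are simple ratios of the raw counts. As a minimal sketch (variable names are illustrative, and the assumption that the displayed deltas are plain percentage-point differences is mine, not the dashboard's documented convention), they can be reproduced as:

```python
# Recompute the dashboard figures from the raw career counts.
# The difference-based delta convention is an assumption for illustration;
# the dashboard's exact rounding rules are unknown.

granted = 190
resolved = 410

allow_rate = 100 * granted / resolved            # career allow rate, percent
print(f"Career allow rate: {allow_rate:.1f}%")   # 46.3%, displayed as 46%

delta_vs_tc = -15.7                              # displayed "-15.7% vs TC avg"
tc_avg = allow_rate - delta_vs_tc                # implied Tech Center average
print(f"Implied TC average: {tc_avg:.1f}%")      # 62.0%

with_interview = 60.0                            # displayed with-interview rate
lift = with_interview - allow_rate
print(f"Interview lift: {lift:+.1f} pts")        # +13.7 pts
```

The computed lift (+13.7 points) lands just under the displayed +13.8%, which suggests the 60% with-interview figure is itself rounded from a slightly higher underlying rate.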

Statute-Specific Performance

§101: 7.4% (-32.6% vs TC avg)
§103: 59.2% (+19.2% vs TC avg)
§102: 12.4% (-27.6% vs TC avg)
§112: 15.0% (-25.0% vs TC avg)

Deltas are measured against a Tech Center average estimate • Based on career data from 410 resolved cases
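The four deltas above are mutually consistent with a single Tech Center baseline: subtracting each delta from the examiner's per-statute rate yields 40.0% in every case. A minimal sketch (variable names are illustrative, and the assumption that delta = examiner rate minus TC average is mine) verifies this:

```python
# Examiner allow rate per statute and the displayed delta vs the TC average.
# Assumes delta = examiner_rate - tc_average, in percentage points.
rates = {
    "§101": (7.4, -32.6),
    "§103": (59.2, +19.2),
    "§102": (12.4, -27.6),
    "§112": (15.0, -25.0),
}

for statute, (rate, delta) in rates.items():
    implied_tc_avg = rate - delta
    print(f"{statute}: implied TC average = {implied_tc_avg:.1f}%")
# Every statute implies the same 40.0% baseline, consistent with a single
# Tech Center average estimate used across the chart.
```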

Office Action

§103
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Amendment

The amendment filed on 10/28/25 has been entered and made of record. Claim 3 is amended. Claims 2, 8, 11 and 18 are cancelled. Claims 1, 3-7, 9-10, 12-17 and 19-23 are pending.

Response to Arguments

Applicant's arguments with respect to claims 1, 17 and 21 have been considered but they are not persuasive. Applicant asserts that the cited references fail to disclose, teach, or suggest at least i) the gaze-centered surveillance zone (i.e., the monitoring zone being a surveillance zone of a region of the external landscape viewed by the occupant of the aircraft, the surveillance region centered on the region of the external landscape viewed by the occupant of the aircraft), ii) the second image derived from actual image capture devices (i.e., the second image constructed from the images of the environment of the aircraft captured by the image capture devices), or iii) the display behavior being independent of head orientation (i.e., during the display step, the second image is displayed irrespective of the position and orientation of the head of the occupant) (p. 11 of Remarks).

Examiner notes that the independent claims do not recite the limitation "gaze-centered surveillance zone." Here, the monitoring zone is merely one surveillance zone within a display screen, which is not related to gaze control. As to the second image, Lux discloses "a second display screen presenting symbols superimposed on said real world… in a second step, a symbol representative of said selected object appears on the second screen of visualization, said symbol being superimposed on the real object or in the vicinity of the real object if the latter is in the visual field of said second display screen" at p. 3. It is obvious that the image of the real world is captured by a camera. Since the camera can be fixed to the outside of the aircraft, the second image displayed on the second screen is not related to the position and orientation of the head of the pilot.

Applicant also alleges that the cited references fail to teach at least the limitation of claim 1 of "the monitoring zone being a surveillance zone of a region of the external landscape viewed by the occupant of the aircraft, the surveillance region centered on the region of the external landscape viewed by the occupant of the aircraft" (p. 11 of Remarks). Applicant argues that Lux does not disclose an image capture device or use of the line of sight to determine the monitoring zone. Examiner notes that claim 1 merely recites "determining a monitoring zone in the environment of the aircraft," and this determining step does not require using the line of sight to determine the monitoring zone.

Lux teaches both a head-down display and a head-mounted display in an aircraft application: "Indeed, the image displayed in this type of system is on the exterior landscape and the objects of interest seen are real" in [0001]. Here, it is inherent that there is an image capture device to capture the exterior landscape and real-world objects. Lux also discloses "The method of designating and displaying information according to the invention comprises two main stages. In a first step, we designate and select, on a first display screen, a particular object in the real world or in the synthetic image of this real world. In a second step, a symbol representative of this object appears on the second screen superimposed on the real object or on its synthetic representation depending on the type of display screen… the designation/selection condition can be a distance condition from the aircraft. In this case, several objects can be displayed at the same time because they correspond to the same selection condition. This selection condition can be determined by: - a predefined value corresponding, for example, to all traffic present within a determined distance from the first aircraft; - a value selected from a list of predefined values such as, for example, distance values to the first aircraft; - a value defined by the user such as, for example, a distance to the first aircraft" at p. 4.

Here, in response to an object selection on the first display device (HDD), a monitoring zone in the environment is determined; for example, a symbol representative of the selected object is displayed in Figs. 2-5, and one or more objects satisfying the selection condition can be displayed at the same time. Lux also discloses "a zoom on all the objects close to the designation zone… an 'exploded' and fixed representation of each of the nearby objects… an 'exploded' and animated representation of each of the objects, on a given surface… This information is, for example, the type of aircraft, its distance from the first aircraft, its speed, its direction, its altitude, etc." at p. 6; "the satellite positioning system 'GPS' and the inertial unit allowing to determine the position and attitude of the aircraft or the accuracy of the databases" at p. 7. Here, a zoom on all the objects close to the designation zone may provide the coordinates of the monitoring zone, while an altitude value also defines a coordinate of the monitoring zone. CAZAUX further discloses a set of windows in an aircraft cockpit from which the user may select one window (i.e., W4 is a 3D view of the terrain being flown over), as shown in Fig 1. Here, the combination of Lux and CAZAUX teaches selecting one or more objects of interest or a window for display. However, Lux and CAZAUX do not explicitly teach centering the selected object (i.e., a surveillance region) within the display.

Josephson further discloses "Once the selection object and a selectable display object touch or the selection object and a selectable display object active area touch or the selection object and a selectable display object is predicted with a threshold degree of certainty, a triggering threshold event (this may be the distance of proximity or probability without ever touching), the selectable object(s) is selected and non-selected display object are removed from the display or fade away or become less prominent or change in such a way that they are recognizable as the non-selected object(s) and the selected object is centered within the display or at a predetermined position, is adjusted to a desired amount if an adjustable attribute, or is executed if the selected object(s) is an attribute or selection command, or any combination of these" in [0017]. Therefore, the combination of Lux, CAZAUX and Josephson teaches every element of claim 1.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102 of this title, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1, 4-7, 14 and 19-23 are rejected under 35 U.S.C. 103 as being unpatentable over Lux et al. (FR3057685) in view of CAZAUX et al. (US 2017/0003838) and Josephson (US 2015/0135132 A1).
As to Claim 1, Lux teaches A method for displaying and managing a situation in an environment of an aircraft, the aircraft comprising image capture devices for capturing images of the environment of the aircraft, at least one calculator, at least one receiving device, at least one first display device arranged inside a cockpit of the aircraft, at least one second display device intended to be positioned at the head of an occupant of the aircraft, at least one sensor arranged inside the aircraft in order to determine a position and an orientation of the head of the occupant with respect to the aircraft, a selection device and a system for tracking the aircraft, the method comprising the following steps (Lux discloses a first display screen [HDD], a designation device [selection device] associated with the first display screen, a second display screen [HMD], and a processor [calculator] at p. 3, and GPS and an inertial unit [sensor for position/orientation of HMD] at p. 6; see also Fig 1. Here, Lux discloses a display system for an aircraft comprising a first display screen presenting an image of the real world, which refers to an image capture device for capturing images of the environment of the aircraft. For example, CAZAUX discloses on-board video cameras in [0008]):

determining a monitoring zone in the environment of the aircraft; displaying a first image representing the monitoring zone on the first display device (Lux discloses "The invention finally relates to a method for designating and displaying information in a display system for an aircraft comprising at least: - a first display screen presenting symbols superimposed on the real world…a designation device associated with said first display screen; characterized in that, when, in a first step of the method, the designation device selects a real object in the visual field of said first display screen" at p. 3);

selecting, on the first display device, a center of interest in the monitoring zone, by means of the selection device (Lux discloses "in a first step of the method, the designation device selects a real object in the visual field of said first display screen" at p. 3. Here, the selected real object refers to a center of interest in the monitoring zone.);

displaying a second image representing the center of interest on the second display device; and displaying a sighting marker indicating the center of interest on the second image (Lux discloses "a second display screen presenting symbols superimposed on said real world… in a second step, a symbol representative of said selected object appears on the second screen of visualization, said symbol being superimposed on the real object or in the vicinity of the real object if the latter is in the visual field of said second display screen" at p. 3; see also the symbol [sighting marker] in Fig 6.);

wherein the step of determining a monitoring zone in the environment of the aircraft is performed by a selection by an occupant of the aircraft by means of a selection device on the first display device, the first display device displaying an image representing the environment of the aircraft in the form of an aerial view, or by receiving the coordinates of the monitoring zone via a receiving means (Lux discloses "The method of designating and displaying information according to the invention comprises two main stages. In a first step, we designate and select, on a first display screen, a particular object in the real world or in the synthetic image of this real world. In a second step, a symbol representative of this object appears on the second screen superimposed on the real object or on its synthetic representation depending on the type of display screen… the designation/selection condition can be a distance condition from the aircraft. In this case, several objects can be displayed at the same time because they correspond to the same selection condition. This selection condition can be determined by: - a predefined value corresponding, for example, to all traffic present within a determined distance from the first aircraft; - a value selected from a list of predefined values such as, for example, distance values to the first aircraft; - a value defined by the user such as, for example, a distance to the first aircraft" at p. 4. Here, in response to an object selection on the first display device (HDD), a monitoring zone in the environment is determined; for example, a symbol representative of the selected object is displayed in Figs. 2-5, and one or more objects satisfying the selection condition can be displayed at the same time. Lux also discloses "a zoom on all the objects close to the designation zone… an 'exploded' and fixed representation of each of the nearby objects… an 'exploded' and animated representation of each of the objects, on a given surface… This information is, for example, the type of aircraft, its distance from the first aircraft, its speed, its direction, its altitude, etc." at p. 6; "the satellite positioning system 'GPS' and the inertial unit allowing to determine the position and attitude of the aircraft or the accuracy of the databases" at p. 7. Here, a zoom on all the objects close to the designation zone may provide the coordinates of the monitoring zone, while an altitude value also defines a coordinate of the monitoring zone. CAZAUX further discloses a set of windows in an aircraft cockpit from which the user may select one window (i.e., W4 is a 3D view of the terrain being flown over), as shown in Fig 1);

wherein, during the display step, the second image is constructed from the images of the environment of the aircraft captured by the image capturing devices (Lux discloses "More specifically, the subject of the invention is a method for designating and displaying information in a display system for an aircraft comprising at least: - a first display screen presenting a synthetic image of the real world; - a designation device associated with said first display screen; - a second display screen presenting symbols superimposed on said real world" at p. 2; see also Figs. 2-9);

wherein, during the display step, the second image is displayed irrespective of the position and orientation of the head of the occupant (Lux discloses "In a second mode, we can associate with the size of the symbol a notion of precision on the real position of the object" at p. 6; "The designated objects can be mobile. They correspond, for example, to air traffic. The symbology displayed to represent the information corresponding to this mobile object is then regularly updated and follows the movement of the object in real time" at p. 7. Here, the symbol is associated with a real object and updated by following the movement of the object, which is independent of the position and orientation of the head of the occupant.)

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the invention of Lux with the teaching of CAZAUX so that the interior or exterior of the aircraft can be captured by on-board video cameras to assist the pilot with flight control (CAZAUX, [0008]) and any window can be selected by the user for monitoring (CAZAUX, Fig 1). Lux and CAZAUX do not explicitly teach centering the surveillance region.
The combination with Josephson further teaches the following limitations: the monitoring zone being a surveillance zone of a region of the external landscape viewed by the occupant of the aircraft, the surveillance region centered on the region of the external landscape viewed by the occupant of the aircraft (Lux discloses "The invention finally relates to a method for designating and displaying information in a display system for an aircraft comprising at least: - a first display screen presenting symbols superimposed on the real world…a designation device associated with said first display screen; characterized in that, when, in a first step of the method, the designation device selects a real object in the visual field of said first display screen" at p. 3. Josephson further discloses "Once the selection object and a selectable display object touch or the selection object and a selectable display object active area touch or the selection object and a selectable display object is predicted with a threshold degree of certainty, a triggering threshold event (this may be the distance of proximity or probability without ever touching), the selectable object(s) is selected and non-selected display object are removed from the display or fade away or become less prominent or change in such a way that they are recognizable as the non-selected object(s) and the selected object is centered within the display or at a predetermined position, is adjusted to a desired amount if an adjustable attribute, or is executed if the selected object(s) is an attribute or selection command, or any combination of these" in [0017].)

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the invention of Lux and CAZAUX with the teaching of Josephson so that the display, movement, and positioning of sublist members may be simultaneous and synchronous or asynchronous with the movement and display of the selectable object(s) being influenced by the motion of the selection object(s) (Josephson, [0017]).

As to Claim 4, Lux in view of CAZAUX and Josephson teaches The method according to claim 1, wherein the monitoring zone is a circle centered on the zone of the external landscape viewed by the occupant of the aircraft and has predetermined dimensions having a radius equal to one meter to several hundred meters (Lux discloses "the designation/selection condition can be a distance condition from the aircraft. In this case, several objects can be displayed at the same time because they correspond to the same selection condition. This selection condition can be determined by: - a predefined value corresponding, for example, to all traffic present within a determined distance from the first aircraft; - a value selected from a list of predefined values such as, for example, distance values to the first aircraft; - a value defined by the user such as, for example, a distance to the first aircraft" at p. 4. Here, the monitoring range can be obtained from the distance, speed or attitude, etc., or by selection from the known values. See also Josephson, [0017] and Figs. 1F-1G.)

As to Claim 5, Lux in view of CAZAUX and Josephson teaches The method according to claim 1, wherein the center of interest is a single point and constitutes a point of interest, the sighting marker indicating the point of interest during the display step (Lux discloses "One of the functions of this type of system is to provide information on certain objects in the outside world, these objects being able to be fixed or mobile. For example, in the aeronautics field, fixed objects are airports or their landing strips or certain natural or artificial obstacles or waypoints called 'waypoints'. The moving objects are, for example, aircraft located in the environment close to the device. Generally speaking, in the aeronautical field, the invention relates to any object of interest stored in a database or whose information is known to the avionics system, either by air traffic control or by other means" at p. 1. Although an aircraft is used as the embodiment in Figs. 2-8, it is obvious that the aircraft can be represented as a single point on the aircraft monitor screen.)

As to Claim 6, Lux in view of CAZAUX and Josephson teaches The method according to claim 1, wherein the center of interest is a part of the monitoring zone, the sighting marker indicating a center of the part of the monitoring zone during the display step (Lux discloses "The symbology representing the object in the field of vision can take different forms. A first type of representation… A second type of representation can be a symbol S surrounding the object in a conformal manner, such as a square, a rectangle, a circle, an ellipse, or any other symbol whose center is 'transparent'.… A third type of representation can be a conforming symbol, transparent or not in its center, different from the first type…" at p. 6; see also Figs. 6-8. Here, it is obvious that the first or second type of representation indicates a center of the selected object in the monitoring zone.)

As to Claim 7, Lux in view of CAZAUX and Josephson teaches The method according to claim 6, wherein, the aircraft including an auxiliary selection device, the method includes an additional step of selecting a point of interest in the part of the monitoring zone by means of the second display device and the auxiliary selection device, the sighting marker indicating the point of interest during the display step (Lux discloses "In the case where the designation of the real object is carried out in the HMD, this designation can lead to the display of the information directly on the synthetic three-dimensional view of the HDD, but also on a two-dimensional view of the HDD for a better understanding of the situation by the pilot. For example, the designation of the object can be done by means of the line of sight and a control means associated with the HMD. This means of control can be a simple control button or even the scroll wheel of a mouse" at p. 7.)

As to Claim 14, Lux in view of CAZAUX and Josephson teaches The method according to claim 1, wherein the method includes additional steps of displaying information relating to the monitoring zone on the first image and to the center of interest and/or the environment of the center of interest on the second image (Lux discloses "When the object is outside the field of view of the HMD, a different symbology is displayed at the edge of the field to indicate in which direction to look to view the selected object. This last point is illustrated in Figure 9 where a symbology comprising three triangles T appears on the right edge of the field of the HMD indicating to the user in which direction he must turn his head to see the aircraft A. The designated objects can be mobile. They correspond, for example, to air traffic. The symbology displayed to represent the information corresponding to this mobile object is then regularly updated and follows the movement of the object in real time" at p. 6-7.)
Claim 19 recites similar limitations as claim 1 but in system form, except for the auxiliary selection device and a movable member (Josephson, Figs. 1 & 5). Therefore, the same rationale used for claim 1 is applied. Claim 20 recites similar limitations as claim 19 but in aircraft form. Therefore, the same rationale used for claim 19 is applied. Claim 21 is rejected based upon a similar rationale as Claim 1. Claim 22 is rejected based upon a similar rationale as Claims 3 & 4. Claim 23 is rejected based upon a similar rationale as Claim 5.

Claim 3 is rejected under 35 U.S.C. 103 as being unpatentable over Lux in view of CAZAUX and Josephson, further in view of Lafon et al. (US 2020/0183154 A1).

As to Claim 3, Lux in view of CAZAUX and Josephson teaches The method according to claim 1, wherein, during the display step, the monitoring zone is displayed on the first display device as an aerial view, wherein the monitoring zone is dynamically defined in real time as a function of the position and orientation of the head of the occupant, such that the monitoring zone remains centered on a region of the actual external landscape viewed by the occupant, independently of graphical elements or selectable objects displayed on the first display device (Lux teaches displaying an aerial view in Fig 2. Lafon further discloses "determining a line of sight of the pilot, receiving a command to enter the frozen display mode and to keep at least part of the display on the second man-machine interface in a frozen position, independently of the pilot's line of sight, displaying at least one complementary man-machine interface element, as a function of the pilot's line of sight, in at least one free area of the cockpit" in [0012-0014]; "This system further includes a device configured to determine the line of sight of the pilot as a function of a position of the head and/or eyes of the pilot, the second controller being configured in order, in a first standard display mode, to adapt at least one display on the second man-machine interface as a function of the pilot's line of sight. The system further includes a control member able to be actuated by the pilot in order to enter a frozen display mode, the second controller maintaining at least part of the display on the second man-machine interface in a frozen position independently of the pilot's line of sight, and a third controller suitable for displaying at least one complementary man-machine interface element, as a function of the pilot's line of sight, in at least one free area of the cockpit, the system further being configured to display a cursor representative of the pilot's line of sight" in [0024]; see also [0092].)

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the invention of Lux, CAZAUX and Josephson with the teaching of Lafon so as to offer the pilot an extended, effective and ergonomic man-machine interface, allowing them to access all information and to interact quickly with the control system (Lafon, [0096]).

Claims 9-10 are rejected under 35 U.S.C. 103 as being unpatentable over Lux in view of CAZAUX and Josephson, further in view of Ryota et al. (JP2017107535A).

As to Claim 9, Lux in view of CAZAUX and Josephson teaches The method according to claim 1, wherein the second image includes a non-distorted central view of a first part of the environment outside the aircraft and a distorted peripheral view of a second part of the environment outside the aircraft, the peripheral view being situated around the central part, the first part of the environment outside the aircraft comprising the center of interest, the second part of the environment outside the aircraft being situated around the first part (Lux discloses that the HMD displays an image of the environment outside the aircraft in Fig 2. Ryota further discloses "For the reception of the selected part of the display from the input, and processing the selected part as a first region, one or more based on the selected part, generated by the (continuous distortion function) Configured to determine a second region from the camera image of… The second area may be a panoramic area of the image to be processed" at p. 3; "The exemplary implementation shows that the central undistorted (perspectively accurate) region is surrounded by a distorted larger peripheral image and joined seamlessly (i.e. without discontinuities or unnatural corners)… The central undistorted area and the larger distorted peripheral area are combined to produce a continuous and seamless view image by using the continuous distortion function" at p. 6.)

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the invention of Lux, CAZAUX and Josephson with the teaching of Ryota so as to provide the user with a wide view or perimeter recognition by combining an undistorted central region and a distorted peripheral region to provide the user with substantially the same view as at a remote location (Ryota, p. 2).
As to Claim 10, Lux in view of CAZAUX, Josephson and Ryota teaches The method according to claim 9, wherein the first part and the second part of the environment together cover the whole of the environment around the aircraft, the image capture devices being arranged so as to capture images that together cover the entire external environment in which the aircraft is travelling (Ryota discloses “Aspects of the present disclosure may include a first device configured to steer a movable second device. The first device may include a display and a processor that processes each of the one or more camera images from the movable second device into a first region and a second region; The second region is a distorted image surrounding the first region, the first region is a central region of each of the one or more camera images, and the first region and the second region are continuous distortion functions. For the reception of the selected part of the display from the input, and processing the selected part as a first region, one or more based on the selected part, generated by the (continuous distortion function) Configured to determine a second region from the camera image of and to direct the movable second device to be directed forward based on the selected portion . The second area may be a panoramic area of the image to be processed” at p. 2-3.) Claims 12-13 are rejected under 35 U.S.C. 103 as being unpatentable over Lux in view of CAZAUX and Josephson, further in view of ANDRE et al. (US 2017/0053453). As to Claim 12, Lux in view of CAZAUX and Josephson teaches The method according to claim 1, wherein, during the display step, the second image is modified as a function of the movements of the head of the occupant (Lux discloses “When the object is outside the field of view of the HMD, a different symbology is displayed at the edge of the field to indicate in which direction to look to view the selected object. 
This last point is illustrated in Figure 9 where a symbology comprising three triangles T appears on the right edge of the field of the HMD indicating to the user in which direction he must turn his head to see the aircraft A” at p. 6-7. Here, the scrolling of the second image can be interpreted as a modification of the second image. ANDRE further discloses “When the marking symbol is in the visual field of the user, the latter can then overlay it on an area or a particular object of the terrain being flown over by a simple head movement” in [0030]. Here, ANDRE teaches the second image is modified by a head movement.) It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the invention of Lux, CAZAUX and Josephson with the teaching of ANDRE so as to overlay a marking symbol on an area or a particular object of the terrain by a head movement (ANDRE, [0030]). As to Claim 13, Lux in view of CAZAUX, Josephson and ANDRE teaches The method according to claim 12, wherein the display step includes the following sub-steps: determining the position and orientation of the head of the occupant; and calculating a new second image, following a movement of the head of the occupant, as a function of the position and the orientation of the head and based on the images of the environment of the aircraft captured by the image capture devices (Lux discloses “Indeed, the overall precision 5 is linked to the precision of different elements such as the position detection system, making it possible to determine the orientation of the pilot's head, the satellite positioning system "GPS" and the inertial unit allowing to determine the position and attitude of the aircraft or the accuracy of the databases. The graphic representation of the 10 symbol is therefore not completely consistent, that is to say not perfectly superimposed on the real position of the object. 
The avionics system is capable of determining the overall precision with which the three-dimensional position of the object to be represented is calculated. We can then vary the size of the symbol with the precision. In other words, the lower the precision, the larger the size of the symbol, in order to guarantee that the real object is indeed inside the symbol when the pilot seeks to acquire it visually” at p. 6; “The designated objects can be mobile. They correspond, for example, to air traffic. The symbology displayed to represent the information corresponding to this mobile object is then regularly updated and follows the movement of the object in real time. Of course, the method according to the invention can be applied to a plurality of selected objects. In the case where the designation of the real object is carried out in the 35 HMD, this designation can lead to the display of the information directly 3057685 12 on the synthetic three-dimensional view of the HDD…For example, the designation of the object can be done by means of the line of sight and a control means associated with the HMD” at p. 7.) Claims 15 and 17 are rejected under 35 U.S.C. 103 as being unpatentable over Lux in view of CAZAUX and Josephson, further in view of Lee (US 2018/0356884). As to Claim 15, Lux in view of CAZAUX and Josephson teaches The method according to claim 1. The combination of Lee further teaches wherein, after the step of displaying a sighting marker indicating the center of interest, the sighting marker is moved on the second image following a movement of the head of the occupant, according to the movement of the head (Lux teaches display a marker indicating the center of interest in Fig 6. Lee further discloses “Referring to FIG. 
1B, the user 50 who wears the HMD device 100 may move the head in up and down directions and, according to movement of the head, the user 50 may see upper and lower portions of the VR image through the view region 60” in [0058]; “according to movement of the head, the user 50 may see left and right portions of the VR image through the view region 60” in [0059].)

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the invention of Lux, CAZAUX and Josephson with the teaching of Lee so as to control the display to present a view region corresponding to the head movement (Lee, [0006]). Claim 17 is rejected based upon a similar rationale as Claims 1 and 15.

Claim 16 is rejected under 35 U.S.C. 103 as being unpatentable over Lux in view of CAZAUX and Josephson, further in view of Gold (US 2019/0301837).

As to Claim 16, Lux in view of CAZAUX and Josephson teaches The method according to claim 1. The combination of Gold further teaches wherein, the aircraft including a movable member, the method includes the following additional steps: slaving the movable member pointing to the sighting marker; locking the movable member on the sighting marker by means of a locking device; and activating the movable member towards the locked sighting marker (Lux teaches a method for designating and displaying information (i.e., a symbol) in both a head-mounted display and a head-down display for an aircraft in the Abstract. The cited limitation is well known to those skilled in the military field. For example, Gold discloses “a Mark Point or Lock On for a Sensor Point of Interest within the region of interest.
In some cases, a subsequent action comprises disengaging, attacking, or releasing a weapon” in [0007]; “The "LOCK ON" describes the moment when the snapshot is taken and refers to the requester (e.g., a pilot) invoking a Sensor Point of Interest (J12.6) indication at a specified geographical point (e.g., latitude, longitude)” in [0027] and [0029]; “when a pilot designates an SPI with "Lock On/Mark Point" that then correlates with a "Weapon Release"” in [0055].)

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the invention of Lux, CAZAUX and Josephson with the teaching of Gold so as to apply “lock on” for the designated point of interest (target) and release a weapon (i.e., a missile) to attack the point of interest in a military application.

Conclusion

THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any extension fee pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to WEIMING HE, whose telephone number is (571) 270-1221. The examiner can normally be reached Monday-Friday, 8:30 am-5:00 pm.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Tammy Goddard, can be reached at 571-272-7773. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/Weiming He/
Primary Examiner, Art Unit 2611
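The reply-period rules in the conclusion reduce to simple date arithmetic: a three-month shortened statutory period and an absolute six-month statutory cutoff from the mailing date. A minimal Python sketch of that calculation (helper names are ours; weekend and federal-holiday rollover under 37 CFR 1.7, and advisory-action adjustments, are deliberately omitted):

```python
from datetime import date

def add_months(d, months):
    # Roll the month forward, clamping the day to the new month's length.
    y, m = divmod(d.month - 1 + months, 12)
    y += d.year
    m += 1
    leap = y % 4 == 0 and (y % 100 != 0 or y % 400 == 0)
    last_day = [31, 29 if leap else 28, 31, 30, 31, 30,
                31, 31, 30, 31, 30, 31][m - 1]
    return date(y, m, min(d.day, last_day))

def final_action_deadlines(mailing_date):
    """Shortened statutory period (3 months) and absolute statutory
    cutoff (6 months) for reply to a final Office action, per the
    conclusion above (MPEP 706.07(a), 37 CFR 1.136(a))."""
    return {
        "ssp_expires": add_months(mailing_date, 3),
        "statutory_cutoff": add_months(mailing_date, 6),
    }

# The final rejection in this file was mailed Nov 13, 2025:
deadlines = final_action_deadlines(date(2025, 11, 13))
# deadlines["ssp_expires"]      -> 2026-02-13
# deadlines["statutory_cutoff"] -> 2026-05-13
```

Replies after the three-month date require extension fees under 37 CFR 1.136(a); nothing can extend the reply past the six-month cutoff.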

Prosecution Timeline

Mar 15, 2022
Application Filed
May 04, 2024
Non-Final Rejection — §103
Aug 09, 2024
Response Filed
Aug 23, 2024
Final Rejection — §103
Dec 30, 2024
Request for Continued Examination
Jan 08, 2025
Response after Non-Final Action
Mar 12, 2025
Non-Final Rejection — §103
Jul 18, 2025
Response Filed
Jul 24, 2025
Non-Final Rejection — §103
Oct 28, 2025
Response Filed
Nov 13, 2025
Final Rejection — §103
Apr 13, 2026
Examiner Interview Summary
Apr 13, 2026
Applicant Interview (Telephonic)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12567135
MULTIMEDIA PLAYBACK MONITORING SYSTEM AND METHOD, AND ELECTRONIC APPARATUS
2y 5m to grant Granted Mar 03, 2026
Patent 12561876
System and method for an audio-visual avatar creation
2y 5m to grant Granted Feb 24, 2026
Patent 12514672
System, Method And Software Program For Aiding In Positioning Of Objects In A Surgical Environment
2y 5m to grant Granted Jan 06, 2026
Patent 12494003
AUTOMATIC LAYER FLATTENING WITH REAL-TIME VISUAL DEPICTION
2y 5m to grant Granted Dec 09, 2025
Patent 12468949
SYSTEMS AND METHODS FOR FEW-SHOT TRANSFER LEARNING
2y 5m to grant Granted Nov 11, 2025
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

6-7
Expected OA Rounds
46%
Grant Probability
60%
With Interview (+13.8%)
3y 4m
Median Time to Grant
High
PTA Risk
Based on 410 resolved cases by this examiner. Grant probability derived from career allow rate.
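The headline figures above follow directly from the examiner's career data. A minimal sketch of the arithmetic (function name is ours; we assume the interview lift is applied as a simple additive bump in percentage points, which matches the displayed 46% and 60%):

```python
def projected_grant_stats(granted, resolved, interview_lift_pts):
    """Derive the dashboard's grant-probability figures from career data:
    190 granted / 410 resolved, +13.8 percentage-point interview lift.
    The additive-lift model is an assumption, not the tool's documented method."""
    base = granted / resolved * 100  # career allow rate, in percent
    return round(base), round(base + interview_lift_pts)

base_pct, with_interview_pct = projected_grant_stats(190, 410, 13.8)
# base_pct -> 46, with_interview_pct -> 60
```

Note that 190/410 is 46.3%, so the displayed 46% is a rounded career allow rate, and 46.3 + 13.8 rounds to the displayed 60% with-interview figure.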
