Prosecution Insights
Last updated: April 19, 2026
Application No. 18/720,977

METHOD FOR DISPLAYING A GRAPHICAL REPRESENTATION

Non-Final OA (§103)
Filed: Jun 17, 2024
Examiner: CRAWFORD, JACINTA M
Art Unit: 2617
Tech Center: 2600 (Communications)
Assignee: Renault S.A.S.
OA Round: 1 (Non-Final)
Grant Probability: 88% (Favorable)
Expected OA Rounds: 1-2
Time to Grant: 2y 7m
With Interview: 97%

Examiner Intelligence

Career Allow Rate: 88% (above average; 709 granted / 805 resolved; +26.1% vs TC avg)
Interview Lift: +9.2% (moderate lift, measured across resolved cases with an interview)
Avg Prosecution: 2y 7m (typical timeline); 29 applications currently pending
Total Applications: 834 across all art units (career history)
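The headline figures above are simple ratios over the examiner's career counts. A minimal sketch of how they relate (the helper names are illustrative, and the 62% TC 2600 baseline is an assumption inferred from the displayed +26.1% delta, not the vendor's actual formula):

```python
def allow_rate(granted: int, resolved: int) -> float:
    """Career allowance rate as a percentage of resolved cases."""
    return 100.0 * granted / resolved

def lift(with_pct: float, without_pct: float) -> float:
    """Percentage-point difference, e.g. a delta vs a baseline."""
    return with_pct - without_pct

rate = allow_rate(709, 805)   # ~88.1, displayed as "88%"
delta = lift(rate, 62.0)      # assumed TC 2600 baseline of ~62% -> ~+26.1 points
```

Note that 88.1% minus the displayed +26.1% delta implies a Tech Center baseline of roughly 62%, which is why that figure is used here.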

Statute-Specific Performance

§101: 7.7% (-32.3% vs TC avg)
§103: 55.1% (+15.1% vs TC avg)
§102: 5.2% (-34.8% vs TC avg)
§112: 16.8% (-23.2% vs TC avg)
Tech Center averages are estimates • Based on career data from 805 resolved cases
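Each "vs TC avg" delta implies a Tech Center baseline by simple subtraction. A quick illustrative check against the figures in the table suggests every delta was computed against the same ~40% reference:

```python
# Share of this examiner's rejections citing each statute (from the table above)
statute_rate = {"101": 7.7, "103": 55.1, "102": 5.2, "112": 16.8}

# Displayed deltas vs the Tech Center average, in percentage points
delta_vs_tc = {"101": -32.3, "103": 15.1, "102": -34.8, "112": -23.2}

# Implied TC baseline per statute: rate minus delta
implied_tc_avg = {s: round(statute_rate[s] - delta_vs_tc[s], 1)
                  for s in statute_rate}
# Every statute resolves to the same 40.0% baseline
```

That every implied baseline comes out to exactly 40.0% suggests a single flat reference rate rather than per-statute TC averages; treat the deltas accordingly.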

Office Action

§103
DETAILED ACTION

This action is in response to the following communication: Preliminary Amendment filed June 17, 2024. Claims 10-24 are pending in this case. Claims 1-10 have been newly cancelled. Claims 10-24 have been newly added. This action is made Non-Final.

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Priority

Receipt is acknowledged of certified copies of papers required by 37 CFR 1.55.

Information Disclosure Statement

The information disclosure statement (IDS) submitted on June 17, 2024 was filed on the filing date of the application, June 17, 2024. The submission is in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement is being considered by the examiner.

Claim Objections

Claim 10 is objected to because of the following informalities: Claim 10 recites, "…after the brining, selecting a graphical representation…" but should recite, "…after the bringing, selecting a graphical representation…" Appropriate correction is required.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 10-24 are rejected under 35 U.S.C. 103 as being unpatentable over LEE et al. (US 2020/0156536) in view of FERRI (US 2019/0310486) and MOREL et al. (US 2018/0090011).

As to claim 10, LEE et al. disclose a method for displaying a graphical representation for an arrangement (Figure 7) comprising a vehicle body (e.g. Figures 2-6 illustrate the vehicle body of the ego vehicle), a device for displaying the graphical representation (e.g. Figure 1, apparatus for transmitting driving information of the vehicle, further comprising information output unit 140, where [0051] and [0052] note the information output unit 140 may include display lamps and speakers which are installed on the vehicle body; the display lamp may perform an information output function as a display panel while performing a lighting lamp function using an LED device, and the information may include a text, image, pictogram, or figure)…at least one display means integrated in a front portion of a vehicle (e.g. Figure 2A, further illustrated in Figures 3A, 3B, and 4, where at least [0055] and [0057] note the information output unit 140 may be installed at the front of the vehicle body) and/or a rear portion of the vehicle (e.g. Figure 2C, further illustrated in Figure 6, where at least [0055] and [0057] note the information output unit 140 may be installed at the rear of the vehicle), and a control unit for controlling the display means (e.g. control unit 130, where [0049] and [0050] note the control unit 130 outputs determined information through information output unit 140), the display means comprising a face configured to play the graphical representation and oriented toward outside of the vehicle (e.g. as noted above, and further noted at [0053], the information output unit 140 may be installed or formed in a different shape depending on the installation position on the vehicle body, e.g. front, side, or rear, where the shape corresponds to the exterior design of the vehicle; [0050] further notes the information output unit 140 may output information for helping an obstacle, e.g. a pedestrian or another vehicle, in a direction that the obstacle can easily see the information, i.e. a direction facing the obstacle, e.g. at the front of the ego vehicle body when the obstacle is in front, at the side of the ego vehicle body when the obstacle is on the side, or at the rear of the ego vehicle body when the obstacle is in the back; and [0056] additionally notes the information output unit 140 installed on the vehicle body may simultaneously perform the function of an existing vehicle lamp, e.g. a headlamp, DRL, position lamp, turn signal light, brake light, or emergency light, while displaying information, e.g. a text, image, pictogram, or figure), the method (Figure 7) comprising: selecting, via the control unit, an image and an animation script (e.g. step S104, set contents and display style of information to be displayed in response to the determined situation; step S105, set display direction (position) of information to be displayed in response to the determined situation)…bringing the vehicle into a wake-up state (e.g. step S102, [0069] notes control unit 130 may sense vehicle information of the ego vehicle, where the status of the ego vehicle may include a start-up of the vehicle, power-on, positions of passengers, turn signal light operation, the number and positions of passengers, and a steering wheel operation of a driver, where "start-up" or "power-on" may be considered "bringing the vehicle into a wake-up state"; and step S103, [0070] notes control unit 130 may further determine the status of the ego vehicle and the surrounding situations, e.g. stop, emergency, direction indication, a status to inform the driver of the expected time of departure, a status to display appreciation, a status to display deceleration, and a status to display another vehicle recognition information, based on the sensed information); after the brin[g]ing (e.g. after steps S102 and S103), selecting a graphical representation according to a first image and a first animation script by the control unit (e.g. step S104, [0071] notes control unit 130 may set contents and display style, e.g. text, image, pictogram, or figure, of information to be displayed in response to the determined situation; and step S105, [0073] notes control unit 130 may set the display direction (position) of information to be displayed in response to the determined status of the ego vehicle and the determined surrounding situations, where [0074] notes the display direction (position) may be set to one or more of the front, side, rear, top and bottom of the vehicle body, and include the direction (position) of the obstacle, e.g. pedestrian or another vehicle, as noted above); and after the selecting the graphical representation (e.g. after steps S104 and S105), sending to the display means a signal associated with the selected graphical representation (e.g. step S106, [0075] notes when the contents, display style, and direction (position) of the information to be displayed are set, the control unit 130 may output the information in the designated display direction (position) for a designated time).

As noted above, LEE et al. disclose a device for displaying the graphical representation (e.g. Figure 1, apparatus for transmitting driving information of the vehicle, further comprising information output unit 140), but do not disclose the device "…comprising a digital data storage means…" LEE et al. also disclose a method comprising selecting, via the control unit, an image and an animation script (e.g. step S104, set contents and display style of information to be displayed in response to the determined situation; step S105, set display direction (position) of information to be displayed in response to the determined situation), but do not disclose the image and animation script "…previously recorded in said storage means…" Lastly, LEE et al. disclose bringing the vehicle into a wake-up state (e.g. step S102, [0069] notes control unit 130 may sense vehicle information of the ego vehicle, where the status of the ego vehicle may include a start-up of the vehicle or power-on…), but do not disclose "…the selecting including receiving a predetermined encrypted signal; after the receiving, bringing the vehicle into a wake-up state…"

FERRI further discloses a method for displaying a graphical representation for an arrangement (e.g. Figure 12) comprising a vehicle body (Figure 1, vehicle 10), a device for displaying the graphical representation (e.g. Figures, holographic display device 20)…at least one display means integrated in a front portion of a vehicle and/or a rear portion of the vehicle ([0037] notes a holographic display device 20 within an external part 22 for a vehicle 10; an external part 22 is a component that is exposed to the outside space around the vehicle 10, e.g. external part 22 may be a headlight assembly, a fog lamp assembly, a rear taillight assembly, a door with a transparent strip, a front grille, a side rear view mirror, or may be provided as a standalone unit, and the holographic display device 20 may further be provided in an inside brake light), and a control unit for controlling the display means (e.g. Figure 7C, controller 100, further described below), the display means comprising a face configured to play the graphical representation and oriented toward outside of the vehicle (Figure 7B, where [0037] and [0047] note the holographic display device 20 is preferably configured for view by a person outside of the vehicle 10)…the method comprising: receiving a predetermined encrypted signal (e.g. step 202, [0055] notes detecting the approach of a user 49, e.g. detecting a FOB 110 using the Body Control Module (BCM) 108 and controller 100, where [0046] notes controller 100 may be in communication with a vehicle system, such as BCM 108; the controller 100 may receive commands from the BCM 108 instructing controller 100 to operate image source 34 and/or the illumination source 44; BCM 108 may implement a wireless or remote access and/or authentication system including, for example, a wireless key FOB 110 carried by user 49, where BCM 108 may detect wirelessly transmitted signal 111 from FOB 110, and where it is considered the transmitted signal 111 is "encrypted" as it needs to be authenticated as noted above); after the receiving (e.g. after step 202, detecting the approach of user 49, e.g. transmitted signal 111 from key FOB 110), bringing the vehicle into a wake-up state (e.g. step 204, [0055] notes activating the holographic display device 20 to display an image 32 in response to detecting the approach of user 49 at step 202, where [0046] further notes that in response to detecting key FOB 110 approaching the vehicle 10, e.g. via wirelessly transmitted signal 111, BCM 108 may be considered to control or instruct controller 100 to control image source 34); after the brin[g]ing (e.g. after activating the holographic display device 20)…sending to the display means a signal associated with the selected graphical representation ([0055] notes presenting a welcome image such as a driver's name or a logo, or other information, images, or text, e.g. via holographic display device 20).

It would have been obvious to one of ordinary skill in the art at the time of the invention to modify LEE et al.'s device for displaying a graphical representation and method of selecting an image and animation script with the features of FERRI's holographic display device and method of receiving a predetermined encrypted signal and, after the receiving, bringing the vehicle into a wake-up state as an additional state or status for further controlling the system, to allow the image selection and display to begin as soon as the user approaches the vehicle, thus enhancing the functionality of the system.

LEE et al. modified with FERRI differ from the invention defined in that LEE et al. modified with FERRI still do not disclose a device "…comprising a digital data storage means…" and the image and animation script "…previously recorded in said storage means…"

MOREL et al. disclose a method for displaying a graphical representation for an arrangement (e.g. Figures 2 and 3) comprising a vehicle body (Figure 4, vehicle), a device for displaying the graphical representation (e.g. Figure 1, [0044], projection system 2) comprising a digital data storage means (e.g. storage unit 6, [0044])…the method (Figures 2 and 3) comprising: selecting, via the control unit, an image and an animation script previously recorded in said storage means ([0050] through [0052] note storage unit 6 is capable of storing images to be projected, e.g. pictograms, and also storing coordinates of the position of the driver in a predefined reference frame referred to as the projection reference frame Rp, where step 34, [0064] notes processing unit 10 selects, from the storage unit 6, at least one image showing a specific pictogram characteristic of the detected event from all of the images stored in the storage unit and depending on the event detected by the detection device 4; step 36, [0065] further notes processing unit 10 establishes a sequence of images representing an animation of said pictogram, said sequence of images being paced depending on the time estimated in step 32; and [0299] further notes images are pre-loaded).

It would have been obvious to one of ordinary skill in the art at the time of the invention to further modify LEE et al. modified with FERRI's device for displaying the graphical representation to further comprise a digital data storage means as described in MOREL et al., such that the information, e.g. text, image, pictogram, or figure, as described in LEE et al. (and/or images as described in FERRI) may be stored for easy retrieval and use for selecting, e.g. setting, as the information to be displayed as described by the system, thus yielding predictable results.

As to claim 11, LEE et al. modified with FERRI and MOREL et al. disclose the vehicle body is a body of a motor vehicle (e.g. as noted for claim 10, LEE, Figures 2-6 illustrate the vehicle body of the ego vehicle; modified with FERRI, Figure 1 illustrates the vehicle body of vehicle 10; further modified with MOREL, Figure 4 illustrates a vehicle).

As to claim 12, LEE et al. modified with FERRI and MOREL et al. disclose the display means is integrated into the front portion and the front portion connects front lights of the vehicle (e.g. as noted in claim 10, LEE, Figures 2A, 3, and 4, [0053], [0056] note information output unit 140 may be installed on the front of the vehicle body, and may simultaneously perform the function of an existing vehicle lamp, e.g. a head lamp, and display information, e.g. a text, image, pictogram, or figure, thus integrated into the front portion and connected to the front lights of the vehicle; modified with FERRI, Figures 1 and 2, [0037] notes holographic display device 20 within an external part 22 for a vehicle 10, where an external part 22 is a component that is exposed to the outside space around the vehicle 10, e.g. external part 22 may be a headlight assembly, fog lamp assembly, or a front grille, thus integrated into the front portion and connected to the front lights of the vehicle).

As to claim 13, LEE et al. modified with FERRI and MOREL et al. disclose the display means is integrated into the rear portion inside a rear light of the vehicle (e.g. as noted in claim 10, LEE, Figures 2C and 6, [0053], [0056] note information output unit 140 may be installed on the rear of the vehicle body, and may simultaneously perform the function of an existing vehicle lamp, e.g. a brake light, and display information, e.g. a text, image, pictogram, or figure; modified with FERRI, [0037] notes holographic display device 20 within an external part 22 for a vehicle 10, where an external part 22 is a component that is exposed to the outside space around the vehicle 10, e.g. external part 22 may be a rear taillight assembly, and may also be provided in an inside brake light, thus integrated into the rear portion inside the rear light of the vehicle).

As to claim 14, LEE et al. modified with FERRI and MOREL et al. disclose the image is a three-dimensional representation of an object (modified with FERRI, Figure 7B, [0047] notes the three-dimensional appearance of the holographic image 32 as viewed from the perspective of an external viewer; further modified with MOREL, [0076] notes processing unit 10 adds areas of shadow to the images of the sequence of images to give the driver the visual impression that the pictogram shown in the transformed image is displayed in 3D).

As to claim 15, LEE et al. modified with FERRI and MOREL et al. disclose the image is a logo (modified with FERRI, [0040] notes the holographic image 32 may include a logo, symbol, or other graphic).

As to claim 16, LEE et al. modified with FERRI and MOREL et al. disclose the control unit modifies at least one animation parameter according to at least one parameter internal and/or external to the vehicle (e.g. as noted in claim 10, LEE, step S102, [0069] notes control unit 130 may sense the status of the ego vehicle, e.g. a start-up of the vehicle, power-on, brake operation, turn signal light operation, the number and positions of passengers, and a steering wheel operation of a driver; step S103, [0070] notes control unit 130 may further determine the status of the ego vehicle and the surrounding situations, e.g. stop, emergency, direction indication, a status to inform the driver of the expected time of departure, a status to display appreciation, a status to display deceleration, and a status to display another vehicle recognition information, based on the sensed information; then step S104, [0071] notes control unit 130 may set contents and display style, e.g. text, image, pictogram, or figure, of the information to be displayed in response to the determined status of the ego vehicle and the determined surrounding situations; and step S105, [0073] notes control unit 130 may further set the display direction (position) of the information to be displayed in response to the determined status of the ego vehicle and the determined surrounding situations, where each of the determined status and/or determined surrounding situations may be considered "at least one parameter internal and/or external to the vehicle," on which the control unit 130 bases which contents and display style, as well as display direction (position), to set).

As to claim 17, LEE et al. modified with FERRI and MOREL et al. disclose the at least one parameter internal to the vehicle is a wake-up state of the vehicle, and/or an operating state of the vehicle, and/or a state of movement of the vehicle, and/or a state of braking of the vehicle (e.g. as noted in claim 16, LEE describes that the status of the ego vehicle may include a start-up of the vehicle, power-on, brake operation, turn signal light operation, the number and positions of passengers, and a steering wheel operation of a driver, which may be considered "internal parameters to the vehicle").

As to claim 18, LEE et al. modified with FERRI and MOREL et al. disclose identifying an operating state of the vehicle via identification means (e.g. as noted in claim 10, LEE, step S102, [0069] notes control unit 130 may sense the status of the ego vehicle, e.g. a start-up of the vehicle, power-on, brake operation, turn signal light operation, the number and positions of passengers, and a steering wheel operation of a driver; step S103, [0070] notes control unit 130 may further determine the status of the ego vehicle and the surrounding situations, e.g. stop, emergency, direction indication, a status to inform the driver of the expected time of departure, a status to display appreciation, a status to display deceleration, and a status to display another vehicle recognition information, based on the sensed information), followed by selecting a second graphical representation according to a second image and a second animation script by the control unit (e.g. as noted in claim 10, LEE, step S104, [0071] notes control unit 130 may set contents and display style, e.g. text, image, pictogram, or figure, of the information to be displayed in response to the determined status of the ego vehicle and the determined surrounding situations, and step S105, [0073] notes control unit 130 may further set the display direction (position) of the information to be displayed in response to the determined status of the ego vehicle and the determined surrounding situations), followed by sending to the display means a signal associated with the second selected graphical representation (e.g. as noted in claim 10, LEE, step S106, [0075] notes when the contents, display style, and direction (position) of the information to be displayed are set, the control unit 130 may output the information in the designated display direction (position) for a designated time). (NOTE: it may be considered that a "second image and second animation script" may be selected upon sensing and determining a second status of the ego vehicle and the surrounding situations different from a first status of the ego vehicle and the surrounding situations, e.g. as sensed and determined in claim 10, e.g. the process of Figure 7 repeated.)

As to claim 19, LEE et al. modified with FERRI and MOREL et al. disclose identifying a start-up state of the powertrain of the vehicle (e.g. as noted in claims 10 and 18, LEE, steps S102 and S103, the status of the ego vehicle may include a start-up of the vehicle), followed by selecting a second graphical representation according to a second image and a second animation script by the control unit (e.g. as noted in claims 10 and 18, LEE, followed by step S104, setting contents and display style, and step S105, setting display direction (position)), followed by sending to the display means a signal associated with the second selected graphical representation (e.g. as noted in claims 10 and 18, LEE, followed by step S106, outputting the information in the designated display direction (position)). (NOTE: it may be considered that a "second image and second animation script" may be selected upon sensing and determining a second status of the ego vehicle and the surrounding situations different from a first status of the ego vehicle and the surrounding situations, e.g. as sensed and determined in claim 10, e.g. the process of Figure 7 repeated.)

As to claim 20, LEE et al. modified with FERRI and MOREL et al. disclose receiving a signal representative of a braking intensity (e.g. as noted in claims 10 and 18, LEE, steps S102 and S103, the status of the ego vehicle may include a brake operation, and may further include the vehicle stopped and/or decelerating, which may correspond to a braking intensity), followed by selecting a third graphical representation according to a third image and a third animation script by the control unit (e.g. as noted in claims 10 and 18, LEE, followed by step S104, setting contents and display style, and step S105, setting display direction (position)), followed by sending to the display means a signal associated with the third selected graphical representation (e.g. as noted in claims 10 and 18, LEE, followed by step S106, outputting the information in the designated display direction (position)). (NOTE: it may be considered that a "third image and third animation script" may be selected upon sensing and determining a third status of the ego vehicle and the surrounding situations different from the first and second status of the ego vehicle and the surrounding situations, e.g. as sensed and determined in claims 10 and 18, e.g. the process of Figure 7 repeated.)

As to claim 21, LEE et al. modified with FERRI and MOREL et al. disclose at least one of the first and third animation scripts defines a speed of rotation of the image about an axis (further modified with MOREL, Figure 4, [0066] notes the animation comprises the pivoting of the pictogram shown in the selected image relative to a horizontal axis A-A and perpendicular to the direction of movement of the motor vehicle, where Figure 3, [0067] through [0076] further note the processing of the selected image in order to produce the visual impression that the image is being pivoted relative to the horizontal axis A-A).

As to claim 22, LEE et al. modified with FERRI and MOREL et al. disclose the axis is an axis vertical to the vehicle (modified with FERRI, [0038] and [0039] note the image projected onto or reflected from presentation surface 30 would seem to be suspended relative to other components, e.g. in front of other components in the external part 22 assembly, which may be considered vertical to the vehicle (e.g. this differs from MOREL, which displays flat on the ground surface, e.g. about a horizontal axis)).

As to claim 23, LEE et al. modified with FERRI and MOREL et al. disclose transferring operating parameters of the at least one display means according to a light intensity signal transmitted from brightness measuring means to the control unit (modified with FERRI, [0045] notes image source 34 in electrical communication with controller 100, including processor unit 102 and memory 103 for storing instructions and image information or data for execution and/or processing by the processor unit 102 for controlling the light or image output of the image source 34, where Figure 10, [0053] notes the image source 34 having a light source 50 generating a first beam 52 of unpolarized light, a first circular polarizer 54 including a linear polarizer 56 and a quarter-wave plate 58 together creating circularly polarized light having a circular polarization in the first direction from the first beam 52 of unpolarized light, and a second circular polarizer 55 of a similar construction to the first circular polarizer 54 but operating in reverse to pass through only light 77 having a circular polarization in a given direction; further modified with MOREL, [0171] notes 3) projecting the image onto the projection surface, where [0173] through [0217] note 3a) calculating a luminance map, including a goniophotometer, e.g. the rotational movement about the horizontal axis supports the rotational movement about the vertical axis adjusted by the rotation about the horizontal axis; [0218] through [0235] note 3b) calculating the positions of the luminance points in the image reference frame; [0236] through [0268] note 3c) defining the coordinates of the projection of a luminance point; [0269] through [0278] note 3d) defining the coordinates of the corresponding pixel; and [0279] through [0315] note 3e) correcting the intensity value of the corresponding intensity indicator).

As to claim 24, LEE et al. modified with FERRI and MOREL et al. disclose a vehicle (e.g. LEE, Figures 2-6 illustrate the vehicle body of the ego vehicle; modified with FERRI, Figure 1 illustrates the vehicle body of vehicle 10; further modified with MOREL, Figure 4 illustrates a vehicle), comprising: hardware and/or software means configured to implement the method as claimed in claim 10 (LEE, Figure 1, apparatus; modified with FERRI, Figure 7C, holographic display device 20; further modified with MOREL, Figure 1, projection system 2). Please see the rejection and rationale of claim 10.

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Sweeney et al. (US 2017/0240098) disclose a control system of a self-driving vehicle (SDV) that can process sensor data from a sensor system of the SDV to autonomously control acceleration, steering, and braking systems of the SDV along a current route; based on the current route, the control system can dynamically determine a set of immediate actions to be performed by the SDV, and based on the set of immediate actions, the control system can generate a set of intention outputs on a lighting strip of the SDV, the set of intention outputs indicating the set of immediate actions prior to the SDV executing the set of immediate actions.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to JACINTA M CRAWFORD, whose telephone number is (571) 270-1539. The examiner can normally be reached 8:30 a.m. to 4:30 p.m. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, King Y. Poon, can be reached at (571) 272-7440. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/JACINTA M CRAWFORD/
Primary Examiner, Art Unit 2617

Prosecution Timeline

Jun 17, 2024: Application Filed
Dec 03, 2025: Non-Final Rejection, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602734: GRAPHICS PROCESSORS (2y 5m to grant; granted Apr 14, 2026)
Patent 12602735: GRAPH DATA CALCULATION METHOD AND APPARATUS (2y 5m to grant; granted Apr 14, 2026)
Patent 12602841: HIGH DYNAMIC RANGE VISUALIZATIONS INDICATING RANGES, POINT CURVES, AND PREVIEWS (2y 5m to grant; granted Apr 14, 2026)
Patent 12597180: ARTIFICIAL INTELLIGENCE AUGMENTATION OF GEOGRAPHIC DATA LAYERS (2y 5m to grant; granted Apr 07, 2026)
Patent 12591946: DETECTING ERROR IN SAFETY-CRITICAL GPU BY MONITORING FOR RESPONSE TO AN INSTRUCTION (2y 5m to grant; granted Mar 31, 2026)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 88%
With Interview: 97% (+9.2%)
Median Time to Grant: 2y 7m
PTA Risk: Low
Based on 805 resolved cases by this examiner. Grant probability derived from career allow rate.
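The "With Interview" figure appears to follow the base grant probability plus the interview lift. A hedged sketch of that relationship (the additive model and the cap at 100% are assumptions about how the dashboard combines the two numbers, not a documented rule):

```python
def grant_probability_with_interview(base_pct: float, lift_pct: float) -> float:
    """Base grant probability plus the interview lift, capped at 100%."""
    return min(base_pct + lift_pct, 100.0)

p = grant_probability_with_interview(88.0, 9.2)  # ~97.2, displayed as "97%"
```

Under this model, 88% + 9.2 points gives roughly 97.2%, matching the displayed "97% With Interview (+9.2%)".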

Free tier: 3 strategy analyses per month