Prosecution Insights
Last updated: April 19, 2026
Application No. 18/587,143

METHODS AND SYSTEMS FOR THREE-DIMENSIONAL VISUALIZATION

Final Rejection §103
Filed: Feb 26, 2024
Examiner: PROVIDENCE, VINCENT ALEXANDER
Art Unit: 2617
Tech Center: 2600 — Communications
Assignee: Epicor Software Corporation
OA Round: 2 (Final)
Grant Probability: 83% (Favorable)
Predicted OA Rounds: 3-4
Predicted Time to Grant: 2y 5m
Grant Probability With Interview: 99%

Examiner Intelligence

Career Allow Rate: 83% (above average; 15 granted / 18 resolved; +21.3% vs TC avg)
Interview Lift: +25.0% (strong; based on resolved cases with interview)
Typical Timeline: 2y 5m avg prosecution; 38 currently pending
Career History: 56 total applications across all art units

Statute-Specific Performance

§101: 0.9% (-39.1% vs TC avg)
§103: 82.4% (+42.4% vs TC avg)
§102: 14.8% (-25.2% vs TC avg)
§112: 0.9% (-39.1% vs TC avg)
Tech Center averages are estimates. Based on career data from 18 resolved cases.
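The headline examiner figures above reduce to simple ratios. A minimal sketch in Python (only the counts and deltas come from the report; the variable names are mine):

```python
# Career allowance rate: 15 granted out of 18 resolved cases.
granted, resolved = 15, 18
allow_rate = granted / resolved
print(f"{allow_rate:.0%}")  # 83%

# The reported delta of +21.3 points vs. the Tech Center average
# implies a TC average of roughly 62%.
tc_delta = 21.3
tc_avg = round(allow_rate * 100 - tc_delta, 1)
print(tc_avg)  # 62.0
```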

Office Action

§103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Amendment

The Amendment filed January 14, 2026 is insufficient to overcome the rejection of Claims 1-20 based upon Santos in view of Yasuo as set forth in the last Office action because the limitations amended in Claims 1-20 are taught by the prior art “Santos” cited in the previous Office action. See Response to Arguments below.

Response to Arguments

Applicant's arguments filed January 14, 2026 have been fully considered but they are not persuasive. The applicant states: “Santos does not teach or suggest receiving, by the processor of the three-dimensional visualization system, two-dimensional sketch data through the two-dimensional sketch user interface, the two-dimensional sketch data defining a plurality of nodes, each node of the plurality of nodes comprising one of a plurality of node types, and the plurality of nodes defining a two-dimensional sketch of at least a portion of the object as recited in the pending claims. Rather, Santos seems to be silent regarding anything that can reasonably be considered to suggest such recitations.”

The Examiner respectfully disagrees with the argument. In the previous action, the Examiner alleged that Santos teaches: “receiving, by the processor of the three-dimensional visualization system, two-dimensional sketch data (Santos: constrained sketches 12 [0027]), the two-dimensional sketch data defining a plurality of nodes, (Santos: Instructions are collected as a series of nodes in a graph as represented in FIG.
4, [0041]) and the plurality of nodes defining a two-dimensional sketch of at least a portion of the object (Santos: each node representing a type of instruction of a set of instructions; […] wherein the starting sketch is an initial input of the set of instructions [0011]);”

The claimed limitations may be split into four key parts:

a) receiving, by the processor of the three-dimensional visualization system, two-dimensional sketch data through the two-dimensional sketch user interface,
b) the two-dimensional sketch data defining a plurality of nodes,
c) each node of the plurality of nodes comprising one of a plurality of node types, and
d) the plurality of nodes defining a two-dimensional sketch of at least a portion of the object.

Santos teaches a processor: “a computing device processor to fulfill inputs, execute instructions and propagate the outputs of said instructions;” [0027]. Santos also teaches that the first step of the method is “defining a starting sketch” [0011], see also Fig. 1. One of ordinary skill in the art would understand that Santos teaches receiving two-dimensional sketch data by using the processor. In the previous action, the Examiner stated that Santos does not explicitly teach receiving the data from a “two-dimensional sketch user interface” and therefore cited to Yasuo.

As to the Applicant’s argument regarding Yasuo, in the previous action the Examiner alleged that Yasuo teaches “receiving, by the processor of the three-dimensional visualization system, two-dimensional sketch data through the two-dimensional sketch user interface”, because Santos was not explicitly clear that their user interface was two-dimensional. Specifically, the Examiner cited to [0053] of Yasuo, which teaches: “When the XZ plane is designated as the sketch plane, the screen changes to the sketch mode shown in FIG. 5 and the GUI screen changes to a view perpendicular to the sketch plane.
The designer sketches a two-dimensional profile using the input device 1 such as a mouse.” Because the sketch may be represented by a plurality of nodes (as explained below), receiving a sketch from the two-dimensional user interface in Yasuo is analogous to “receiving, by the processor of the three-dimensional visualization system, two-dimensional sketch data through the two-dimensional sketch user interface”. Therefore, Yasuo in combination with Santos would teach part (a) of the claimed limitations above. For this reason, the Examiner disagrees that Yasuo is “silent regarding anything that can reasonably be considered to suggest such recitations”.

Figure 4 showcases two-dimensional sketch data (note that this is not the same as the claimed “two-dimensional sketch”) including a “Sketch” node which represents the initial two-dimensional sketch: “we represent our instruction set as a directed asynchronous graph starting on the initial sketch” [0030]. Therefore, one of ordinary skill in the art would understand that the two-dimensional sketch data may define a plurality of nodes, which corresponds to part (b) of the claimed limitations. Furthermore, Figure 4 showcases a plurality of nodes within a user interface, with varying types. Therefore, Santos teaches part (c) of the claimed limitations.

In Figures 2 and 3, Santos showcases two-dimensional “constrained sketches 12”. In Figure 4, Santos showcases a graph of nodes or “instructions”. Santos teaches that the constrained sketches “may include a geometric shape, such as a polygon, wherein one or more of the geometric primitives of the geometric shape is associated with constraints (12a, 12b and 16);” [0027] and that “a set of instructions [is] capable of converting a set of inputs into outputs (for example, a cut line 16 constraining the sketch 12)” [0027]. In other words, the instructions in the graph may modify or impose constraints on the sketch.
Santos teaches that “Instructions are collected as a series of nodes in a graph as represented in FIG. 4.” Therefore, the constraints for the sketch 12 are represented as nodes. Notably, the node graph in Figure 4 contains a Cut node (between the Inset and Extrusion nodes) that could correspond to a cut line constraint, such as the cut line 16 in the two-dimensional sketch depicted in Figure 3. Santos further teaches that: “The output of that instruction will be redirected into the input of a new instruction of type “Cut” where the floorplan of the house will be created as successive cuts. The resulting segments will be assigned profiles (2D paths showing how the walls will go up and form the roof).” In other words, the cut instruction produces a collection of 2D portions of an object. Santos further teaches that: “The output of the cut instruction will be fed into a new extrude instruction that will create the house's volume based on the profiles assigned to each segment.” [0032]. That is, prior to an Extrude instruction, the sketch remains two-dimensional. Therefore, one of ordinary skill in the art would understand that the plurality of nodes in Figure 4 may define a two-dimensional sketch, which corresponds to part (d) of the claimed limitations.

Therefore, the Examiner submits that Santos teaches parts (b), (c), and (d) of the limitations discussed above, and that in combination with Yasuo, teaches part (a). Even if, for the sake of argument, there are differences between the Santos reference and the limitations discussed above, there are enough similarities that it is unlikely that Santos is “silent regarding anything that can reasonably be considered to suggest such recitations”.
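The instruction-graph reading of Santos above (typed nodes such as Sketch, Inset, Cut, and Extrusion, with the geometry remaining two-dimensional until an Extrusion node is reached) can be pictured as a small typed node graph. This is purely an illustrative model for the reader; the class and function names are mine, not code from either reference:

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    """One instruction in the directed graph; node_type is one of several types."""
    node_type: str            # e.g. "Sketch", "Inset", "Cut", "Extrusion"
    inputs: list = field(default_factory=list)

def is_two_dimensional(node: Node) -> bool:
    """The geometry stays 2D until an Extrusion node is evaluated."""
    if node.node_type == "Extrusion":
        return False
    if not node.inputs:
        return True
    return all(is_two_dimensional(n) for n in node.inputs)

# Chain mirroring Fig. 4 of Santos: Sketch -> Inset -> Cut -> Extrusion
sketch = Node("Sketch")
inset = Node("Inset", [sketch])
cut = Node("Cut", [inset])        # e.g. a cut line constraining the sketch
extrude = Node("Extrusion", [cut])

print(is_two_dimensional(cut))      # True: prior to Extrusion the sketch is 2D
print(is_two_dimensional(extrude))  # False: the Extrusion node produces volume
```

In this toy model, each node carries one of a plurality of node types and the pre-Extrusion nodes collectively define the 2D sketch, which is the mapping the Examiner draws to parts (b) through (d).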
Additionally, because both Santos and Yasuo are directed to converting content from a 2D sketch into a 3D image (e.g., Santos teaches “three-dimensional modeling through constrained sketches and its related operations” (Abstract) and Yasuo teaches “element creation based on geometric information of a two-dimensional cross-sectional shape defined at the time of three-dimensional design” (Abstract)), the Examiner submits that one of ordinary skill in the art would be motivated to combine the teachings of Santos and Yasuo.

The Examiner also notes that the Applicant has amended the claims to highlight that “each node of the plurality of nodes compris[es] one of a plurality of node types”. However, in Fig. 4 of Santos cited previously, it is visible that the sketch, when represented by a plurality of nodes, comprises a variety of node types (such as “Inset”, “Cut”, “Extrusion”, etc.). Therefore, the amendments do not appear to overcome the rejections presented in the previous action.

[Image: Nodes representing the sketch in Fig. 4 of Santos.]

For at least the above reasons, the Examiner is not convinced that the combination of Santos in view of Yasuo “fails to teach or suggest each claim recitation”. Accordingly, the §103 rejections are maintained.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1, 3, 13, 15, 17, and 19 are rejected under 35 U.S.C. 103 as being unpatentable over Santos (US 20190066374 A1) in view of Yasuo (JP 2008102767 A). Regarding claim 1: Santos teaches: A method for generating a three-dimensional drawing representing an object (Santos: A method for three-dimensional modeling through constrained sketches, Abstract), the method comprising: presenting, by a processor of a three-dimensional visualization system (Santos: a computer-readable medium storing previous items; a computing device processor to fulfill inputs [0027]) a two-dimensional sketching user interface (Santos: the present invention may include at least one computing device with a user interface [0026]; Santos: Figure 7) receiving, by the processor of the three-dimensional visualization system, two-dimensional sketch data (Santos: constrained sketches 12 [0027]), the two-dimensional sketch data defining a plurality of nodes, each node of the plurality of nodes comprising one of a plurality of node types, (Santos: Instructions are collected as a series of nodes in a graph as represented in FIG. 4, [0041]) and the plurality of nodes defining a two-dimensional sketch (see Note 1C) of at least a portion of the object (Santos: each node representing a type of instruction of a set of instructions; […] wherein the starting sketch is an initial input of the set of instructions [0011]); reading, by the processor of the three-dimensional visualization system, node data for each of the plurality of nodes defined in the two-dimensional sketch data (Santos: A coder would create software capable of describing and executing the concepts of the invention, including but not limited to a system capable of reading: […] a directed graph of instructions 22 (representable through user interface modeling tools 20, as illustrated in FIG. 
4 [0031]); and extruding, by the processor of the three-dimensional visualization system, the three-dimensional drawing representing the object (Santos: an extrude instruction configured to generate volume to planar surfaces of the polygon based on user-defined profile [0012]). Note 1C: In Figures 2 and 3, Santos showcases two-dimensional “constrained sketches 12”. In Figure 4, Santos showcases a graph of nodes or “instructions”. Santos teaches that the constrained sketches “may include a geometric shape, such as a polygon, wherein one or more of the geometric primitives of the geometric shape is associated with constraints (12a, 12b and 16);” [0027] and that “a set of instructions [is] capable of converting a set of inputs into outputs (for example, a cut line 16 constraining the sketch 12)” [0027]. In other words, the instructions in the graph may modify or impose constraints on the sketch. Santos teaches that “Instructions are collected as a series of nodes in a graph as represented in FIG. 4.” Therefore, the constraints for the sketch 12 are represented as nodes. Notably, the node graph in Figure 4 contains a Cut node (between the Inset and Extrusion nodes) that could correspond to a cut line constraint, such as the cut line 16 in the two-dimensional sketch depicted in Figure 3. Santos further teaches that: “The output of the cut instruction will be fed into a new extrude instruction that will create the house's volume based on the profiles assigned to each segment.” [0032]. That is, prior to an Extrude instruction, the sketch remains two-dimensional. Therefore, one of ordinary skill in the art would understand that the plurality of nodes may define a two-dimensional sketch. 
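The extrude step discussed in Note 1C (creating volume from 2D profiles) amounts to lifting each point of a 2D profile into 3D. A minimal illustrative sketch, with an invented `extrude` helper rather than anything from Santos:

```python
def extrude(profile_2d, height):
    """Lift a 2D polygon (list of (x, y) points) into a 3D prism:
    a bottom face at z=0 and a top face at z=height."""
    bottom = [(x, y, 0.0) for x, y in profile_2d]
    top = [(x, y, height) for x, y in profile_2d]
    return bottom + top

square = [(0, 0), (1, 0), (1, 1), (0, 1)]  # 2D sketch profile
volume = extrude(square, 2.0)
print(len(volume))   # 8 vertices: 4 bottom + 4 top
print(volume[4])     # (0, 0, 2.0) -- first top-face vertex
```

Until this step runs, every vertex of the profile has only (x, y) coordinates, which is the sense in which the sketch "remains two-dimensional" prior to an Extrude instruction.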
Santos fails to explicitly teach: receiving, by the processor of the three-dimensional visualization system, two-dimensional sketch data through the two-dimensional sketch user interface Note for “fails to teach” sections: ONLY for paragraphs (each ‘paragraph’ is separated by one or more line breaks) that contain bolded text, the bolded text is the material not taught by the reference in question, whereas non-bolded text is material that is taught. Any material within a paragraph that is included in a “fails to teach” section that contains no bolded text is not taught by the reference in question. Yasuo teaches: presenting, by a processor of a three-dimensional visualization system (Yasuo: the data processing device 2 includes a personal computer, the display device 4 includes a display, and the external storage device 3 includes an HDD, a DVD, and the like, [0048]), a two-dimensional sketching user interface (Yasuo: FIG. 4 shows an example of a GUI screen activated by the three-dimensional CAD system [0025]); receiving, by the processor of the three-dimensional visualization system, two-dimensional sketch data through the two-dimensional sketch user interface (Yasuo: When the XZ plane is designated as the sketch plane, the screen changes to the sketch mode shown in FIG. 5 and the GUI screen changes to a view perpendicular to the sketch plane. The designer sketches a two-dimensional profile using the input device 1 such as a mouse. [0053]), the two-dimensional sketch data defining a plurality of nodes and the plurality of nodes defining a two-dimensional sketch of at least a portion of the object (Yasuo: In step B9, the element dividing means 20 divides the basic figure into two-dimensional elements. 
At that time, the division is performed so that the number of divided adjacent sides is the same, and the nodes are shared, [0039]; see Note 1A); reading, by the processor of the three-dimensional visualization system, node data for each of the plurality of nodes defined in the two-dimensional sketch data (see Note 1B); and extruding, by the processor of the three-dimensional visualization system, the three-dimensional drawing representing the object (Yasuo: By a three-dimensional operation such as extrusion performed in the process of three-dimensional design, the basic shape region is also simultaneously extruded and converted into a three-dimensional volume, [0014]) from the node data (Yasuo: In step B9, the element dividing means 20 divides the basic figure into two-dimensional elements. At that time, the division is performed so that the number of divided adjacent sides is the same, and the nodes are shared [0039]; see Note 1A). Note 1A: Yasuo teaches that the sketch may be divided into elements or “nodes”, and that the nodes are then shared. Yasuo further teaches: “the two-dimensional profile divided into the elements is similarly extruded and three-dimensionalized,” [0040]. Therefore, the extrusion is performed on the sketch representing the object from the node data. Note 1B: Extruding the sketch based on the node data, as described in Note 1A, will inherently require the processor to read the node data that has been received. Before the effective filing date of the claimed invention, it would have been obvious to a person having ordinary skill in the art to combine the teachings of Yasuo with Santos. 
Receiving, by the processor of the three-dimensional visualization system, two-dimensional sketch data through the two-dimensional sketch user interface, as in Yasuo, would benefit the Santos teachings by enabling a user to easily draw sketches with a preferred input device (Yasuo: The designer sketches a two-dimensional profile using the input device 1 such as a mouse, [0053]) without having to worry about the intricacies of 3D movement (Yasuo: When the XZ plane is designated as the sketch plane, the screen changes to the sketch mode shown in FIG. 5 and the GUI screen changes to a view perpendicular to the sketch plane [0053]). Regarding claim 3: Santos in view of Yasuo teaches: The method of claim 1 (as shown above), wherein receiving the two-dimensional sketch data comprises receiving a user input through the two-dimensional sketching user interface (Yasuo: The designer first designates a sketch plane for creating the first feature. When the XZ plane is designated as the sketch plane, the screen changes to the sketch mode shown in FIG. 5 and the GUI screen changes to a view perpendicular to the sketch plane. 
The designer sketches a two-dimensional profile using the input device 1 such as a mouse, [0053]).

Regarding claim 13: Santos teaches: A system comprising: a processor; and a memory coupled with and readable by the processor (Santos: The computing device may include at least one processing unit and a form of memory, [0026]) and storing therein a set of instructions which, when executed by the processor, causes the processor to generate a three-dimensional drawing representing an object (Santos: The computing device includes a program product including a machine-readable program code for causing, when executed, the computing device to perform steps, [0026]) by: presenting a two-dimensional sketching user interface (Santos: the present invention may include at least one computing device with a user interface [0026]; Santos: Figure 7); receiving two-dimensional sketch data (Santos: constrained sketches 12 [0027]), the two-dimensional sketch data defining a plurality of nodes, each node of the plurality of nodes comprising one of a plurality of node types, (Santos: Instructions are collected as a series of nodes in a graph as represented in FIG. 4, [0041]) and the plurality of nodes defining a two-dimensional sketch (see Note 1C) of at least a portion of the object (Santos: each node representing a type of instruction of a set of instructions; […] wherein the starting sketch is an initial input of the set of instructions [0011]); reading node data for each of the plurality of nodes defined in the two-dimensional sketch data (Santos: A coder would create software capable of describing and executing the concepts of the invention, including but not limited to a system capable of reading: […] a directed graph of instructions 22 (representable through user interface modeling tools 20, as illustrated in FIG.
4 [0031]); and extruding the three-dimensional drawing representing the object (Santos: an extrude instruction configured to generate volume to planar surfaces of the polygon based on user-defined profile [0012]). Santos fails to explicitly teach: receiving two-dimensional sketch data through the two-dimensional sketch user interface Yasuo teaches: presenting a two-dimensional sketching user interface (Yasuo: FIG. 4 shows an example of a GUI screen activated by the three-dimensional CAD system [0025]); receiving two-dimensional sketch data through the two-dimensional sketch user interface (Yasuo: When the XZ plane is designated as the sketch plane, the screen changes to the sketch mode shown in FIG. 5 and the GUI screen changes to a view perpendicular to the sketch plane. The designer sketches a two-dimensional profile using the input device 1 such as a mouse. [0053]), the two-dimensional sketch data defining a plurality of nodes and the plurality of nodes defining a two-dimensional sketch of at least a portion of the object (Yasuo: In step B9, the element dividing means 20 divides the basic figure into two-dimensional elements. At that time, the division is performed so that the number of divided adjacent sides is the same, and the nodes are shared, [0039]; see Note 1A); reading node data for each of the plurality of nodes defined in the two-dimensional sketch data (see Note 1B); and extruding the three-dimensional drawing representing the object (Yasuo: By a three-dimensional operation such as extrusion performed in the process of three-dimensional design, the basic shape region is also simultaneously extruded and converted into a three-dimensional volume, [0014]) from the node data (Yasuo: In step B9, the element dividing means 20 divides the basic figure into two-dimensional elements. At that time, the division is performed so that the number of divided adjacent sides is the same, and the nodes are shared [0039]; see Note 1A). 
Before the effective filing date of the claimed invention, it would have been obvious to a person having ordinary skill in the art to combine the teachings of Yasuo with Santos. Receiving, by the processor of the three-dimensional visualization system, two-dimensional sketch data through the two-dimensional sketch user interface, as in Yasuo, would benefit the Santos teachings by enabling a user to easily draw sketches with a preferred input device (Yasuo: The designer sketches a two-dimensional profile using the input device 1 such as a mouse, [0053]) without having to worry about the intricacies of 3D movement (Yasuo: When the XZ plane is designated as the sketch plane, the screen changes to the sketch mode shown in FIG. 5 and the GUI screen changes to a view perpendicular to the sketch plane [0053]). Regarding claim 15: Santos in view of Yasuo teaches: The system of claim 13 (as shown above), wherein receiving the two-dimensional sketch data comprises receiving a user input through the two-dimensional sketching user interface (Yasuo: The designer first designates a sketch plane for creating the first feature. When the XZ plane is designated as the sketch plane, the screen changes to the sketch mode shown in FIG. 5 and the GUI screen changes to a view perpendicular to the sketch plane. 
The designer sketches a two-dimensional profile using the input device 1 such as a mouse, [0053]).

Regarding claim 17: Santos teaches: A non-transitory, computer-readable medium comprising a set of instructions stored therein which, when executed by a processor, causes the processor to generate a three-dimensional drawing representing an object (Santos: The computing device includes a program product including a machine-readable program code for causing, when executed, the computing device to perform steps, [0026]) by: presenting a two-dimensional sketching user interface (Santos: the present invention may include at least one computing device with a user interface [0026]; Santos: Figure 7); receiving two-dimensional sketch data (Santos: constrained sketches 12 [0027]), the two-dimensional sketch data defining a plurality of nodes, each node of the plurality of nodes comprising one of a plurality of node types, (Santos: Instructions are collected as a series of nodes in a graph as represented in FIG. 4, [0041]) and the plurality of nodes defining a two-dimensional sketch (see Note 1C) of at least a portion of the object (Santos: each node representing a type of instruction of a set of instructions; […] wherein the starting sketch is an initial input of the set of instructions [0011]); reading node data for each of the plurality of nodes defined in the two-dimensional sketch data (Santos: A coder would create software capable of describing and executing the concepts of the invention, including but not limited to a system capable of reading: […] a directed graph of instructions 22 (representable through user interface modeling tools 20, as illustrated in FIG. 4 [0031])); and extruding the three-dimensional drawing representing the object (Santos: an extrude instruction configured to generate volume to planar surfaces of the polygon based on user-defined profile [0012]).
Santos fails to explicitly teach: receiving two-dimensional sketch data through the two-dimensional sketch user interface Yasuo teaches: presenting a two-dimensional sketching user interface (Yasuo: FIG. 4 shows an example of a GUI screen activated by the three-dimensional CAD system [0025]); receiving two-dimensional sketch data through the two-dimensional sketch user interface (Yasuo: When the XZ plane is designated as the sketch plane, the screen changes to the sketch mode shown in FIG. 5 and the GUI screen changes to a view perpendicular to the sketch plane. The designer sketches a two-dimensional profile using the input device 1 such as a mouse. [0053]), the two-dimensional sketch data defining a plurality of nodes and the plurality of nodes defining a two-dimensional sketch of at least a portion of the object (Yasuo: In step B9, the element dividing means 20 divides the basic figure into two-dimensional elements. At that time, the division is performed so that the number of divided adjacent sides is the same, and the nodes are shared, [0039]; see Note 1A); reading node data for each of the plurality of nodes defined in the two-dimensional sketch data (see Note 1B); and extruding the three-dimensional drawing representing the object (Yasuo: By a three-dimensional operation such as extrusion performed in the process of three-dimensional design, the basic shape region is also simultaneously extruded and converted into a three-dimensional volume, [0014]) from the node data (Yasuo: In step B9, the element dividing means 20 divides the basic figure into two-dimensional elements. At that time, the division is performed so that the number of divided adjacent sides is the same, and the nodes are shared [0039]; see Note 1A). Before the effective filing date of the claimed invention, it would have been obvious to a person having ordinary skill in the art to combine the teachings of Yasuo with Santos. 
Receiving, by the processor of the three-dimensional visualization system, two-dimensional sketch data through the two-dimensional sketch user interface, as in Yasuo, would benefit the Santos teachings by enabling a user to easily draw sketches with a preferred input device (Yasuo: The designer sketches a two-dimensional profile using the input device 1 such as a mouse, [0053]) without having to worry about the intricacies of 3D movement (Yasuo: When the XZ plane is designated as the sketch plane, the screen changes to the sketch mode shown in FIG. 5 and the GUI screen changes to a view perpendicular to the sketch plane [0053]).

Regarding claim 19: Santos in view of Yasuo teaches: The non-transitory, computer-readable medium of claim 17 (as shown above), wherein receiving the two-dimensional sketch data comprises receiving a user input through the two-dimensional sketching user interface (Yasuo: The designer first designates a sketch plane for creating the first feature. When the XZ plane is designated as the sketch plane, the screen changes to the sketch mode shown in FIG. 5 and the GUI screen changes to a view perpendicular to the sketch plane. The designer sketches a two-dimensional profile using the input device 1 such as a mouse, [0053]).

Claims 2, 14, and 18 are rejected under 35 U.S.C. 103 as being unpatentable over Santos (US 20190066374 A1) in view of Yasuo (JP 2008102767 A) and Fu (US 20130262041 A1).
Regarding claim 2: Santos in view of Yasuo teaches: The method of claim 1 (as shown above), further comprising: Santos in view of Yasuo fails to teach: receiving, by the processor of the three-dimensional visualization system, manipulation of the two-dimensional sketch data through the two-dimensional sketching user interface; tracking, by the processor of the three-dimensional visualization system, the received manipulations of the two-dimensional sketch data; and updating, by the processor of the three-dimensional visualization system, the three-dimensional drawing based on the tracking of the received manipulations of the two-dimensional sketch data. Fu teaches: receiving, by the processor of the three-dimensional visualization system, manipulation of the two-dimensional sketch data through the two-dimensional sketching user interface (Fu: a two-dimensional sketch having objects in a view plane of the user is received. […] For example, a user draws lines or curves on a piece of paper or a screen of a computer [0032]; Fu: The sketching tool may be used to edit or add to an already existing 3D model. [0082]); tracking, by the processor of the three-dimensional visualization system, the received manipulations of the two-dimensional sketch data (see Note 2A); and updating, by the processor of the three-dimensional visualization system, the three-dimensional drawing based on the tracking of the received manipulations of the two-dimensional sketch data (Fu: The user draws in two-dimensions to indicate the addition or change relative to the 3D model. The processor 16 extrapolates the drawing relative to the 3D model [0082]). Note 2A: In order for the processor to extrapolate the drawing relative to the 3D model, the processor must inherently track the drawing or ‘received manipulations’ from the user. 
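Fu's receive-track-update flow described above (receive a 2D manipulation, track it, then regenerate the 3D drawing) can be pictured as a small edit log that drives re-extrusion. All class and method names here are hypothetical stand-ins for illustration, not Fu's implementation:

```python
class SketchSession:
    """Tracks 2D sketch manipulations and keeps a 3D drawing in sync."""

    def __init__(self, profile_2d, height):
        self.profile = list(profile_2d)
        self.height = height
        self.edit_log = []              # tracked manipulations
        self.drawing_3d = self._rebuild()

    def _rebuild(self):
        # Re-extrude: bottom face at z=0, top face at z=height.
        return ([(x, y, 0.0) for x, y in self.profile]
                + [(x, y, self.height) for x, y in self.profile])

    def manipulate(self, index, new_point):
        """Receive a 2D manipulation, track it, and update the 3D drawing."""
        self.edit_log.append((index, self.profile[index], new_point))
        self.profile[index] = new_point
        self.drawing_3d = self._rebuild()

session = SketchSession([(0, 0), (1, 0), (1, 1), (0, 1)], height=2.0)
session.manipulate(2, (2, 2))   # drag one corner of the 2D sketch
print(len(session.edit_log))    # 1 tracked manipulation
print(session.drawing_3d[2])    # (2, 2, 0.0) -- 3D drawing updated
```

The edit log stands in for the "tracking" limitation, and the rebuild on every manipulation stands in for "updating the three-dimensional drawing based on the tracking".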
Before the effective filing date of the claimed invention, it would have been obvious to a person having ordinary skill in the art to combine the teachings of Fu with Santos in view of Yasuo. Receiving a manipulation of the sketch from the user, tracking said manipulation, and updating the three dimensional drawing, as in Fu, would benefit the Santos in view of Yasuo teachings by enabling a user to refine a sketch while minimizing errors in the output 3D sketch data (Fu: The geometry of the 2D sketch is altered in the view plane (x, y) or 2D input instead of maintaining the geometry of the 2D sketch and only modifying in the view direction (z). Gaps may be reduced through alteration of the 2D geometry, Abstract). Regarding claim 14: Santos in view of Yasuo teaches: The system of claim 13 (as shown above), wherein the instructions further cause the processor to: Santos in view of Yasuo fails to teach: receiving manipulation of the two-dimensional sketch data through the two-dimensional sketching user interface; tracking the received manipulations of the two-dimensional sketch data; and updating the three-dimensional drawing based on the tracking of the received manipulations of the two-dimensional sketch data. Fu teaches: receiving, by the processor of the three-dimensional visualization system, manipulation of the two-dimensional sketch data through the two-dimensional sketching user interface (Fu: a two-dimensional sketch having objects in a view plane of the user is received. […] For example, a user draws lines or curves on a piece of paper or a screen of a computer [0032]; Fu: The sketching tool may be used to edit or add to an already existing 3D model. 
[0082]); tracking, by the processor of the three-dimensional visualization system, the received manipulations of the two-dimensional sketch data (see Note 2A); and updating, by the processor of the three-dimensional visualization system, the three-dimensional drawing based on the tracking of the received manipulations of the two-dimensional sketch data (Fu: The user draws in two-dimensions to indicate the addition or change relative to the 3D model. The processor 16 extrapolates the drawing relative to the 3D model [0082]). Before the effective filing date of the claimed invention, it would have been obvious to a person having ordinary skill in the art to combine the teachings of Fu with Santos in view of Yasuo. Receiving a manipulation of the sketch from the user, tracking said manipulation, and updating the three-dimensional drawing, as in Fu, would benefit the Santos in view of Yasuo teachings by enabling a user to refine a sketch while minimizing errors in the output 3D sketch data (Fu: The geometry of the 2D sketch is altered in the view plane (x, y) or 2D input instead of maintaining the geometry of the 2D sketch and only modifying in the view direction (z). Gaps may be reduced through alteration of the 2D geometry, Abstract).

Regarding claim 18: Santos in view of Yasuo teaches: The non-transitory, computer-readable medium of claim 17 (as shown above), wherein the instructions further cause the processor to: Santos in view of Yasuo fails to teach: receiving manipulation of the two-dimensional sketch data through the two-dimensional sketching user interface; tracking the received manipulations of the two-dimensional sketch data; and updating the three-dimensional drawing based on the tracking of the received manipulations of the two-dimensional sketch data. 
Fu teaches: receiving, by the processor of the three-dimensional visualization system, manipulation of the two-dimensional sketch data through the two-dimensional sketching user interface (Fu: a two-dimensional sketch having objects in a view plane of the user is received. […] For example, a user draws lines or curves on a piece of paper or a screen of a computer [0032]; Fu: The sketching tool may be used to edit or add to an already existing 3D model. [0082]); tracking, by the processor of the three-dimensional visualization system, the received manipulations of the two-dimensional sketch data (see Note 2A); and updating, by the processor of the three-dimensional visualization system, the three-dimensional drawing based on the tracking of the received manipulations of the two-dimensional sketch data (Fu: The user draws in two-dimensions to indicate the addition or change relative to the 3D model. The processor 16 extrapolates the drawing relative to the 3D model [0082]). Before the effective filing date of the claimed invention, it would have been obvious to a person having ordinary skill in the art to combine the teachings of Fu with Santos in view of Yasuo. Receiving a manipulation of the sketch from the user, tracking said manipulation, and updating the three-dimensional drawing, as in Fu, would benefit the Santos in view of Yasuo teachings by enabling a user to refine a sketch while minimizing errors in the output 3D sketch data (Fu: The geometry of the 2D sketch is altered in the view plane (x, y) or 2D input instead of maintaining the geometry of the 2D sketch and only modifying in the view direction (z). Gaps may be reduced through alteration of the 2D geometry, Abstract).

Claims 4, 16, and 20 are rejected under 35 U.S.C. 103 as being unpatentable over Santos (US 20190066374 A1) in view of Yasuo (JP 2008102767 A) and Ramani (US 20040249809 A1). 
Regarding claim 4: Santos in view of Yasuo teaches: The method of claim 1 (as shown above), wherein receiving the two-dimensional sketch data comprises receiving an indication of an image file (Santos: A popular approach, for which several patents have been granted, try to reconstruct 3D meshes from images and/or video. The inventor considers these methods orthogonal to the present invention in the sense that the objective of reconstruction is to reproduce existing objects whereas the present invention intends to create new ones. [0005]) Santos in view of Yasuo fails to teach: wherein receiving the two-dimensional sketch data comprises receiving an indication of an image file through the two-dimensional sketching user interface and reading the indicated image file. Ramani teaches: wherein receiving the two-dimensional sketch data comprises receiving an indication of an image file (Ramani: raster images [0363]) through the two-dimensional sketching user interface (Ramani: 2D Drawing Interface: Users are provided options to send in their scanned drawings to the server. [0213-0214]) and reading the indicated image file (Ramani: A complete raster to vector conversion process includes image acquisition, pre-processing, line tracing, text extraction, shape recognition, topology creation and attribute assignment [0367]; see Note 4A). Note 4A: Ramani teaches that text extraction and shape recognition may be performed on the raster image. Both text extraction and shape recognition require reading the raster image data. Before the effective filing date of the claimed invention, it would have been obvious to a person having ordinary skill in the art to combine the teachings of Ramani with Santos in view of Yasuo. 
Receiving an indication of an image file through a user interface and reading the image file, as in Ramani, would benefit the Santos in view of Yasuo teachings by enabling a user to import a previously created sketch or image rather than recreating the sketch within the software.

Regarding claim 16: Santos in view of Yasuo teaches: The system of claim 13 (as shown above), wherein receiving the two-dimensional sketch data comprises receiving an indication of an image file (Santos: A popular approach, for which several patents have been granted, try to reconstruct 3D meshes from images and/or video. The inventor considers these methods orthogonal to the present invention in the sense that the objective of reconstruction is to reproduce existing objects whereas the present invention intends to create new ones. [0005]) Santos in view of Yasuo fails to teach: wherein receiving the two-dimensional sketch data comprises receiving an indication of an image file through the two-dimensional sketching user interface and reading the indicated image file. Ramani teaches: wherein receiving the two-dimensional sketch data comprises receiving an indication of an image file (Ramani: raster images [0363]) through the two-dimensional sketching user interface (Ramani: 2D Drawing Interface: Users are provided options to send in their scanned drawings to the server. [0213-0214]) and reading the indicated image file (Ramani: A complete raster to vector conversion process includes image acquisition, pre-processing, line tracing, text extraction, shape recognition, topology creation and attribute assignment [0367]; see Note 4A). Before the effective filing date of the claimed invention, it would have been obvious to a person having ordinary skill in the art to combine the teachings of Ramani with Santos in view of Yasuo. 
Receiving an indication of an image file through a user interface and reading the image file, as in Ramani, would benefit the Santos in view of Yasuo teachings by enabling a user to import a previously created sketch or image rather than recreating the sketch within the software.

Regarding claim 20: Santos in view of Yasuo teaches: The non-transitory, computer-readable medium of claim 17 (as shown above), wherein receiving the two-dimensional sketch data comprises receiving an indication of an image file (Santos: A popular approach, for which several patents have been granted, try to reconstruct 3D meshes from images and/or video. The inventor considers these methods orthogonal to the present invention in the sense that the objective of reconstruction is to reproduce existing objects whereas the present invention intends to create new ones. [0005]) Santos in view of Yasuo fails to teach: wherein receiving the two-dimensional sketch data comprises receiving an indication of an image file through the two-dimensional sketching user interface and reading the indicated image file. Ramani teaches: wherein receiving the two-dimensional sketch data comprises receiving an indication of an image file (Ramani: raster images [0363]) through the two-dimensional sketching user interface (Ramani: 2D Drawing Interface: Users are provided options to send in their scanned drawings to the server. [0213-0214]) and reading the indicated image file (Ramani: A complete raster to vector conversion process includes image acquisition, pre-processing, line tracing, text extraction, shape recognition, topology creation and attribute assignment [0367]; see Note 4A). Before the effective filing date of the claimed invention, it would have been obvious to a person having ordinary skill in the art to combine the teachings of Ramani with Santos in view of Yasuo. 
Receiving an indication of an image file through a user interface and reading the image file, as in Ramani, would benefit the Santos in view of Yasuo teachings by enabling a user to import a previously created sketch or image rather than recreating the sketch within the software.

Claims 5 and 6 are rejected under 35 U.S.C. 103 as being unpatentable over Santos (US 20190066374 A1) in view of Yasuo (JP 2008102767 A) and Blender Documentation Team (NPL: Transform Node; hereinafter Blender A).

Regarding claim 5: Santos in view of Yasuo teaches: The method of claim 1 (as shown above), Santos in view of Yasuo fails to teach: wherein the plurality of node types comprises a mesh node type, the mesh node type defining one or more of position, rotation, or scaling for the node. Blender A teaches: wherein the plurality of node types comprises a mesh node type, the mesh node type defining one or more of position, rotation, or scaling for the node (Blender A: The Transform Node allows you to move, rotate or scale the geometry, Pg. 1, par. 1; see Note 5A and Note 7A below). Note 5A: The Transform Node taught by Blender A visibly contains the position, rotation, and scaling parameters. Therefore, it is reasonable to consider the position, rotation, and scaling as defined “for the node”. [Image: Transform Node as seen on Pg. 1 of the Blender A reference.] Before the effective filing date of the claimed invention, it would have been obvious to a person having ordinary skill in the art to combine the teachings of Blender A with Santos in view of Yasuo. Including a mesh node defining one or more of position, rotation, or scaling for the node, as in Blender A, would benefit the Santos in view of Yasuo teachings by enabling a user to freely transform mesh geometry affected by the node graph. 
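The transform-node behavior at issue for claim 5 (a node whose position, rotation, and scale parameters are applied to geometry) can be sketched in a few lines of Python. This is an illustrative editor's sketch only; the class and parameter names are hypothetical and are not drawn from Santos, Yasuo, or Blender A.

```python
import math
from dataclasses import dataclass


@dataclass
class TransformNode:
    # Hypothetical mesh-node parameters mirroring the claim language
    # "position, rotation, or scaling for the node": a 2D translation,
    # a rotation about the origin in radians, and a uniform scale.
    position: tuple = (0.0, 0.0)
    rotation: float = 0.0
    scale: float = 1.0

    def apply(self, vertices):
        """Scale, then rotate, then translate each (x, y) vertex."""
        c, s = math.cos(self.rotation), math.sin(self.rotation)
        out = []
        for x, y in vertices:
            x, y = x * self.scale, y * self.scale      # scaling
            x, y = x * c - y * s, x * s + y * c        # rotation
            out.append((x + self.position[0],          # translation
                        y + self.position[1]))
        return out


# A point rotated 90 degrees about the origin, then shifted right by 2.
node = TransformNode(position=(2.0, 0.0), rotation=math.pi / 2)
print(node.apply([(1.0, 0.0)]))
```

In a node-graph system of the kind the claims describe, such a node would take mesh geometry as input and emit the transformed geometry for downstream nodes.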
Regarding claim 6: Santos in view of Yasuo teaches: The method of claim 1 (as shown above), Santos in view of Yasuo fails to teach: wherein the plurality of node types comprises a mesh feature node type, the mesh feature node type providing for manipulation of one or more features of the node. Blender A teaches: wherein the plurality of node types comprises a mesh feature node type, the mesh feature node type providing for manipulation of one or more features of the node (see Note 6A and Note 7A). Note 6A: In Note 5A, it was shown that Blender A teaches a Transform Node where position, rotation, and scaling are all features of the node. Blender A further teaches that the position, rotation, and scale parameters are inputs, and therefore can be changed by the user. Therefore, it is reasonable to conclude that Blender A teaches a mesh feature node providing for manipulation of one or more features of the node. Before the effective filing date of the claimed invention, it would have been obvious to a person having ordinary skill in the art to combine the teachings of Blender A with Santos in view of Yasuo. Including a mesh feature node providing for manipulation of one or more features of the node, as in Blender A, would benefit the Santos in view of Yasuo teachings by enabling a user to freely transform mesh features affected by the node graph.

Claims 7 and 11 are rejected under 35 U.S.C. 103 as being unpatentable over Santos (US 20190066374 A1) in view of Yasuo (JP 2008102767 A) and Science Monkey (NPL: Mate Connectors in Onshape).

Regarding claim 7: Santos in view of Yasuo teaches: The method of claim 1 (as shown above), Santos in view of Yasuo fails to teach: wherein the plurality of node types comprises a connector node type, the connector node type comprising a position property and a direction property defining point on a mesh for use by other nodes. 
Santos in view of Yasuo and Science Monkey teaches: wherein the plurality of node types comprises a connector node type, the connector node type comprising a position property and a direction property defining point on a mesh for use by other nodes (see Note 7A and Note 7B). Note 7A: Santos teaches: “A user may define a starting (or root) sketch 24 as a labeled polygon or similar and will proceed to apply instructions to it in order to arrive at a 3D model of its design” [0032]. That is, each operation performed by the user may be considered an “instruction” and therefore be added as a corresponding node: “Instructions are collected as a series of nodes in a graph as represented in FIG. 4,” [0041]. Note 7B: Science Monkey creates a mate connector as part of creating a sketch at 6:46 in the video. The mate connector is shown to have a position in 3D space. Then, Science Monkey re-aligns the mate connector at a 30-degree angle. [Image: Science Monkey places a mate connector at 6:46.] [Images: Science Monkey changes the direction of the mate connector at 8:22 in the video.] It is shown by the video that Science Monkey gives instructions to the software to create a mate connector on a mesh with a specific position and direction. In Note 7A, it was shown that all instructions may be turned into nodes in a node graph. Santos teaches: “The list of possible instructions is constrained only by a user's imaginations and includes all algorithms in the computing device graphics literature.” [0031] Therefore, it would be obvious to one of ordinary skill in the art to generate a mate connector node or a “connector node” comprising a position property and a direction property defining point on a mesh for use by other nodes. 
Before the effective filing date of the claimed invention, it would have been obvious to a person having ordinary skill in the art to combine the teachings of Science Monkey with Santos in view of Yasuo. Including a connector node comprising a position property and a direction property defining point on a mesh for use by other nodes, as in Science Monkey, would benefit the Santos in view of Yasuo teachings by enabling a user to freely connect mesh geometry affected by the node graph.

Regarding claim 11: Santos in view of Yasuo teaches: The method of claim 1 (as shown above), Santos in view of Yasuo fails to teach: wherein the plurality of node types comprises a mate node type, the mate node type comprising properties to link a plurality of other nodes. Science Monkey teaches: wherein the plurality of node types comprises a mate node type, the mate node type comprising properties to link a plurality of other nodes (see Note 11A and Note 7A above). Note 11A: Science Monkey showcases that a “mate” may be connected to another mate at 5:43 to 5:46 in the video. [Image: Selecting the first mate at 5:43.] [Image: Selecting a second mate.] [Image: Connecting the first mate with the second mate.] It is shown by the video that Science Monkey gives instructions to the software to link one mate to another. In Note 7A, it was shown that all instructions may be turned into nodes in a node graph. Santos teaches: “The list of possible instructions is constrained only by a user's imaginations and includes all algorithms in the computing device graphics literature.” [0031] Therefore, it would be obvious to one of ordinary skill in the art to generate a mate node comprising properties to link a plurality of other nodes. 
Before the effective filing date of the claimed invention, it would have been obvious to a person having ordinary skill in the art to combine the teachings of Science Monkey with Santos in view of Yasuo. Including a mate node comprising properties to link a plurality of other nodes, as in Science Monkey, would benefit the Santos in view of Yasuo teachings by enabling a user to freely link mesh geometry affected by the node graph.

Claim 8 is rejected under 35 U.S.C. 103 as being unpatentable over Santos (US 20190066374 A1) in view of Yasuo (JP 2008102767 A) and Blender Documentation Team (NPL: Image Texture Node; hereinafter Blender B).

Santos in view of Yasuo teaches: The method of claim 1 (as shown above), Santos in view of Yasuo fails to teach: wherein the plurality of node types comprises a texture layer node type, the texture layer node type comprising a path to an image and properties to scale, blend, or colorize the image to other nodes. Blender B teaches: wherein the plurality of node types comprises a texture layer node type, the texture layer node type comprising a path to an image (Blender B: The Image Texture is used to add an image file as a texture, Pg. 1, par. 1; see Note 8A) and properties to scale (Blender B: Texture coordinate for texture look-up, Pg. 1, par. 2; see Note 8B), blend, or colorize the image to other nodes. Note 8A: Adding an image file as a texture requires obtaining the path to the image file. Note 8B: It is well known in the art that texture coordinates (also known as UV coordinates) may be used to offset, rotate, or scale an image texture on a mesh. Before the effective filing date of the claimed invention, it would have been obvious to a person having ordinary skill in the art to combine the teachings of Blender B with Santos in view of Yasuo. 
Including a texture layer node comprising a path to an image and properties to scale, blend, or colorize the image to other nodes, as in Blender B, would benefit the Santos in view of Yasuo teachings by enabling a user to add textures to a mesh in order to simulate a real object.

Claim 9 is rejected under 35 U.S.C. 103 as being unpatentable over Santos (US 20190066374 A1) in view of Yasuo (JP 2008102767 A), Blender Documentation Team (NPL: Image Texture Node; hereinafter Blender B) and Blender Documentation Team (NPL: Principled BSDF, hereinafter Blender C).

Santos in view of Yasuo and Blender B teaches: The method of claim 8 (as shown above), Santos in view of Yasuo and Blender B fails to teach: wherein the plurality of node types comprises a material node type, the material node type comprising a plurality of ordered texture nodes. Blender C teaches: wherein the plurality of nodes comprise a material node (Blender C: This “Uber” shader includes multiple layers to create a wide variety of materials, Pg. 1, par. 2), the material node comprising a plurality of ordered texture nodes (Blender C: Image textures painted or baked from software like Substance Painter® may be directly linked to the corresponding parameters in this shader, Pg. 1, par. 1; see Note 9A). Before the effective filing date of the claimed invention, it would have been obvious to a person having ordinary skill in the art to combine the teachings of Blender C with Santos in view of Yasuo and Blender B. Including a material node comprising a plurality of ordered texture nodes, as in Blender C, would benefit the Santos in view of Yasuo and Blender B teachings by enabling a user to combine multiple textures into a single material in order to simulate a real object. Note 9A: The Principled BSDF node has inputs for image textures listed in a specific order. Therefore, it is reasonable to conclude that the Principled BSDF node will link to texture nodes in order. 
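The texture-layer and material node types discussed for claims 8 and 9 (an image path plus scale, blend, and colorize properties, with a material holding an ordered stack of texture layers) can likewise be sketched in Python. This is an illustrative sketch only; the class names, parameters, and file names are hypothetical and are not taken from Blender B or Blender C. The `scale` property is carried but unused here, since it would drive UV look-up in a real renderer, and no image I/O is performed: `shade()` operates on an already-sampled RGB texel.

```python
from dataclasses import dataclass, field


@dataclass
class TextureLayerNode:
    # Hypothetical properties mirroring the claim language: a path to an
    # image plus scale, blend, and colorize (tint) parameters.
    path: str
    scale: float = 1.0
    blend: float = 1.0                 # 0 = layer invisible, 1 = fully applied
    tint: tuple = (1.0, 1.0, 1.0)      # colorize: per-channel multiplier

    def shade(self, base, texel):
        # Colorize the sampled texel, then alpha-blend it over the base color.
        colored = tuple(t * k for t, k in zip(texel, self.tint))
        return tuple(b * (1.0 - self.blend) + c * self.blend
                     for b, c in zip(base, colored))


@dataclass
class MaterialNode:
    # A material as an ordered stack of texture layers, applied first-to-last,
    # so layer order changes the shaded result.
    layers: list = field(default_factory=list)

    def shade(self, base, texels):
        for layer, texel in zip(self.layers, texels):
            base = layer.shade(base, texel)
        return base


mat = MaterialNode([TextureLayerNode("base.png"),
                    TextureLayerNode("decal.png", blend=0.5, tint=(1.0, 0.0, 0.0))])
print(mat.shade((0.0, 0.0, 0.0), [(0.2, 0.2, 0.2), (1.0, 1.0, 1.0)]))
```

The ordering of `layers` is what makes the stack "a plurality of ordered texture nodes": each layer blends over the result of the layers before it.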
Claim 10 is rejected under 35 U.S.C. 103 as being unpatentable over Santos (US 20190066374 A1) in view of Yasuo (JP 2008102767 A) and Blender Documentation Team (NPL: Light Node; hereinafter Blender D).

Santos in view of Yasuo teaches: The method of claim 1 (as shown above), Santos in view of Yasuo fails to teach: wherein the plurality of node types comprises a light node type, the light node type defining properties affecting light or color on a surface of the three-dimensional drawing. Blender D teaches: wherein the plurality of nodes comprise a light node, the light node defining properties affecting light or color on a surface (Blender D: The Light Output node is used to output light information to a light object, Pg. 1, par. 1; Blender D: Inputs: Surface: Shading for the surface of the light object, Pg. 1, par. 2) of the three-dimensional drawing. Before the effective filing date of the claimed invention, it would have been obvious to a person having ordinary skill in the art to combine the teachings of Blender D with Santos in view of Yasuo. Including a light node, the light node defining properties affecting light or color on a surface of the three-dimensional drawing, as in Blender D, would benefit the Santos in view of Yasuo teachings by enabling a user to add lighting effects to a scene in order to simulate a real object.

Claim 12 is rejected under 35 U.S.C. 103 as being unpatentable over Santos (US 20190066374 A1) in view of Yasuo (JP 2008102767 A), Onshape (NPL: Note; hereinafter Onshape A), and Blender Documentation Team (NPL: Measure, hereinafter Blender E).

Santos in view of Yasuo teaches: The method of claim 1 (as shown above), Santos in view of Yasuo fails to teach: wherein the plurality of node types comprises an annotation and dimension node type, the annotation and dimension node type comprising text and properties to attach to connector nodes. 
Onshape A teaches: an annotation tool, the annotation comprising text and properties to attach to connector nodes (Onshape A: Add single or multi-line text notes to any drawing, wherever you want, and use them to fill in the title blocks as well. You have the ability to define the size of the text box as well as format the text itself and optionally include a leader or many leaders, Pg. 1, par. 1; see Note 12A). Note 12A: Onshape A showcases that annotations can be added to a mesh. In Note 7A, it was shown that all instructions may be turned into nodes in a node graph. Santos teaches: “The list of possible instructions is constrained only by a user's imaginations and includes all algorithms in the computing device graphics literature.” [0031] Therefore, it would be obvious to one of ordinary skill in the art to generate an annotation node comprising text and properties to attach to connector nodes. Before the effective filing date of the claimed invention, it would have been obvious to a person having ordinary skill in the art to combine the teachings of Onshape A with Santos in view of Yasuo. Including an annotation node comprising text and properties to attach to other nodes, as in Onshape A, would benefit the Santos in view of Yasuo teachings by enabling a user to annotate their sketch without modifying the underlying geometry. Santos in view of Yasuo and Onshape A still fails to teach: wherein the plurality of nodes comprise an annotation and dimension node, the annotation and dimension node comprising text and properties to attach to connector nodes. Blender E teaches: a dimension tool comprising text and properties to attach to connector nodes (Blender E: The Measure tool is an interactive tool where you can drag lines in the scene to measure distances or angles, Pg. 1, par. 1; see Note 12B). Note 12B: Blender E showcases that dimensions can be added to a mesh in the Figure on Pg. 1. 
In Note 7A, it was shown that all instructions may be turned into nodes in a node graph. Santos teaches: “The list of possible instructions is constrained only by a user's imaginations and includes all algorithms in the computing device graphics literature.” [0031] Therefore, it would be obvious to one of ordinary skill in the art to generate a dimension node comprising text and properties to attach to connector nodes. Before the effective filing date of the claimed invention, it would have been obvious to a person having ordinary skill in the art to combine the teachings of Blender E with Santos in view of Yasuo and Onshape A. Including a dimension node comprising text and properties to attach to other nodes, as in Blender E, would benefit the Santos in view of Yasuo and Onshape A teachings by enabling a user to measure their sketch with real-world units.

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Akihisa (JP 2010009394 A) teaches a method where a three-dimensional model can be updated via a two-dimensional sketch. Martos (NPL: Tech Tip: Using Textures and Blends in Render Studio) teaches blending and colorizing textures for materials. Note that it may be helpful to view the live webpage as the videos on the webpage showcase details not described in the text. THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a). A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. 
In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action. Any inquiry concerning this communication or earlier communications from the examiner should be directed to VINCENT ALEXANDER PROVIDENCE whose telephone number is (571)270-5765. The examiner can normally be reached Monday-Thursday 8:30-5:00. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, King Poon can be reached at (571)270-0728. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). 
If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000. /VINCENT ALEXANDER PROVIDENCE/Examiner, Art Unit 2617 /KING Y POON/Supervisory Patent Examiner, Art Unit 2617

Prosecution Timeline

Feb 26, 2024
Application Filed
Oct 09, 2025
Non-Final Rejection — §103
Jan 14, 2026
Response Filed
Feb 04, 2026
Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12586303
GEOMETRY-AWARE THREE-DIMENSIONAL SYNTHESIS IN ALL ANGLES
2y 5m to grant Granted Mar 24, 2026
Patent 12530847
IMAGE GENERATION FROM TEXT AND 3D OBJECT
2y 5m to grant Granted Jan 20, 2026
Patent 12530808
Predictive Encoding/Decoding Method and Apparatus for Azimuth Information of Point Cloud
2y 5m to grant Granted Jan 20, 2026
Patent 12524946
METHOD FOR GENERATING FIREWORK VISUAL EFFECT, ELECTRONIC DEVICE, AND STORAGE MEDIUM
2y 5m to grant Granted Jan 13, 2026
Patent 12380621
COMPUTER-IMPLEMENTED SYSTEMS AND METHODS FOR GENERATING ENHANCED MOTION DATA AND RENDERING OBJECTS
2y 5m to grant Granted Aug 05, 2025
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

3-4
Expected OA Rounds
83%
Grant Probability
99%
With Interview (+25.0%)
2y 5m
Median Time to Grant
Moderate
PTA Risk
Based on 18 resolved cases by this examiner. Grant probability derived from career allow rate.
