Prosecution Insights
Last updated: April 19, 2026
Application No. 18/428,091

COMPUTER-READABLE RECORDING MEDIUM STORING POSTURE SPECIFYING PROGRAM, POSTURE SPECIFYING METHOD, AND INFORMATION PROCESSING APPARATUS

Non-Final OA: §102, §103, §112

Filed: Jan 31, 2024
Examiner: CHANG, DANIEL CHEOLJIN
Art Unit: 2669
Tech Center: 2600 (Communications)
Assignee: Fujitsu Limited
OA Round: 1 (Non-Final)
Grant Probability: 89% (Favorable)
Expected OA Rounds: 1-2
Time to Grant: 2y 6m
Grant Probability With Interview: 99%

Examiner Intelligence

Career Allow Rate: 89%, above average (117 granted / 132 resolved; +26.6% vs TC avg)
Interview Lift: +11.7% (moderate), measured across resolved cases with an interview
Typical Timeline: 2y 6m average prosecution; 25 applications currently pending
Career History: 157 total applications across all art units
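The headline figures above can be cross-checked from the raw counts. A minimal sketch; the rounding and the 99% cap on the with-interview figure are assumptions about how the dashboard derives its displayed values, not documented behavior:

```python
# Cross-check the dashboard's headline figures from the raw counts.
# Assumptions (not documented by the page): grant probability is the
# rounded career allow rate, and the with-interview figure adds the
# lift in percentage points, capped at 99%.
granted, resolved = 117, 132
interview_lift = 11.7  # percentage points

allow_rate = 100 * granted / resolved          # 117/132 = 88.6...
with_interview = min(allow_rate + interview_lift, 99.0)

print(round(allow_rate), round(with_interview))  # 89 99
```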

Statute-Specific Performance

§101: 8.1% (-31.9% vs TC avg)
§102: 14.1% (-25.9% vs TC avg)
§103: 53.4% (+13.4% vs TC avg)
§112: 20.7% (-19.3% vs TC avg)
Tech Center averages are estimates. Based on career data from 132 resolved cases.
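One consistency check worth making explicit: each statute's rate minus its "vs TC avg" delta should point at the same Tech Center reference value. A small sketch verifying this from the numbers above (the interpretation of the delta as rate minus average is an assumption):

```python
# Each row reports the examiner's per-statute rate and its delta vs the
# Tech Center average; the implied average (rate - delta) should be the
# same reference value for every statute if the deltas are consistent.
rows = {
    "§101": (8.1, -31.9),
    "§102": (14.1, -25.9),
    "§103": (53.4, +13.4),
    "§112": (20.7, -19.3),
}
implied_avg = {s: round(rate - delta, 1) for s, (rate, delta) in rows.items()}
print(implied_avg)  # every statute implies a TC average of 40.0
```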

Office Action

Rejections under §102, §103, and §112
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Notice to Applicants

This communication is in response to the Application filed on 01/31/2024. Claims 1-18 are pending.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claims 5, 6, 11, 12, 17 and 18 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention.

Claim 5 recites the limitation "the skeletal information" (line 3). There is insufficient antecedent basis for this limitation in the claim. It is unclear if "the skeletal information" is referring back to "the skeleton information" in claim 1 or something else. Clarification/explanation is required.

With respect to claims 11 and 17, arguments analogous to those presented for claim 5 are applicable.

Claim Rejections - 35 USC § 102

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C.
102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

Claims 1, 5, 7, 11, 13 and 17 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Nakayama et al. (U.S. Publication No. 2022/0083769) (hereafter, "Nakayama").

Regarding claim 1, Nakayama teaches a non-transitory computer-readable recording medium ([0002] Embodiments described herein relate generally to a work estimation apparatus, a method and a non-transitory computer-readable storage medium) storing a posture specification program causing a computer to execute ([0059] The posture estimation unit 121 may specify a worker, based on the video data. Specifically, the posture estimation unit 121 identifies a detected worker by using worker identification data. The worker identification data includes, for example, a learned model (worker identification model) trained to identify a worker from video data. The worker identification model is trained in advance such that the worker can be identified from a face photograph of a worker and a photograph of the clothes of the worker):

generating skeleton information indicating two-dimensional coordinates of a plurality of joints of a person based on visual information which is obtained by capturing the person ([0057] The learned models used to estimate the posture of a person include, for example, a two-dimensional skeleton estimation model that estimates a skeleton of a person of video data on a two-dimensional image, and a three-dimensional skeleton estimation model that estimates a three-dimensional skeleton by applying a two-dimensional skeleton estimation result; [0058] The two-dimensional skeleton estimation model is trained in advance such that a person can be detected from video data and a skeleton can be detected from the detected person; [0078] After the video data is acquired, the posture estimation unit 121 estimates a posture of the worker, based on the video data. Specifically, the posture estimation unit 121 detects a skeleton of a person from video data, using a two-dimensional skeleton estimation model);

setting angles of the plurality of joints and a direction of a part of the person based on the skeleton information ([0090] The posture estimation unit 121 calculates an angle by which the waist is bent, based on angle θ1 formed by vector v1 and vector v2, the vector v1 representing the direction from the midpoint of the hips (key point KP13 of the "right hip" and key point KP14 of the "left hip") of the three-dimensional human skeleton model to the midpoint between the feet (key point KP17 of the "right foot" and key point KP18 of the "left foot"), and the vector v2 representing the direction from the midpoint of the hips to the neck (key point KP6 of the "neck"). The posture estimation unit 121 further calculates an angle by which the waist is twisted, based on angle θ2 formed by vector v3 and vector v4, the vector v3 representing the direction from the right hip to the left hip of the three-dimensional human skeleton model, and the vector v4 representing the direction from the right shoulder (key point KP11 of the "right shoulder") to the left shoulder (key point KP12 of the "left shoulder")); and

specifying a posture of the person based on the angles of the plurality of joints and the direction of the part of the person ([0090] as quoted above; the posture estimation unit 121 classifies (estimates) the states of the "back", based on whether or not each of the angles θ1 and θ2 exceeds 20 degrees; [0095] posture estimation results using the behavior estimation model are expressed as combinations of the states of a plurality of body parts. Specifically, a posture estimation result corresponds to a combination of state classification symbols of a plurality of body parts shown in the Table 19 of FIG. 7).

Regarding claim 5, Nakayama teaches all the limitations of claim 1 above. Nakayama teaches further comprising: specifying an angle of a joint based on the skeletal information corresponding to the posture specified by the specifying the posture ([0089] The states of the "back" of the three-dimensional human skeleton model can be classified, for example, by an angle by which the waist is bent and an angle by which the waist is twisted. Specifically, the states of the "back" can be distinguished by detecting whether or not the waist is bent by 20 degrees or more and whether or not the waist is twisted by 20 degrees or more; [0090]).

With respect to claims 7 and 13, arguments analogous to those presented for claim 1 are applicable. With respect to claims 11 and 17, arguments analogous to those presented for claim 5 are applicable.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C.
103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 2, 3, 8, 9, 14 and 15 are rejected under 35 U.S.C. 103 as being unpatentable over Nakayama et al. (U.S. Publication No. 2022/0083769) (hereafter, "Nakayama") in view of UTSUNOMIYA et al. (U.S. Publication No. 2015/0003687) (hereafter, "UTSUNOMIYA") and further in view of NAITO et al. (U.S. Publication No. 2020/0188736) (hereafter, "NAITO").

Regarding claim 2, Nakayama teaches all the limitations of claim 1 above. Nakayama does not expressly teach wherein the setting of the direction of the part includes setting, as the direction of the part, one of an upward direction and a downward direction based on the direction of the part when an angle of the direction of the part with respect to a horizontal direction is equal to or larger than a first angle.

However, UTSUNOMIYA teaches wherein the setting of the direction of the part includes setting, as the direction of the part, one of an upward direction and a downward direction ([0231] FIG. 24F illustrates index values for forward flexion and rearward flexion of the neck. FIG. 24G illustrates index values for forward flexion and rearward flexion of the thoracolumbar area) based on the direction of the part when an angle of the direction of the part with respect to a horizontal direction ([0144] as illustrated in FIG. 9B, for example, the analyzing unit 142 calculates an angle "θ2" of the axis from the joint "2e" corresponding to the right shoulder to the joint "2i" corresponding to the left shoulder, with respect to the horizontal direction on the x-z plane. In that situation, the analyzing unit 142 calculates a straight line that passes through the coordinate information (x5,z5) of the joint "2e" corresponding to the right shoulder and the coordinate information (x9,z9) of the joint "2i" corresponding to the left shoulder in a predetermined frame and further calculates the angle formed by the calculated straight line and a straight line parallel to the x-axis as the angle "θ2". In other words, the analyzing unit 142 calculates a degree of deviation of the body in a rotation direction centered on the body axis; [0145] the analyzing unit 142 calculates an angle "θ3" of the axis (the body axis) from the joint "2a" corresponding to the head to the joint "2c" corresponding to the lumbar, with respect to the vertical direction on the x-y plane. In that situation, the analyzing unit 142 calculates a straight line that passes through the coordinate information (x1,z1) of the joint "2a" corresponding to the head and the coordinate information (x3,z3) of the joint "2c" corresponding to the lumbar in a predetermined frame and further calculates the angle formed by the calculated straight line and a straight line parallel to the y-axis as the angle "θ3"; [0248] & [0250]).

It would have been obvious before the effective filing date of the claimed invention to one having ordinary skill in the art to modify the device and method of Nakayama to incorporate the step/system of determining an upward or a downward direction as a direction of a body part and determining a degree of deviation of the body based on an angle of a body part with respect to a horizontal direction taught by UTSUNOMIYA. The suggestion/motivation for doing so would have been to improve the accuracy of evaluation by analyzing angles of a body part ([0173] the motion information processing apparatus 100 is able to set the designating parts in medically significant positions and thus makes it possible to perform the evaluation more accurately). Further, one skilled in the art could have combined the elements as described above by known methods with no change in their respective functions, and the combination would have yielded nothing more than predictable results.

UTSUNOMIYA does not expressly teach ... is equal to or larger than a first angle. However, NAITO teaches ... is equal to or larger than a first angle ([0089] The spine angle (Spine_Angle of an incline) is an angle formed by the Z axis and a line segment passing through joint No. 0 and joint No. 2. For example, a spine angle of "θA1 or smaller" indicates that the center line of a competitor's body is along the vertically upward direction; [0136] when a posture with a waist angle of 135° or larger and a knee angle of 90° or larger is defined as a posture of a stretched body, the evaluation unit 164 judges, based on the graph 30, that a stretch or bend posture of the player 10 is a "bending posture").

It would have been obvious before the effective filing date of the claimed invention to one having ordinary skill in the art to modify the combined device and method of Nakayama and UTSUNOMIYA to incorporate the step/system of determining a direction by using a predetermined angle taught by NAITO. The suggestion/motivation for doing so would have been to improve the accuracy of measuring the angle of body parts ([0047] In a scoring competition, a referee judges the success or failure of a skill and the degree of perfection of the skill, based on a body orientation angle of a player, but it is difficult to accurately measure the angle with human's eyes). Further, one skilled in the art could have combined the elements as described above by known methods with no change in their respective functions, and the combination would have yielded nothing more than predictable results. Therefore, it would have been obvious to combine Nakayama and UTSUNOMIYA with NAITO to obtain the invention as specified in claim 2.

Regarding claim 3, Nakayama teaches all the limitations of claim 1 above. Nakayama does not expressly teach wherein the setting of the direction of the part includes setting, as the direction of the part, one of a left direction and a right direction based on the direction of the part when an angle of the direction of the part with respect to a vertical direction is equal to or larger than a second angle.

However, UTSUNOMIYA teaches wherein the setting of the direction of the part includes setting, as the direction of the part, one of a left direction and a right direction ([0256] FIG. 25A illustrates an index value for the abduction and the adduction of the shoulder. FIG. 25E illustrates an index value for the external rotation and the internal rotation of the shoulder) based on the direction of the part when an angle of the direction of the part with respect to a vertical direction ([0145] as illustrated in FIG. 9C, the analyzing unit 142 calculates an angle "θ3" of the axis (the body axis) from the joint "2a" corresponding to the head to the joint "2c" corresponding to the lumbar, with respect to the vertical direction on the x-y plane. In that situation, the analyzing unit 142 calculates a straight line that passes through the coordinate information (x1,z1) of the joint "2a" corresponding to the head and the coordinate information (x3,z3) of the joint "2c" corresponding to the lumbar in a predetermined frame and further calculates the angle formed by the calculated straight line and a straight line parallel to the y-axis as the angle "θ3". In other words, the analyzing unit 142 calculates the degree of the left-right angle (the left-right angle for the subject); [0144] the analyzing unit 142 calculates an angle "θ2" of the axis from the joint "2e" corresponding to the right shoulder to the joint "2i" corresponding to the left shoulder, with respect to the horizontal direction on the x-z plane. In that situation, the analyzing unit 142 calculates a straight line that passes through the coordinate information (x5,z5) of the joint "2e" corresponding to the right shoulder and the coordinate information (x9,z9) of the joint "2i" corresponding to the left shoulder in a predetermined frame and further calculates the angle formed by the calculated straight line and a straight line parallel to the x-axis as the angle "θ2"; [0261] & [0262]).

It would have been obvious before the effective filing date of the claimed invention to one having ordinary skill in the art to modify the device and method of Nakayama to incorporate the step/system of determining a left and right direction as a direction of a body part and determining a degree of deviation of the body based on an angle of a body part with respect to a vertical direction taught by UTSUNOMIYA. The suggestion/motivation for doing so would have been to improve the accuracy of evaluation by analyzing angles of a body part ([0173] the motion information processing apparatus 100 is able to set the designating parts in medically significant positions and thus makes it possible to perform the evaluation more accurately). Further, one skilled in the art could have combined the elements as described above by known methods with no change in their respective functions, and the combination would have yielded nothing more than predictable results.

UTSUNOMIYA does not expressly teach ... is equal to or larger than a second angle. However, NAITO teaches ... is equal to or larger than a second angle ([0089] The spine angle (Spine_Angle of an incline) is an angle formed by the Z axis and a line segment passing through joint No. 0 and joint No. 2. For example, a spine angle of "θA1 or smaller" indicates that the center line of a competitor's body is along the vertically upward direction; [0140] When the arm angle of the cross is smaller than 45°, the evaluation unit 164 judges that the cross has been successful; [0158] The evaluation unit 164 judges a shift in a twist angle of the body during a salto, based on a shift in the right-left direction of the waist; [0123]).

It would have been obvious before the effective filing date of the claimed invention to one having ordinary skill in the art to modify the combined device and method of Nakayama and UTSUNOMIYA to incorporate the step/system of determining a direction by using a predetermined angle taught by NAITO. The suggestion/motivation for doing so would have been to improve the accuracy of measuring the angle of body parts ([0047] In a scoring competition, a referee judges the success or failure of a skill and the degree of perfection of the skill, based on a body orientation angle of a player, but it is difficult to accurately measure the angle with human's eyes). Further, one skilled in the art could have combined the elements as described above by known methods with no change in their respective functions, and the combination would have yielded nothing more than predictable results. Therefore, it would have been obvious to combine Nakayama and UTSUNOMIYA with NAITO to obtain the invention as specified in claim 3.

With respect to claims 8 and 14, arguments analogous to those presented for claim 2 are applicable. With respect to claim 9, arguments analogous to those presented for claim 3 are applicable.
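The angle-and-threshold logic running through the cited art (Nakayama's θ1/θ2 with a 20° cutoff, UTSUNOMIYA's axis angles, NAITO's predetermined angles) reduces to a vector-angle computation. A sketch for illustration only: the keypoint coordinates are invented, and reading "bent by 20° or more" as a 20° deviation from the upright 180° configuration is an interpretive assumption, not taken from any cited reference.

```python
import math

def angle_between(v1, v2):
    """Angle in degrees between two vectors, via the dot product."""
    dot = sum(a * b for a, b in zip(v1, v2))
    cos = dot / (math.hypot(*v1) * math.hypot(*v2))
    return math.degrees(math.acos(max(-1.0, min(1.0, cos))))

# Waist bend in the style of Nakayama [0090]: theta1 is the angle between
# the hips-to-feet vector (v1) and the hips-to-neck vector (v2).
mid_hip  = (0.0, 1.0, 0.0)
mid_feet = (0.0, 0.0, 0.0)
neck     = (0.5, 1.8, 0.0)   # hypothetical: torso leaning noticeably forward

v1 = tuple(b - a for a, b in zip(mid_hip, mid_feet))  # hips -> feet
v2 = tuple(b - a for a, b in zip(mid_hip, neck))      # hips -> neck

theta1 = angle_between(v1, v2)
# Upright posture puts theta1 near 180°; we treat a deviation of 20° or
# more from that as "bent" (an assumption about the threshold's meaning).
waist_bent = (180.0 - theta1) >= 20.0
print(round(theta1, 1), waist_bent)
```

The same helper applies unchanged to UTSUNOMIYA's shoulder-axis angle against the horizontal or NAITO's spine angle against the vertical; only the reference vector changes.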
With respect to claim 15, arguments analogous to those presented for claim 3 are applicable.

Claims 4, 10 and 16 are rejected under 35 U.S.C. 103 as being unpatentable over Nakayama et al. (U.S. Publication No. 2022/0083769) (hereafter, "Nakayama") in view of KANNAN et al. (U.S. Publication No. 2022/0114751) (hereafter, "KANNAN").

Regarding claim 4, Nakayama teaches all the limitations of claim 1 above. Nakayama teaches wherein the specifying the posture includes: repeatedly executing the specifying the posture every time the angles of the plurality of joints and the direction of the part of the person are set ([0118] Where the video data is acquired in real time, the process flow may return to step ST110 after the processing of step ST130, and the subsequent processes may be repeated; [0076] The flowchart of FIG. 4 illustrates details of the processing of step ST120 shown in FIG. 3; [0078] After the video data is acquired, the posture estimation unit 121 estimates a posture of the worker, based on the video data; [0090] The posture estimation unit 121 calculates an angle by which the waist is bent, based on angle θ1 formed by vector v1 and vector v2, the vector v1 representing the direction from the midpoint of the hips; [0083] After the three-dimensional human skeleton model is generated, the posture estimation unit 121 estimates a behavior and a posture of the person from the time series data on the three-dimensional human skeleton model).

Nakayama does not expressly teach correcting a type of the posture based on a pattern of consecutive postures specified by the specifying the posture. However, KANNAN teaches correcting a type of the posture based on a pattern of consecutive postures specified by the specifying the posture ([0056] the AR content controller (160) is configured to estimate the posture of each object of the plurality of the objects based on features derived from the detected multiple body key-points to classify the posture of each object of the plurality of the objects in the scene; [0057] the AR content controller (160) is configured to classify the action of each object of the plurality of the objects based on the classified postures and multiple body key-points identified over a current frame and multiple past frames).

It would have been obvious before the effective filing date of the claimed invention to one having ordinary skill in the art to modify the device and method of Nakayama to incorporate the step/system of classifying the posture of each object based on analyzing the sequence of postures over time (the pattern) taught by KANNAN. The suggestion/motivation for doing so would have been to improve prediction of the posture of each object in the scene and the accuracy of AR effects in the real-world environment ([0016] time interleaving … pose features obtained from multiple objects in a single camera frame, and providing each object pose feature as input to a simultaneous real-time classification model, one after the other, to predict the posture of all objects in the single camera frame before the next camera frame arrives ... applying … the simultaneous real-time classification model to predict the posture of each object in each frame of the scene; [0006] the existing methods/electronic device(s) do not consider multi-object interactions and intents to insert the AR text and the AR effects accurately in the real-world environment. Thus, it is desired to provide a useful alternative for inserting/generating the AR content in the electronic device).
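Claim 4's "correcting a type of the posture based on a pattern of consecutive postures" can be illustrated with a sliding majority vote over per-frame posture labels. This is a hypothetical sketch of the idea, not KANNAN's actual model; the window size and voting rule are invented choices.

```python
from collections import Counter

def correct_postures(labels, window=3):
    """Smooth per-frame posture labels with a sliding majority vote.

    Illustrative only: a single-frame misclassification inside a run of
    consistent labels gets overwritten by its neighbors.
    """
    half = window // 2
    corrected = []
    for i in range(len(labels)):
        neighborhood = labels[max(0, i - half): i + half + 1]
        corrected.append(Counter(neighborhood).most_common(1)[0][0])
    return corrected

# A one-frame "standing" blip inside a run of "bending" gets corrected.
frames = ["bending", "bending", "standing", "bending", "bending"]
print(correct_postures(frames))  # ['bending', 'bending', 'bending', 'bending', 'bending']
```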
Further, one skilled in the art could have combined the elements as described above by known methods with no change in their respective functions, and the combination would have yielded nothing more than predictable results. Therefore, it would have been obvious to combine Nakayama with KANNAN to obtain the invention as specified in claim 4.

With respect to claims 10 and 16, arguments analogous to those presented for claim 4 are applicable.

Claims 6, 12 and 18 are rejected under 35 U.S.C. 103 as being unpatentable over Nakayama et al. (U.S. Publication No. 2022/0083769) (hereafter, "Nakayama") in view of NAITO et al. (U.S. Publication No. 2020/0188736) (hereafter, "NAITO").

Regarding claim 6, Nakayama teaches all the limitations of claim 5 above. Nakayama does not expressly teach further comprising: calculating a score for each of postures based on the angle of the joint of the respective postures and a reference angle of the joint of the respective postures; and calculating a total score of the postures.

However, NAITO teaches further comprising: calculating a score for each of postures based on the angle of the joint of the respective postures and a reference angle of the joint of the respective postures ([0113] The E evaluation index defines the number of points deducted in the E score in accordance with a value of an evaluation item. For example, when an evaluation item "the arm angle of the cross" corresponds to "1° to 15°", "0.1" points are deducted from the E score. When the evaluation item "the arm angle of the cross" corresponds to "16° to 30°", "0.3" points are deducted from the E score. When the evaluation item "the arm angle of the cross" corresponds to "31° to 45°", "0.5" points are deducted from the E score; [0144] When "the arm angle of the cross" is in a range of "1° to 15°", the evaluation unit 164 deducts "0.1" from the E score. When "the arm angle of the cross" is in a range of "16° to 30°", the evaluation unit 164 deducts "0.3" from the E score. When "the arm angle of the cross" is in a range of "31° to 45°", the evaluation unit 164 deducts "0.5" from the E score; [0174]); and calculating a total score of the postures ([0159] the evaluation unit 164 calls the evaluation items corresponding to the determined postures. For example, the evaluation unit 164 reads out an E evaluation index corresponding to each of the evaluation items, and determines a reference value for point deduction. The reference value is 0 (with no point deducted), 0.1 (small error), 0.3 (middle error), or 0.5 (large error). The evaluation unit 164 determines the total of points deducted determined by the evaluation items, and subtracts from 10 points the total of the points deducted, so as to confirm the E score; [0174]).

It would have been obvious before the effective filing date of the claimed invention to one having ordinary skill in the art to modify the device and method of Nakayama to incorporate the step/system of calculating a score for each of postures by using the angle of the joint of the respective postures and a predetermined angle of the joint of each posture, and determining a total score of the postures, taught by NAITO. The suggestion/motivation for doing so would have been to improve the accuracy of measuring the angle of body parts ([0047] In a scoring competition, a referee judges the success or failure of a skill and the degree of perfection of the skill, based on a body orientation angle of a player, but it is difficult to accurately measure the angle with human's eyes). Further, one skilled in the art could have combined the elements as described above by known methods with no change in their respective functions, and the combination would have yielded nothing more than predictable results.
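The deduction scheme quoted from NAITO ([0113], [0159]) maps angle ranges to point deductions that are summed and subtracted from 10. A minimal sketch of that arithmetic; the behavior outside the quoted ranges (no deduction) is an assumption:

```python
def cross_arm_deduction(angle_deg):
    """Point deduction for "the arm angle of the cross", per the ranges
    quoted from NAITO [0113]: 1-15° -> 0.1, 16-30° -> 0.3, 31-45° -> 0.5.
    Behavior outside those ranges (return 0.0) is an assumption."""
    if 1 <= angle_deg <= 15:
        return 0.1
    if 16 <= angle_deg <= 30:
        return 0.3
    if 31 <= angle_deg <= 45:
        return 0.5
    return 0.0

# Total E score per NAITO [0159]: 10 minus the sum of per-item deductions.
deductions = [cross_arm_deduction(a) for a in (12, 28, 40)]
e_score = 10 - sum(deductions)
print(round(e_score, 1))  # 10 - (0.1 + 0.3 + 0.5) = 9.1
```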
Therefore, it would have been obvious to combine Nakayama with NAITO to obtain the invention as specified in claim 6. With respect to claims 12 and 18, arguments analogous to those presented for claim 6 are applicable.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to DANIEL C. CHANG whose telephone number is (571) 270-1277. The examiner can normally be reached Monday-Thursday and alternate Fridays, 8:00-5:00.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Chan S. Park, can be reached at (571) 272-7409. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/DANIEL C CHANG/
Examiner, Art Unit 2669

/CHAN S PARK/
Supervisory Patent Examiner, Art Unit 2669

Prosecution Timeline

Jan 31, 2024
Application Filed
Jan 18, 2026
Non-Final Rejection — §102, §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12592097: REAL-TIME, FINE-RESOLUTION HUMAN INTRA-GAIT PATTERN RECOGNITION BASED ON DEEP LEARNING MODELS (granted Mar 31, 2026; 2y 5m to grant)
Patent 12579672: STEREO VISION-BASED HEIGHT CLEARANCE DETECTION (granted Mar 17, 2026; 2y 5m to grant)
Patent 12573047: Control Method, Device, Equipment and Storage Medium for Interactive Reproduction of Target Object (granted Mar 10, 2026; 2y 5m to grant)
Patent 12548296: Spatially Preserving Flattening in Deep Learning Neural Networks (granted Feb 10, 2026; 2y 5m to grant)
Patent 12541868: Image Registration Method and Apparatus, Electronic Apparatus, and Storage Medium (granted Feb 03, 2026; 2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 89%
With Interview: 99% (+11.7%)
Median Time to Grant: 2y 6m
PTA Risk: Low
Based on 132 resolved cases by this examiner. Grant probability derived from career allow rate.
