Prosecution Insights
Last updated: April 19, 2026
Application No. 18/743,726

METHOD FOR GENERATING A 3D MODEL

Status: Non-Final OA (§DP)
Filed: Jun 14, 2024
Examiner: BRIER, JEFFERY A
Art Unit: 2613
Tech Center: 2600 — Communications
Assignee: ResMed
OA Round: 1 (Non-Final)

Grant Probability: 77% (Favorable)
Expected OA Rounds: 1-2
Time to Grant: 2y 11m
With Interview: 85%

Examiner Intelligence

Career Allow Rate: 77% (650 granted / 849 resolved; +14.6% vs TC avg; above average)
Interview Lift: +8.7% (moderate lift, measured on resolved cases with interview)
Typical Timeline: 2y 11m avg prosecution; 16 applications currently pending
Career History: 865 total applications across all art units
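The headline figures above are simple ratios. A minimal sketch (with the counts taken from this page) shows how the career allow rate and the interview-adjusted figure are plausibly derived; the analytics provider's exact methodology is an assumption here, not documented:

```python
# Career allow rate: granted / resolved, using the examiner counts above.
granted = 650
resolved = 849

allow_rate = granted / resolved * 100  # about 76.6%, displayed as 77%
print(f"Career allow rate: {allow_rate:.1f}%")

# The page reports an interview lift of +8.7 percentage points; adding it
# to the base rate reproduces the "85% with interview" figure (a plausible
# reconstruction, not the vendor's documented formula).
interview_lift = 8.7
with_interview = allow_rate + interview_lift
print(f"With interview: {with_interview:.0f}%")
```

Rounding explains the small mismatch between the "+8.7%" lift and the "moderate +9% lift" label elsewhere on the page.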

Statute-Specific Performance

Statute   Allow Rate   vs TC Avg
§101      18.1%        -21.9%
§103      23.0%        -17.0%
§102      19.4%        -20.6%
§112      29.4%        -10.6%

Tech Center averages are estimates. Based on career data from 849 resolved cases.
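Since the table gives both the examiner's per-statute allow rate and its delta against the Tech Center average, the implied TC averages can be recovered by subtracting the delta (a back-of-the-envelope check, assuming the deltas are in percentage points):

```python
# Per-statute allow rates and deltas vs. TC average, as reported above
# (percentage points).
stats = {
    "101": (18.1, -21.9),
    "103": (23.0, -17.0),
    "102": (19.4, -20.6),
    "112": (29.4, -10.6),
}

for statute, (examiner_rate, delta) in stats.items():
    tc_avg = examiner_rate - delta  # delta = examiner rate - TC average
    print(f"Section {statute}: implied TC average about {tc_avg:.1f}%")
```

All four statutes imply a TC average near 40%, suggesting the deltas were computed against a single blended estimate rather than per-statute baselines.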

Office Action

§DP
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Information Disclosure Statement

The information disclosure statement filed 06/14/2024 fails to comply with 37 CFR 1.98(a)(2), which requires a legible copy of each cited foreign patent document; each non-patent literature publication or that portion which caused it to be listed; and all other information or that portion which caused it to be listed. Non-Patent Literature document 3 was not provided in this application and it is not present in parent non-provisional US Patent application 17/796,475. Thus, it has been lined through on the IDS form and not considered.

Response to Preliminary Amendment

The Preliminary Amendment filed on 10/02/2024 has been entered.

Response to Preliminary Remarks

Applicant's Preliminary Remarks filed 10/02/2024 concerning the Preliminary Amendment have been considered and the amendment has been entered.

CLAIM INTERPRETATION

The following is a quotation of 35 U.S.C. 112(f):

(f) Element in Claim for a Combination. – An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.

The following is a quotation of pre-AIA 35 U.S.C. 112, sixth paragraph:

An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.

Claims 1-19 have been interpreted under 35 U.S.C. 112(f) (pre-AIA 35 U.S.C. 112, sixth paragraph) to not invoke 35 U.S.C. 112(f) (pre-AIA 35 U.S.C. 112, sixth paragraph) claim interpretation. The examiner notes various claim elements are claimed to be "configured to" and those elements are considered to have structure.

Claim Objections

Claims 1-19 are objected to because of the following informalities: Claim 1 claims "generate the three-dimensional model of the at least part of the head based on them position of the longitudinal axis; and" in which "them" should be changed to the. Claim 1 claims "provide data for an item of facewear comprising the therapy mask based on the generated three-dimensional modal." in which "modal" should be changed to model. Appropriate correction is required.

Double Patenting

The nonstatutory double patenting rejection is based on a judicially created doctrine grounded in public policy (a policy reflected in the statute) so as to prevent the unjustified or improper timewise extension of the "right to exclude" granted by a patent and to prevent possible harassment by multiple assignees. A nonstatutory double patenting rejection is appropriate where the conflicting claims are not identical, but at least one examined application claim is not patentably distinct from the reference claim(s) because the examined application claim is either anticipated by, or would have been obvious over, the reference claim(s). See, e.g., In re Berg, 140 F.3d 1428, 46 USPQ2d 1226 (Fed. Cir. 1998); In re Goodman, 11 F.3d 1046, 29 USPQ2d 2010 (Fed. Cir. 1993); In re Longi, 759 F.2d 887, 225 USPQ 645 (Fed. Cir. 1985); In re Van Ornum, 686 F.2d 937, 214 USPQ 761 (CCPA 1982); In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970); In re Thorington, 418 F.2d 528, 163 USPQ 644 (CCPA 1969).

A timely filed terminal disclaimer in compliance with 37 CFR 1.321(c) or 1.321(d) may be used to overcome an actual or provisional rejection based on nonstatutory double patenting provided the reference application or patent either is shown to be commonly owned with the examined application, or claims an invention made as a result of activities undertaken within the scope of a joint research agreement. See MPEP § 717.02 for applications subject to examination under the first inventor to file provisions of the AIA as explained in MPEP § 2159. See MPEP § 2146 et seq. for applications not subject to examination under the first inventor to file provisions of the AIA. A terminal disclaimer must be signed in compliance with 37 CFR 1.321(b).

The filing of a terminal disclaimer by itself is not a complete reply to a nonstatutory double patenting (NSDP) rejection. A complete reply requires that the terminal disclaimer be accompanied by a reply requesting reconsideration of the prior Office action. Even where the NSDP rejection is provisional the reply must be complete. See MPEP § 804, subsection I.B.1. For a reply to a non-final Office action, see 37 CFR 1.111(a). For a reply to final Office action, see 37 CFR 1.113(c). A request for reconsideration while not provided for in 37 CFR 1.113(c) may be filed after final for consideration. See MPEP §§ 706.07(e) and 714.13.

The USPTO Internet website contains terminal disclaimer forms which may be used. Please visit www.uspto.gov/patent/patents-forms. The actual filing date of the application in which the form is filed determines what form (e.g., PTO/SB/25, PTO/SB/26, PTO/AIA/25, or PTO/AIA/26) should be used. A web-based eTerminal Disclaimer may be filled out completely online using web-screens. An eTerminal Disclaimer that meets all requirements is auto-processed and approved immediately upon submission. For more information about eTerminal Disclaimers, refer to www.uspto.gov/patents/apply/applying-online/eterminal-disclaimer.

Claims 1-19 are rejected on the ground of nonstatutory double patenting as being unpatentable over claims 2-17, 19, and 20 of U.S. Patent No. 12,033,278. Although the claims at issue are not identical, they are not patentably distinct from each other because pending claims 1-19 are system claim versions of the patented method claims. It would have been obvious to one of ordinary skill in the art to draft system claim versions of the patented method claims in order to extend the patent coverage. These system claims add to the patented claim computer comprising server, processor, memory, and network which were added to more completely form a system claim from the patented method claim with conventional server computer components and the claimed "provide data for an item of facewear comprising the therapy mask based on the generated three-dimensional modal" is an obvious modification of the patented "producing, based on the three-dimensional model of the head, a mask for providing a gas supply to a user" claimed in claim 19.

Refer to the following table which correlates pending claims 1-19 with patented claims 2-17, 19, and 20 of U.S. Patent No. 12,033,278.

Claims filed on 10/02/2025:

1.
(new) A computer system for obtaining a therapy mask using a generated three-dimensional model of at least part of a head including at least a part of a face, the computer system comprising: a server configured to communicate over a network, the server comprising a processor and memory, wherein the server is configured to: receive a distance image of the head obtained using a distance-measuring device having an imaging position, the distance image comprising for each of a two-dimensional array of distance pixels, a respective distance value indicating a distance from the imaging position to a corresponding point in a field of view of the distance-measuring device, the distance image including a face portion corresponding to the at least part of the face; estimate the position, relative to the imaging position, of a longitudinal axis of the model of the head based on at least one dimension of the at least part of the face; generate the three-dimensional model of the at least part of the head based on them position of the longitudinal axis; and provide data for an item of facewear comprising the therapy mask based on the generated three-dimensional modal.

2. (new) The computer system of claim 1, wherein the computer system is configured to: based on the longitudinal axis, define a plurality of spatial regions; allocate each distance value to a corresponding one of the spatial regions; and for each spatial region to which one or more distance values have been allocated, generate a corresponding portion of the three-dimensional model of the head based on the distance values allocated to that spatial region.

3. (new) The computer system of claim 2, wherein at a plurality of positions along the longitudinal axis, a respective first plurality of the spatial regions extend from that position along the longitudinal axis orthogonally to the longitudinal axis and subtend respective angles about the axis.

4. (new) The computer system of claim 2, wherein for a crown point on the longitudinal axis, a second plurality of the spatial regions extend in a direction away from the crown point at respective angles inclined to the longitudinal axis and subtending corresponding solid angles about the crown point.

5. (new) The computer system of claim 1, wherein the distance-measuring device is also an electromagnetic radiation image capturing device, wherein the electromagnetic radiation image capturing device is configured to capture a two-dimensional electromagnetic radiation image of the at least part of the face, the electromagnetic radiation image including a face portion in the electromagnetic radiation image corresponding to the at least part of the face.

6. (new) The computer system of claim 5, wherein the electromagnetic radiation image is a visual image.

7. (new) The computer system of claim 5, wherein the server is configured to identify the at least one dimension of the at least part of the face based on the face portion of the electromagnetic radiation image.

8. (new) The computer system of claim 5, wherein the server is configured to identify the at least one dimension of the at least part of the face based on the face portion of the distance image.

9. (new) The computer system of claim 8, wherein the server is configured to identify the at least one dimension of the at least part of the face based on the face portion of the electromagnetic radiation image; and in which the electromagnetic radiation image is used to identify points on the at least part of the face, corresponding points are found in a three dimensional space of the distance image, and the dimension is defined as the distance in three dimensional space between the identified points in the three-dimensional space of the distance image.

10. (new) The computer system of claim 1, wherein the server is configured to identify at least one dimension of the at least part of the face is based on data input by a user or data retrieved from a database.

11. (new) The computer system of claim 1, wherein to estimate the position of a longitudinal axis, the server is configured to multiply the dimension of the at least part of the face by at least one predefined physiognomy ratio.

12. (new) The computer system of claim 1, wherein the server is further configured to: receive an obtained additional distance image comprising additional distance values, wherein a positional relationship between the distance-measuring device and the at least part of a head when obtaining the additional distance image is different from a positional relationship between the distance-measuring device and the at least part of a head when obtaining an original distance image; and repeat reception of an obtained additional distance image, wherein the positional relationship between the distance-measuring device and the at least part of a head is changed for each repetition.

13. (new) The computer system of claim 12, wherein the server is further configured to: perform alignment of the original distance image and the additional distance image; and calculate an angle of rotation of the at least part of the face between the original distance image and the additional distance image with respect to the longitudinal axis.

14. (new) The computer system of claim 2, wherein the server is further configured to: determine which of the spatial regions contain at least one distance value obtained from at least one distance image; determine the face portion by finding the largest connected group of spatial regions that contain at least one distance value, wherein a spatial region is considered to be connected to another spatial region if it is adjacent to that spatial region; and delete any distance values which are not part of the face portion.

15. (new) The computer system of claim 2, wherein the server is further configured to: identify a spatial region for which no distance values have been allocated; identify whether any adjacent spatial regions have been allocated distance values; convert any distance values allocated to adjacent spatial regions into axis distance values, wherein axis distance values define a distance in three-dimensional space from the longitudinal axis; and calculate an estimated axis distance value for the spatial region for which no distance values have been allocated based on the axis distance values of the adjacent regions.

16. (new) The computer system of claim 1, wherein the distance-measuring device is a mobile telephone.

17. (new) The computer system of claim 5, wherein the server is further configured to define a colour of one or more portions of the three-dimensional model of at least part of the head based on the electromagnetic radiation image.

18. (new) The computer system of claim 1, wherein the therapy mask is for providing a gas supply for treating sleep apnoea.

19. (new) The computer system of claim 18, wherein the provided data for an item of facewear comprises design data, and wherein the computer system is configured to provide the design data for fabricating the item of facewear.

Claims of US 12,033,278 B2:

19. A method according to claim 1, further comprising producing, based on the three-dimensional model of the head, a mask for providing a gas supply to a user.

1. A method for generating a three-dimensional model of at least part of a head including at least a part of a face, comprising: obtaining a distance image of the head using a distance-measuring device having an imaging position, the distance image comprising for each of a two-dimensional array of distance pixels, a respective distance value indicating a distance from the imaging position to a corresponding point in a field of view of the distance-measuring device, the distance image including a face portion corresponding to the at least part of the face; estimating the position, relative to the imaging position, of a longitudinal axis of the model of the head based on at least one dimension of the at least part of the face; and generating the three dimensional model of the at least part of the head based on them position of the longitudinal axis.

19. A method according to claim 1, further comprising producing, based on the three-dimensional model of the head, a mask for providing a gas supply to a user.

2. A method according to claim 1, further comprising: based on the longitudinal axis, defining a plurality of spatial regions; allocating each distance value to a corresponding one of the spatial regions; and for each spatial region to which one or more distance values have been allocated, generating a corresponding portion of the three-dimensional model of the head based on the distance values allocated to that spatial region.

3. A method according claim 2, wherein at a plurality of positions along the longitudinal axis a respective first plurality of the spatial regions extend from that position along the longitudinal axis orthogonally to the longitudinal axis and subtend respective angles about the axis.

4. A method according to claim 2, wherein for a crown point on the longitudinal axis, a second plurality of the spatial regions extend in a direction away from the crown point at respective angles inclined to the longitudinal axis and subtending corresponding solid angles about the crown point.

7. A method according to claim 1, wherein the distance measuring device is also an electromagnetic radiation image capturing device, the method further comprising capturing a two-dimensional electromagnetic radiation image of the at least part of the face using the electromagnetic radiation image capturing device, the electromagnetic radiation image including a face portion of the electromagnetic radiation image corresponding to the at least part of the face.

8. A method according to claim 7, wherein the electromagnetic radiation image is a visual image.

9. A method according to claim 7, further comprising a step of identifying the at least one dimension of the at least part of the face based on the face portion of the electromagnetic radiation image.

11. A method according to claim 1, further comprising a step of identifying the at least one dimension of the at least part of the face based on the face portion of the distance image.

12. A method according to claim 11, further comprising a step of identifying the at least one dimension of the at least part of the face based on the face portion of the electromagnetic radiation image; and in which the electromagnetic radiation image is used to identify points on the at least part of the face, corresponding points are found in a three dimensional space of the distance image, and the dimension is defined as the distance in three dimensional space between the identified points in the three-dimensional space of the distance image.

13. A method according to claim 1, wherein the step of identifying at least one dimension of the at least part of the face is based on data input by a user or data retrieved from a database.

14. A method according to claim 1, wherein the step of estimating the position of a longitudinal axis comprises multiplying the dimension of the at least part of the face by at least one predefined physiognomy ratio.

15. A method according to claim 1, further comprising: obtaining an additional distance image comprising additional distance values, wherein the positional relationship between the distance-measuring device and the at least part of a head when obtaining the additional distance image is different from the positional relationship between the distance-measuring device and the at least part of a head when obtaining an original distance image; and optionally repeating the above steps of obtaining an additional distance image, wherein the positional relationship between the distance-measuring device and the at least part of a head is changed for each repetition of the steps.

16. A method according to claim 15, further comprising: performing alignment of the original distance image and the additional distance image; and calculating an angle of rotation of the at least part of the face between the original distance image and the additional distance image with respect to the longitudinal axis.

5. A method according to claim 2, further comprising: determining which of the spatial regions contain at least one distance value obtained from at least one distance image; determining the face portion by finding the largest connected group of spatial regions that contain at least one distance value, wherein a spatial region is considered to be connected to another spatial region if it is adjacent to that spatial region; and deleting any distance values which are not part of the face portion.

6. A method according to claim 2, further comprising: identifying a spatial region for which no distance values have been allocated; identifying whether any adjacent spatial regions have been allocated distance values; converting any distance values allocated to adjacent spatial regions into axis distance values, wherein axis distance values define a distance in three-dimensional space from the longitudinal axis; and calculating an estimated axis distance value for the spatial region for which no distance values have been allocated based on the axis distance values of the adjacent regions.

17. A method according to claim 1, wherein the distance-measuring device is a mobile telephone.

10. A method according to claim 7, further comprising defining a colour of one or more portions of the three dimensional model of at least part of the head based on the electromagnetic radiation image.

20. A method according to claim 19, wherein the mask for providing a gas supply is for treating sleep apnoea.

19. A method according to claim 1, further comprising producing, based on the three-dimensional model of the head, a mask for providing a gas supply to a user.

Prior Art

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Yang et al., US 2014/0160123, describes generating three-dimensional model of a head/face of a user based upon depth frames and color frames but silent with regard to axis.

Allowable Subject Matter

Claims 1-19 would be allowable if rewritten or amended to overcome the claim objections set forth in this Office action and if proper terminal disclaimer is filed.
The following is a statement of reasons for the indication of allowable subject matter: Similar to the reasons for allowance in the parent application the prior art of record fails to teach or suggest in the context of independent claim 1: "estimate the position, relative to the imaging position, of a longitudinal axis of the model of the head based on at least one dimension of the at least part of the face; generate the three-dimensional model of the at least part of the head based on them position of the longitudinal axis;".

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to JEFFERY A BRIER whose telephone number is (571)272-7656. The examiner can normally be reached on Mon-Fri from 8:30am-3:00pm.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Xiao M Wu, can be reached at telephone number 571-272-7761. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of an application may be obtained from Patent Center. Status information for published applications may be obtained from Patent Center. Status information for unpublished applications is available through Patent Center for authorized users only. Should you have questions about access to Patent Center, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free).

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) Form at https://www.uspto.gov/patents/uspto-automated-interview-request-air-form.

/JEFFERY A BRIER/
Primary Examiner, Art Unit 2613

Prosecution Timeline

Jun 14, 2024: Application Filed
Feb 06, 2026: Non-Final Rejection — §DP (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602883: SYSTEMS AND METHODS FOR PROSPECTIVE ACTION DISPLAY AND EXECUTION THROUGH AUGMENTED REALITY (granted Apr 14, 2026; 2y 5m to grant)
Patent 12602885: SIGNAL PROCESSING APPARATUS, CONTROL METHOD FOR SIGNAL PROCESSING APPARATUS, AND STORAGE MEDIUM (granted Apr 14, 2026; 2y 5m to grant)
Patent 12594834: TEMPERATURE BASED RESISTIVE BRAKING CAPACITY (granted Apr 07, 2026; 2y 5m to grant)
Patent 12586315: INFORMATION PROCESSING DEVICE, INFORMATION PROCESSING METHOD, AND PROGRAM (granted Mar 24, 2026; 2y 5m to grant)
Patent 12586324: SPINE LEVEL DETERMINATION USING AUGMENTED REALITY (granted Mar 24, 2026; 2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 77%
With Interview: 85% (+8.7%)
Median Time to Grant: 2y 11m
PTA Risk: Low
Based on 849 resolved cases by this examiner. Grant probability derived from career allow rate.
