Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Rejections - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claims 1-22 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.
Regarding claims 1, 9, and 16, the second recitation of “a trained complete selective classifier” renders the claim indefinite because it is unclear whether it refers back to the first recitation or whether another, distinct classifier is being claimed.
Regarding claims 1, 9, and 16, “the trained selective classifier” lacks antecedent basis and renders the claim indefinite because it is unclear if it refers to “a trained complete selective classifier” and, if so, to which one.
Regarding claims 2, 10, and 17, “the trained selective classifier” lacks antecedent basis and renders the claim indefinite because it is unclear if it refers to “a trained complete selective classifier” and, if so, to which one.
Regarding claims 3, 11, and 18, “the trained selective classifier” lacks antecedent basis and renders the claim indefinite because it is unclear if it refers to “a trained complete selective classifier” and, if so, to which one.
Regarding claim 5, “the trained selective classifier” lacks antecedent basis and renders the claim indefinite because it is unclear if it refers to “a trained complete selective classifier” and, if so, to which one.
Regarding claims 6, 13, and 20, “the trained selective classifier” lacks antecedent basis and renders the claim indefinite because it is unclear if it refers to “a trained complete selective classifier” and, if so, to which one.
Claims 2-8, 10-15, and 17-22 are indefinite by virtue of dependency on an indefinite claim.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1-22 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.
Claim 1
Step 1: Is the claim to a process, machine, manufacture, or composition of matter?
Yes. Claim 1 is drawn to a process.
Step 2A, Prong One: Does the claim recite an abstract idea, law of nature, or natural phenomenon?
Yes. Claim 1 recites the following abstract ideas:
“preparing a trained complete selective classifier” - This is an observation, evaluation, judgment, or opinion, i.e., a concept performed in the human mind. See MPEP 2106.04(a)(2), III.
“disregard the existing trained selection mechanism” - This is an observation, evaluation, judgment, or opinion, i.e., a concept performed in the human mind. See MPEP 2106.04(a)(2), III.
“use, as a basis for an alternate selection mechanism, at least one classification prediction value” - This is an observation, evaluation, judgment, or opinion, i.e., a concept performed in the human mind. See MPEP 2106.04(a)(2), III.
Step 2A, Prong Two: Does the claim recite additional elements that integrate the judicial exception into a practical application?
No. Claim 1 recites the following additional elements:
“for a trained complete selective classifier having an existing trained selection mechanism, modifying the trained selective classifier to:” - merely reciting the words “apply it” (or an equivalent) with the judicial exception, or merely including instructions to implement an abstract idea on a computer, or merely using a computer as a tool to perform an abstract idea. See MPEP 2106.05(f).
Step 2B: Does the claim recite additional elements that amount to significantly more than the judicial exception?
No. Claim 1 recites the following additional elements:
“for a trained complete selective classifier having an existing trained selection mechanism, modifying the trained selective classifier to:” - merely reciting the words “apply it” (or an equivalent) with the judicial exception, or merely including instructions to implement an abstract idea on a computer, or merely using a computer as a tool to perform an abstract idea. See MPEP 2106.05(f).
Claim 2
Step 1: Is the claim to a process, machine, manufacture, or composition of matter?
Yes. Claim 2 is drawn to a process.
Step 2A, Prong One: Does the claim recite an abstract idea, law of nature, or natural phenomenon?
Yes. Claim 2 recites the following abstract ideas:
“commencing with an untrained selective classifier” - This is an action performable by a human. See MPEP 2106.04(a)(2), III.
Step 2A, Prong Two: Does the claim recite additional elements that integrate the judicial exception into a practical application?
No. Claim 2 recites the following additional elements:
“training the untrained selective classifier with a modified loss function to obtain the trained selective classifier; wherein the modified loss function has at least one added term, relative to an original loss function, wherein the at least one added term decreases entropy:” - merely reciting the words “apply it” (or an equivalent) with the judicial exception, or merely including instructions to implement an abstract idea on a computer, or merely using a computer as a tool to perform an abstract idea. See MPEP 2106.05(f).
Step 2B: Does the claim recite additional elements that amount to significantly more than the judicial exception?
No. Claim 2 recites the following additional elements:
“training the untrained selective classifier with a modified loss function to obtain the trained selective classifier; wherein the modified loss function has at least one added term, relative to an original loss function, wherein the at least one added term decreases entropy:” - merely reciting the words “apply it” (or an equivalent) with the judicial exception, or merely including instructions to implement an abstract idea on a computer, or merely using a computer as a tool to perform an abstract idea. See MPEP 2106.05(f).
Claim 3
Step 1: Is the claim to a process, machine, manufacture, or composition of matter?
Yes. Claim 3 is drawn to a process.
Step 2A, Prong One: Does the claim recite an abstract idea, law of nature, or natural phenomenon?
Yes. Claim 3 recites the following abstract ideas:
“commencing with an untrained selective classifier” - This is an action performable by a human. See MPEP 2106.04(a)(2), III.
Step 2A, Prong Two: Does the claim recite additional elements that integrate the judicial exception into a practical application?
No. Claim 3 recites the following additional elements:
“training the untrained selective classifier with an original loss function for the selective classifier to obtain the trained selective classifier” - merely reciting the words “apply it” (or an equivalent) with the judicial exception, or merely including instructions to implement an abstract idea on a computer, or merely using a computer as a tool to perform an abstract idea. See MPEP 2106.05(f).
Step 2B: Does the claim recite additional elements that amount to significantly more than the judicial exception?
No. Claim 3 recites the following additional elements:
“training the untrained selective classifier with an original loss function for the selective classifier to obtain the trained selective classifier” - merely reciting the words “apply it” (or an equivalent) with the judicial exception, or merely including instructions to implement an abstract idea on a computer, or merely using a computer as a tool to perform an abstract idea. See MPEP 2106.05(f).
Claim 4
Step 1: Is the claim to a process, machine, manufacture, or composition of matter?
Yes. Claim 4 is drawn to a process.
Step 2A, Prong One: Does the claim recite an abstract idea, law of nature, or natural phenomenon?
Yes. Claim 4 recites the following abstract ideas:
“uses, as the basis for the alternate selection mechanism, one of (i) predictive entropy for classification or (ii) maximum predictive class logit” - This is an observation, evaluation, judgment, or opinion, i.e., a concept performed in the human mind. See MPEP 2106.04(a)(2), III.
Step 2A, Prong Two: Does the claim recite additional elements that integrate the judicial exception into a practical application?
No. Claim 4 does not recite additional elements.
Step 2B: Does the claim recite additional elements that amount to significantly more than the judicial exception?
No. Claim 4 does not recite additional elements.
Claim 5
Step 1: Is the claim to a process, machine, manufacture, or composition of matter?
Yes. Claim 5 is drawn to a process.
Step 2A, Prong One: Does the claim recite an abstract idea, law of nature, or natural phenomenon?
Yes. Claim 5 recites the following abstract ideas:
The abstract ideas of the parent claim.
Step 2A, Prong Two: Does the claim recite additional elements that integrate the judicial exception into a practical application?
No. Claim 5 recites the following additional elements:
“receiving the trained selective classifier” - Adding insignificant extra-solution activity to the judicial exception, e.g., mere data gathering in conjunction with a law of nature or abstract idea does not integrate the abstract idea(s) into a practical application or provide significantly more. See MPEP 2106.05(g).
Step 2B: Does the claim recite additional elements that amount to significantly more than the judicial exception?
No. Claim 5 recites the following additional elements:
“receiving the trained selective classifier” - Adding insignificant extra-solution activity to the judicial exception, e.g., mere data gathering in conjunction with a law of nature or abstract idea does not integrate the abstract idea(s) into a practical application or provide significantly more. See MPEP 2106.05(g).
Claim 6
Step 1: Is the claim to a process, machine, manufacture, or composition of matter?
Yes. Claim 6 is drawn to a process.
Step 2A, Prong One: Does the claim recite an abstract idea, law of nature, or natural phenomenon?
Yes. Claim 6 recites the following abstract ideas:
The abstract ideas of the parent claim.
Step 2A, Prong Two: Does the claim recite additional elements that integrate the judicial exception into a practical application?
No. Claim 6 does not recite additional elements.
Step 2B: Does the claim recite additional elements that amount to significantly more than the judicial exception?
No. Claim 6 does not recite additional elements.
Claim 7
Step 1: Is the claim to a process, machine, manufacture, or composition of matter?
Yes. Claim 7 is drawn to a process.
Step 2A, Prong One: Does the claim recite an abstract idea, law of nature, or natural phenomenon?
Yes. Claim 7 recites the following abstract ideas:
“uses a value of an abstention logit” - This is an observation, evaluation, judgment, or opinion, i.e., a concept performed in the human mind. See MPEP 2106.04(a)(2), III.
“ignores the abstention logit” - This is an observation, evaluation, judgment, or opinion, i.e., a concept performed in the human mind. See MPEP 2106.04(a)(2), III.
Step 2A, Prong Two: Does the claim recite additional elements that integrate the judicial exception into a practical application?
No. Claim 7 recites the following additional elements:
“the existing trained selection mechanism” - merely reciting the words “apply it” (or an equivalent) with the judicial exception, or merely including instructions to implement an abstract idea on a computer, or merely using a computer as a tool to perform an abstract idea. See MPEP 2106.05(f).
“the alternate selection mechanism” - merely reciting the words “apply it” (or an equivalent) with the judicial exception, or merely including instructions to implement an abstract idea on a computer, or merely using a computer as a tool to perform an abstract idea. See MPEP 2106.05(f).
Step 2B: Does the claim recite additional elements that amount to significantly more than the judicial exception?
No. Claim 7 recites the following additional elements:
“the existing trained selection mechanism” - merely reciting the words “apply it” (or an equivalent) with the judicial exception, or merely including instructions to implement an abstract idea on a computer, or merely using a computer as a tool to perform an abstract idea. See MPEP 2106.05(f).
“the alternate selection mechanism” - merely reciting the words “apply it” (or an equivalent) with the judicial exception, or merely including instructions to implement an abstract idea on a computer, or merely using a computer as a tool to perform an abstract idea. See MPEP 2106.05(f).
Claim 8
Step 1: Is the claim to a process, machine, manufacture, or composition of matter?
Yes. Claim 8 is drawn to a process.
Step 2A, Prong One: Does the claim recite an abstract idea, law of nature, or natural phenomenon?
Yes. Claim 8 recites the following abstract ideas:
The abstract ideas of the parent claim.
Step 2A, Prong Two: Does the claim recite additional elements that integrate the judicial exception into a practical application?
No. Claim 8 does not recite additional elements.
Step 2B: Does the claim recite additional elements that amount to significantly more than the judicial exception?
No. Claim 8 does not recite additional elements.
Claim 9
Step 1: Is the claim to a process, machine, manufacture, or composition of matter?
Yes. Claim 9 is drawn to a machine.
Step 2A, Prong One: Does the claim recite an abstract idea, law of nature, or natural phenomenon?
Yes. Claim 9 recites the following abstract ideas:
Claim 9 recites the same abstract ideas as claim 1.
Step 2A, Prong Two: Does the claim recite additional elements that integrate the judicial exception into a practical application?
No. Claim 9 recites the following additional elements:
“a data processing system comprising at least one processor and memory coupled to the processor, wherein the memory contains instructions which, when implemented by the at least one processor, cause the processor to implement a method” - merely reciting the words “apply it” (or an equivalent) with the judicial exception, or merely including instructions to implement an abstract idea on a computer, or merely using a computer as a tool to perform an abstract idea. See MPEP 2106.05(f).
Step 2B: Does the claim recite additional elements that amount to significantly more than the judicial exception?
No. Claim 9 recites the following additional elements:
“a data processing system comprising at least one processor and memory coupled to the processor, wherein the memory contains instructions which, when implemented by the at least one processor, cause the processor to implement a method” - merely reciting the words “apply it” (or an equivalent) with the judicial exception, or merely including instructions to implement an abstract idea on a computer, or merely using a computer as a tool to perform an abstract idea. See MPEP 2106.05(f).
Claim 16
Step 1: Is the claim to a process, machine, manufacture, or composition of matter?
Yes. Claim 16 is drawn to a machine.
Step 2A, Prong One: Does the claim recite an abstract idea, law of nature, or natural phenomenon?
Yes. Claim 16 recites the following abstract ideas:
Claim 16 recites the same abstract ideas as claim 1.
Step 2A, Prong Two: Does the claim recite additional elements that integrate the judicial exception into a practical application?
No. Claim 16 recites the following additional elements:
“a computer program product comprising tangible non-transitory computer-readable media containing instructions which, when executed by at least one processor of a computer, cause the computer to implement a method” - merely reciting the words “apply it” (or an equivalent) with the judicial exception, or merely including instructions to implement an abstract idea on a computer, or merely using a computer as a tool to perform an abstract idea. See MPEP 2106.05(f).
Step 2B: Does the claim recite additional elements that amount to significantly more than the judicial exception?
No. Claim 16 recites the following additional elements:
“a computer program product comprising tangible non-transitory computer-readable media containing instructions which, when executed by at least one processor of a computer, cause the computer to implement a method” - merely reciting the words “apply it” (or an equivalent) with the judicial exception, or merely including instructions to implement an abstract idea on a computer, or merely using a computer as a tool to perform an abstract idea. See MPEP 2106.05(f).
Claims 10-15 and 17-22
Mutatis mutandis, claims 10-15 and 17-22 are ineligible for the same reasons as claims 2-8.
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claims 1-6 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Geifman (Geifman Y, El-Yaniv R. SelectiveNet: a deep neural network with an integrated reject option. In International Conference on Machine Learning, 2019 May 24 (pp. 2151-2159). PMLR).
Regarding claim 1, Geifman teaches a method for preparing a trained complete selective classifier (§7.1, “We trained a version of SelectiveNet whose main body block is based on the VGG-16 architecture”), the method comprising:
for a trained complete selective classifier having an existing trained selection mechanism (§7.1, “We trained a version of SelectiveNet whose main body block is based on the VGG-16 architecture”), modifying the trained selective classifier to: disregard the existing trained selection mechanism; and use, as a basis for an alternate selection mechanism, at least one classification prediction value (§7.1, “The standard classifiers were constructed using the well-known SR and MC-dropout confidence estimates applied to a trained network consisting of a multi-class head (identical to f in SelectiveNet) on top of the same main body block of SelectiveNet.”).
To clarify, Geifman discloses the SelectiveNet architecture and, alongside it, a trained network consisting of a multi-class head (identical to f in SelectiveNet) on top of the same main body block of SelectiveNet; see Figure 1. Because the two architectures share the main body block of SelectiveNet, using an SR classifier on top of that block in lieu of the selection head of SelectiveNet constitutes disregarding the trained selection mechanism and using the SR instead, where “SR” is the softmax response (see §6.2), as sketched below.
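For illustration only, the following is a minimal sketch, assuming a PyTorch-style implementation, of the modification described above: the trained selection head is ignored and the softmax response (SR) over the outputs of the multi-class head serves as the alternate selection mechanism. The names body, class_head, and the threshold tau are hypothetical placeholders, not Geifman's code.

```python
# Hypothetical sketch (not Geifman's code): accept or abstain using the
# softmax response over the classification head, ignoring the trained
# selection head entirely.
import torch
import torch.nn.functional as F

def sr_selective_predict(body, class_head, x, tau=0.9):
    """Classify x, abstaining when the maximum softmax value falls below tau."""
    features = body(x)                    # shared SelectiveNet main body block
    logits = class_head(features)         # multi-class head f; selection head g is disregarded
    probs = F.softmax(logits, dim=-1)
    confidence, prediction = probs.max(dim=-1)  # SR = maximum softmax activation
    accept = confidence >= tau            # alternate selection mechanism
    return prediction, accept
```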
Regarding claim 2, Geifman teaches all of the limitations of claim 1, further comprising, before modifying the trained selective classifier:
commencing with an untrained selective classifier; training the untrained selective classifier with a modified loss function to obtain the trained selective classifier (§7.1, “We trained a version of SelectiveNet whose main body block is based on the VGG-16 architecture”, §4.2, the overall training objective is a convex combination of selective loss and auxiliary loss); wherein the modified loss function has at least one added term, relative to an original loss function, wherein the at least one added term decreases entropy (§4.2, the auxiliary loss can be considered an added term relative to an “original” loss function. Since the use of the auxiliary head prevents overfitting, it reduces entropy).
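For context, the overall training objective of Geifman §4.2, a convex combination of the selective loss and an auxiliary loss, can be sketched as follows. This is an illustrative reconstruction under stated assumptions: the coefficients alpha and lam and the target coverage c follow the notation of the paper, while the function signature and tensor shapes are hypothetical.

```python
# Hypothetical sketch of a SelectiveNet-style objective (after Geifman §4.2):
# a convex combination of the selective loss, with a quadratic coverage
# penalty, and an auxiliary cross-entropy loss. Not Geifman's code.
import torch
import torch.nn.functional as F

def selectivenet_loss(class_logits, select_probs, aux_logits, targets,
                      c=0.8, alpha=0.5, lam=32.0):
    ce = F.cross_entropy(class_logits, targets, reduction="none")
    coverage = select_probs.mean()                          # empirical coverage
    selective_risk = (ce * select_probs).mean() / coverage  # risk over accepted inputs
    penalty = lam * torch.clamp(c - coverage, min=0) ** 2   # coverage constraint penalty
    aux = F.cross_entropy(aux_logits, targets)              # auxiliary head term
    return alpha * (selective_risk + penalty) + (1 - alpha) * aux
```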
Regarding claim 3, Geifman teaches all of the limitations of claim 1, further comprising,
before modifying the trained selective classifier: commencing with an untrained selective classifier; and training the untrained selective classifier with an original loss function for the selective classifier to obtain the trained selective classifier (§7.1, “We trained a version of SelectiveNet whose main body block is based on the VGG-16 architecture”, §4.2, Equation (2) shows the loss function).
Regarding claim 4, Geifman teaches all of the limitations of claim 1, wherein the method uses, as the basis for the alternate selection mechanism, maximum predictive class logit (§6.2, “Softmax Response”, “…maximum softmax value activation…”).
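Illustrative only: the selection bases recited in claim 4 are computable directly from the classification prediction values, and the maximum softmax activation of Geifman's SR selects the same class as the maximum logit. The values below are hypothetical.

```python
# Hypothetical sketch of the selection bases recited in claim 4. Not from Geifman.
import torch

logits = torch.tensor([2.0, 0.5, -1.0])             # classification prediction values
probs = torch.softmax(logits, dim=0)
predictive_entropy = -(probs * probs.log()).sum()   # basis (i): lower means more confident
max_class_logit = logits.max()                      # basis (ii): maximum predictive class logit
max_softmax = probs.max()                           # Geifman's SR; same argmax as basis (ii)
```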
Regarding claim 5, Geifman teaches all of the limitations of claim 1, further comprising
before modifying the trained selective classifier: receiving the trained selective classifier (this is inherent: the classifier cannot be modified unless it has first been received; something that is not present cannot be modified).
Regarding claim 6, Geifman teaches all of the limitations of claim 1, wherein
the trained selective classifier is a SelectiveNet network; the existing trained selection mechanism is a selection head (see Figure 1).
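For reference, the architecture cited above (Geifman, Figure 1) pairs a shared main body block with a prediction head f, a selection head g, and an auxiliary head h. The sketch below is a minimal, hypothetical rendering; the layer shapes are assumptions, not the exact configuration of the paper.

```python
# Hypothetical sketch of the three-headed SelectiveNet layout (Geifman, Fig. 1).
import torch.nn as nn

class SelectiveNetSketch(nn.Module):
    def __init__(self, body, feat_dim, num_classes):
        super().__init__()
        self.body = body                           # e.g., a VGG-16 main body block
        self.f = nn.Linear(feat_dim, num_classes)  # prediction head
        self.g = nn.Sequential(nn.Linear(feat_dim, 1), nn.Sigmoid())  # selection head
        self.h = nn.Linear(feat_dim, num_classes)  # auxiliary head (used during training)

    def forward(self, x):
        z = self.body(x)
        return self.f(z), self.g(z), self.h(z)
```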
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.
Claims 1-6, 9-13, and 16-20 are rejected under 35 U.S.C. 103 as being unpatentable over Geifman (Geifman Y, El-Yaniv R. SelectiveNet: a deep neural network with an integrated reject option. In International Conference on Machine Learning, 2019 May 24 (pp. 2151-2159). PMLR).
Regarding claim 1, Geifman teaches a method for preparing a trained complete selective classifier (§7.1, “We trained a version of SelectiveNet whose main body block is based on the VGG-16 architecture”), the method comprising:
for a trained complete selective classifier having an existing trained selection mechanism (§7.1, “We trained a version of SelectiveNet whose main body block is based on the VGG-16 architecture”), modifying the trained selective classifier to: disregard the existing trained selection mechanism; and use, as a basis for an alternate selection mechanism, at least one classification prediction value (§7.1, “The standard classifiers were constructed using the well-known SR and MC-dropout confidence estimates applied to a trained network consisting of a multi-class head (identical to f in SelectiveNet) on top of the same main body block of SelectiveNet.”).
To clarify, Geifman discloses the SelectiveNet architecture and, alongside it, a trained network consisting of a multi-class head (identical to f in SelectiveNet) on top of the same main body block of SelectiveNet; see Figure 1. Because the two architectures share the main body block of SelectiveNet, using an SR classifier on top of that block in lieu of the selection head of SelectiveNet constitutes disregarding the trained selection mechanism and using the SR instead, where “SR” is the softmax response (see §6.2).
Arguendo, assume that the above disclosure is not considered to encompass “modifying the trained selective classifier to: disregard the existing trained selection mechanism; and use, as a basis for an alternate selection mechanism, at least one classification prediction value”.
Geifman discloses the SelectiveNet architecture and, alongside it, a trained network consisting of a multi-class head (identical to f in SelectiveNet) on top of the same main body block of SelectiveNet (see Figure 1), with the SR used instead, where “SR” is the softmax response (see §6.2). Furthermore, Geifman discloses that on the SVHN dataset, SR provides a 0.77 percent improvement over SelectiveNet at a coverage of 0.95 (see Table 3).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to alter the methodology of Geifman to include “modifying the trained selective classifier to: disregard the existing trained selection mechanism; and use, as a basis for an alternate selection mechanism, at least one classification prediction value” in order to ensure adequate performance across each coverage level for the SVHN dataset, in particular at a coverage of 0.95, while still maintaining SelectiveNet for other coverage values, e.g., 0.80.
Regarding claim 2, Geifman as modified teaches all of the limitations of claim 1, further comprising, before modifying the trained selective classifier:
commencing with an untrained selective classifier; training the untrained selective classifier with a modified loss function to obtain the trained selective classifier (§7.1, “We trained a version of SelectiveNet whose main body block is based on the VGG-16 architecture”, §4.2, the overall training objective is a convex combination of selective loss and auxiliary loss); wherein the modified loss function has at least one added term, relative to an original loss function, wherein the at least one added term decreases entropy (§4.2, the auxiliary loss can be considered an added term relative to an “original” loss function. Since the use of the auxiliary head prevents overfitting, it reduces entropy).
Regarding claim 3, Geifman as modified teaches all of the limitations of claim 1, further comprising,
before modifying the trained selective classifier: commencing with an untrained selective classifier; and training the untrained selective classifier with an original loss function for the selective classifier to obtain the trained selective classifier (§7.1, “We trained a version of SelectiveNet whose main body block is based on the VGG-16 architecture”, §4.2, Equation (2) shows the loss function).
Regarding claim 4, Geifman as modified teaches all of the limitations of claim 1, wherein the method uses, as the basis for the alternate selection mechanism, maximum predictive class logit (§6.2, “Softmax Response”, “…maximum softmax value activation…”).
Regarding claim 5, Geifman as modified teaches all of the limitations of claim 1, further comprising
before modifying the trained selective classifier: receiving the trained selective classifier (this is inherent: the classifier cannot be modified unless it has first been received; something that is not present cannot be modified).
Regarding claim 6, Geifman as modified teaches all of the limitations of claim 1, wherein
the trained selective classifier is a SelectiveNet network; the existing trained selection mechanism is a selection head (see Figure 1).
Regarding claims 9-13, Geifman, as applied to claims 1-4 and 6 above, teaches the claimed methods of claims 9-13, but does not explicitly recite a data processing system comprising at least one processor and memory coupled to the processor, wherein the memory contains instructions which, when implemented by the at least one processor, cause the at least one processor to implement the method.
The Examiner takes Official Notice that it is old and well known to utilize a data processing system comprising at least one processor and memory coupled to the processor, wherein the memory contains instructions which, when implemented by the at least one processor, cause the at least one processor to implement a machine learning method in order to efficiently perform repeated calculations.
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Geifman to provide a data processing system comprising at least one processor and memory coupled to the processor, wherein the memory contains instructions which, when implemented by the at least one processor, cause the at least one processor to implement the method of claims 1-4 and 6 in order to efficiently perform the calculations necessary to produce the results in Geifman.
Regarding claims 16-20, Geifman, as applied to claims 1-4 and 6 above, teaches the claimed methods of claims 16-20, but does not explicitly recite a computer program product comprising tangible non-transitory computer-readable media containing instructions which, when executed by at least one processor of a computer, cause the computer to implement the method.
The Examiner takes Official Notice that it is old and well known to utilize a computer program product comprising tangible non-transitory computer-readable media containing instructions which, when executed by at least one processor of a computer, cause the computer to implement a machine learning method in order to efficiently perform repeated calculations.
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Geifman to provide a computer program product comprising tangible non-transitory computer-readable media containing instructions which, when executed by at least one processor of a computer, cause the computer to implement the method of claims 1-4 and 6 in order to efficiently perform the calculations necessary to produce the results in Geifman.
Claims 7-8, 14-15, and 21-22 are rejected under 35 U.S.C. 103 as being unpatentable over Geifman (Geifman Y, El-Yaniv R. SelectiveNet: a deep neural network with an integrated reject option. In International Conference on Machine Learning, 2019 May 24 (pp. 2151-2159). PMLR) in view of Huang (Huang L, Zhang C, Zhang H. Self-adaptive training: beyond empirical risk minimization. Advances in Neural Information Processing Systems 33 (2020): 19365-19376).
Regarding claims 7-8, Geifman, or in the alternative Geifman as modified, teaches all of the limitations of claim 1, but does not teach “wherein: the existing trained selection mechanism uses a value of an abstention logit; and the alternate selection mechanism ignores the abstention logit” (claim 7) or “wherein the trained selective classifier is one of (i) a Self-Adaptive Training network or (ii) a Deep Gamblers network” (claim 8).
Huang teaches a Self-Adaptive Training network for selective classification that uses a value of an abstention logit (§4.2), which allows a selective classifier to be trained in an end-to-end fashion and improves performance (§§4.2-4.3 and Table 4).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to begin with a Self-Adaptive Training network as the selective classifier in order to improve performance, thereby providing “the existing trained selection mechanism uses a value of an abstention logit; and the alternate selection mechanism ignores the abstention logit” and “the trained selective classifier is one of (i) a Self-Adaptive Training network or (ii) a Deep Gamblers network”, as sketched below.
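A minimal, hypothetical sketch of the combined teaching follows: a Self-Adaptive-Training-style classifier emits K class logits plus one abstention logit (Huang §4.2); the existing mechanism abstains via the abstention logit, while the claimed alternate mechanism ignores it and thresholds the class predictions instead. The threshold tau and all names are assumptions, not code from either reference.

```python
# Hypothetical sketch, not Huang's code: selection with and without the
# abstention logit emitted by a Self-Adaptive-Training-style classifier.
import torch
import torch.nn.functional as F

def existing_mechanism(logits):
    """Abstain when the abstention logit (last entry) is the largest."""
    return logits.argmax(dim=-1) == logits.shape[-1] - 1

def alternate_mechanism(logits, tau=0.9):
    """Ignore the abstention logit; threshold the softmax response over the K classes."""
    class_logits = logits[..., :-1]                        # drop the abstention logit
    confidence, _ = F.softmax(class_logits, dim=-1).max(dim=-1)
    return confidence < tau                                # True means abstain
```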
Regarding claims 14-15, Geifman as modified by Huang, as applied to claims 7-8 above, teaches the system of claims 14-15.
Regarding claims 21-22, Geifman as modified by Huang, as applied to claims 7-8 above, teaches the computer program product of claims 21-22.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to SCHYLER S SANKS whose telephone number is (571)272-6125. The examiner can normally be reached 06:30 - 15:30 Central Time, M-F.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Michael Huntley, can be reached at (303) 297-4307. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/SCHYLER S SANKS/Primary Examiner, Art Unit 2129