Prosecution Insights
Last updated: April 19, 2026
Application No. 18/728,730

METHOD FOR PROCESSING A TRANSACTION, SYSTEM AND CORRESPONDING PROGRAM

Final Rejection: §101, §103
Filed
Jul 12, 2024
Examiner
YONO, RAVEN E
Art Unit
3694
Tech Center
3600 — Transportation & Electronic Commerce
Assignee
Ingenico Belgium
OA Round
2 (Final)
Grant Probability: 39% (At Risk)
OA Rounds: 3-4
To Grant: 2y 6m
With Interview: 72%

Examiner Intelligence

Grants only 39% of cases
Career Allow Rate: 39% (69 granted / 175 resolved; -12.6% vs TC avg)
Strong +32% interview lift
Interview Lift: +32.5% (resolved cases with interview)
Typical timeline
Avg Prosecution: 2y 6m (32 currently pending)
Career history
Total Applications: 207 (across all art units)
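The card metrics above reduce to simple ratios over resolved cases. A minimal sketch of the arithmetic (Python; the 69/175 counts come from the card above, while the 72% with-interview allowance rate is taken from the prediction panel and is assumed comparable to the career base, since the underlying with/without case counts are not shown):

```python
def allow_rate(granted: int, resolved: int) -> float:
    """Allowance rate as a percentage of resolved cases."""
    return 100.0 * granted / resolved

# Career figures from the card: 69 granted out of 175 resolved.
career = allow_rate(69, 175)
print(f"Career allow rate: {career:.1f}%")  # 39.4%

# Interview lift: allowance rate with an interview minus the career base.
# 72.0 is the with-interview rate shown in the prediction panel.
lift = 72.0 - career
print(f"Interview lift: {lift:+.1f} points")
```

Small differences against the displayed +32.5% figure are expected, since the exact with/without-interview split behind that number is not shown.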

Statute-Specific Performance

§101: 40.5% (+0.5% vs TC avg)
§103: 31.3% (-8.7% vs TC avg)
§102: 3.0% (-37.0% vs TC avg)
§112: 19.9% (-20.1% vs TC avg)
Black line = Tech Center average estimate • Based on career data from 175 resolved cases
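Each row of the statute chart pairs the examiner's rate with a delta against the Tech Center average, so the baseline can be back-solved from the two displayed numbers. A small sketch (the baselines here are implied by the stated deltas, not independently published TC figures):

```python
# (examiner rate %, delta vs TC avg) per statute, from the chart above.
stats = {
    "§101": (40.5, +0.5),
    "§103": (31.3, -8.7),
    "§102": (3.0, -37.0),
    "§112": (19.9, -20.1),
}

for statute, (rate, delta) in stats.items():
    baseline = rate - delta  # implied Tech Center average for this statute
    print(f"{statute}: {rate:4.1f}% vs TC avg ~{baseline:.1f}% ({delta:+.1f})")
```

Every row back-solves to the same ~40% baseline, consistent with the chart using a single Tech Center average estimate (the "black line") rather than per-statute baselines.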

Office Action

§101 §103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Status of Claims

• This action is in reply to the amendments filed on January 20, 2026.
• Claims 1 and 11-12 have been amended and are hereby entered.
• Claim 3 has been canceled.
• Claims 1-2 and 4-12 are currently pending and have been examined.
• This action is made FINAL.

Priority

Receipt is acknowledged of certified copies of papers required by 37 CFR 1.55.

Response to Arguments

Applicant’s arguments filed January 20, 2026 have been fully considered but they are not persuasive. The Examiner is withdrawing the 35 USC § 112 rejections due to Applicant’s amendments. The Examiner is withdrawing the 35 USC § 102 rejections due to Applicant’s amendments. New 35 USC § 103 rejections have been entered due to Applicant’s amendments.

Applicant’s arguments with respect to 35 USC § 101 have been fully considered and are not persuasive. Regarding Applicant’s argument on pages 9-10, that the claims do not recite an abstract idea, the Examiner respectfully disagrees. As indicated in the 35 USC § 101 rejection below, the claimed invention allows for determining whether a user is in an insecure environment while transacting. The Specification at pages 6-7 states:

“Among the devices often used to enter sensitive information, payment terminals are considered secure because they are used to process payments. The new terminals can also be used for other purposes, depending on the merchant's business analysis. Thus, the terminal may be used to execute value-added services or commercial applications on which personal information may be requested. Even if the data provided to the terminal is processed securely in the terminal itself, one cannot be sure that the user is not being spied on when entering their personal data.
For example, entering a PIN code is a critical step during a payment transaction at a payment terminal. During this entry phase, the attacker looks over the user's shoulder to see what digits are entered when entering the PIN. Thanks to the implemented technique, we proactively signal to the user, the customer or even the merchant that the surrounding physical context is potentially risky, in order to allow them to make the decision whether or not to continue data entry. For a payment, the processing of the transaction can be automatically stopped when a change in the environment surrounding the terminal (for example the payment terminal) is noted.”

The Specification and claims focus on an improvement to determining whether a user is in an insecure environment while performing a transaction, which is a fundamental economic principle or practice of mitigating risk and a commercial and legal interaction including sales activities or behaviors, which falls within the category of Certain Methods of Organizing Human Activity and therefore is an abstract idea.

Regarding Applicant’s arguments on pages 9-10, that the Examiner does not analyze claim language or identify the recitation of the abstract idea in the claims, the Examiner respectfully disagrees. As an initial matter, the Examiner notes that each claim was given the full and proper analysis under the test set forth by the Supreme Court and the Patent Subject Matter Eligibility analysis (see MPEP 2106).
Furthermore, the Examiner notes that the 101 rejection below does analyze claim language and identifies the recitation of the abstract idea. In the instant application, the claims recite a method for processing a transaction including a provision, by a user, of sensitive data; initializing the transaction requiring entry of the sensitive data; obtaining, after initialization, data representative of an environment of the user; analyzing the data previously obtained; and modifying an implementation context of the transaction in response to the analyzing of the data delivering a result representative of an unsecured environment; wherein the obtaining comprises obtaining at least one data representative of an environment; and the analyzing comprises, in response to the data representative of the environment being not in compliance with an expected value, providing the result representative of an unsecure environment. The claimed invention allows for determining whether a user is in an insecure environment while transacting, which is a fundamental economic principle or practice of mitigating risk and a commercial and legal interaction including sales activities or behaviors.

Regarding Applicant’s arguments on pages 10-11, that the claims do not recite fundamental economic principles or practices, the Examiner respectfully disagrees. In response to Applicant’s reliance upon the examples of fundamental economic principles or practices in the MPEP, the Examiner notes that the cases listed in the MPEP are meant to be examples and are not meant to be an exhaustive list; nor are the cases meant to limit what is considered a fundamental economic practice. Furthermore, the Supreme Court expressly declined to limit the categories of unpatentable abstract ideas. The Supreme Court’s opinion in Alice states that these enumerated categories were not intended as exclusive.
Specifically, the Court wrote: “In any event, we need not labor to delimit the precise contours of the ‘abstract ideas’ category in this case.” Alice, 134 S. Ct. at 2357; accord Content Extraction, 776 F.3d at 1347. While the instant application may or may not be analogous to the decisions listed by the Applicant, the claims of the instant application recite mitigating risk. The claims, therefore, recite a fundamental economic practice.

Regarding Applicant’s arguments on pages 11-12, that the claims integrate a practical application, the Examiner respectfully disagrees. Under the Patent Subject Matter Eligibility analysis, Step 2A, Prong Two, integration into a practical application requires an additional element or combination of additional elements in the claim to apply, rely on, or use the judicial exception in a manner that imposes a meaningful limit on the judicial exception, such that the claim is more than a drafting effort designed to monopolize the exception. Limitations that are not indicative of integration into a practical application are those that are mere instructions to implement an abstract idea on a computer, or that merely use a computer as a tool to perform an abstract idea (see MPEP 2106.05(f)). Here, the claims recite a digital terminal; the digital terminal comprising at least one processor and at least one non-transitory computer readable medium comprising instructions stored thereon which when executed by the processor configure the digital terminal to perform claim functions; a non-transitory computer readable medium comprising a computer program stored therein comprising computer code instructions for implementing a method; an image pickup device connected to the terminal; multimedia data; sensors of the digital terminal; and an electric radio environment of the digital terminal, such that they amount to no more than mere instructions to apply the exception using generic computer components (see MPEP 2106.05(f)).
Furthermore, and in response to Applicant’s arguments on page 12 where Applicant argues an improvement to technology: in determining whether a claim integrates a judicial exception into a practical application, a determination is made of whether the claimed invention pertains to an improvement in the functioning of the computer itself or any other technology or technical field (i.e., a technological solution to a technological problem). Here, the claims recite generic computer components, i.e., a generic processor, a memory storing a computer program executable by the processor to perform the claimed method steps, and system functions. The processor, memory and system are recited at a high level of generality and are recited as performing generic computer functions customarily used in computer applications. Furthermore, the Specification describes a problem and improvement to a business or commercial process at least at pages 6-7, stating:

“Among the devices often used to enter sensitive information, payment terminals are considered secure because they are used to process payments. The new terminals can also be used for other purposes, depending on the merchant's business analysis. Thus, the terminal may be used to execute value-added services or commercial applications on which personal information may be requested. Even if the data provided to the terminal is processed securely in the terminal itself, one cannot be sure that the user is not being spied on when entering their personal data. For example, entering a PIN code is a critical step during a payment transaction at a payment terminal. During this entry phase, the attacker looks over the user's shoulder to see what digits are entered when entering the PIN. Thanks to the implemented technique, we proactively signal to the user, the customer or even the merchant that the surrounding physical context is potentially risky, in order to allow them to make the decision whether or not to continue data entry.
For a payment, the processing of the transaction can be automatically stopped when a change in the environment surrounding the terminal (for example the payment terminal) is noted.”

Regarding Applicant’s arguments on page 12, that the claims recite significantly more than the abstract idea, the Examiner respectfully disagrees. The limitations are directed to an abstract idea, and when determining if the claims are directed to significantly more, the additional limitations of the claims beyond the abstract idea are analyzed. In the instant application, the additional elements of the claims include a digital terminal; the digital terminal comprising at least one processor and at least one non-transitory computer readable medium comprising instructions stored thereon which when executed by the processor configure the digital terminal to perform claim functions; a non-transitory computer readable medium comprising a computer program stored therein comprising computer code instructions for implementing a method; an image pickup device connected to the terminal; multimedia data; sensors of the digital terminal; and an electric radio environment of the digital terminal. The additional limitations, when considered both individually and in combination, do not effect an improvement to another technology or technological field; the claims do not amount to an improvement to the functioning of the computer itself; and the claims do not move beyond a general link of use of an abstract idea to a particular technological environment. Therefore, the claims merely amount to the application of, or instructions to apply, the abstract idea using a computer, and are considered to amount to nothing more than requiring a generic computer merely to carry out the abstract idea itself. The specifics of the abstract idea do not overcome the rejection. The claims are not patent eligible.

Applicant’s arguments with respect to 35 USC § 103 have been fully considered and are not persuasive.
Regarding Applicant’s argument on pages 13-14, that the cited art of record does not teach the obtaining comprises obtaining, from sensors of the digital terminal, at least one data representative of an electric radio environment of the digital terminal, and the analyzing comprises, in response to the data representative of the electric radio environment being not in compliance with an expected value, providing the result representative of an unsecured environment, the Examiner respectfully disagrees. As discussed in the 103 rejection below, Singh discloses that the obtaining comprises obtaining, from sensors of the digital terminal, at least one data representative of an environment of the digital terminal, and that the analyzing comprises, in response to the data representative of the environment being not in compliance with an expected value, providing the result representative of an unsecured environment, at least at [0022], describing identifying a person other than the user and viewing position identifiers including a person identifier function for identifying one or more persons captured in the image, and at least at [0047]-[0048], describing, in response to detecting more than one person captured, obfuscating the screen. And Neves discloses a radio environment at least at [0119], describing transmitting data via radio. The cited art of record therefore teaches this limitation.

Regarding Applicant’s arguments on pages 14-15, that the Examiner ignores the feature of claim 7, the Examiner respectfully disagrees. Claim 7 recites method steps including the step of “when such data are already present in the database for the current face, incrementing a counter relating to these characteristics for this current face.” In the scenario of a face not being present in the database, this step would not be completed. Therefore this step is optional.
The Examiner notes that if the claims were to state, for example, “determining that data is already present in the database for the current face, and in response to determining that data is already present in the database for the current face, incrementing a counter relating to these characteristics for this current face,” then the step would be required and therefore would have patentable weight. For the reasons above, Applicant’s arguments are not persuasive.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows: Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 1-2 and 4-12 are rejected under 35 U.S.C. 101 because the claimed invention recites an abstract idea without significantly more. Independent claims 1, 11, and 12 are directed to a method (claim 1) and an apparatus (claims 11 and 12). Therefore, on its face, each of independent claims 1, 11, and 12 is directed to a statutory category of invention under Step 1 of the Patent Subject Matter Eligibility analysis (see MPEP 2106.03).

Under Step 2A, Prong One of the Patent Subject Matter Eligibility analysis (see MPEP 2106.04), claims 1, 11, and 12 recite, in part, a system, a method, and an apparatus of organizing human activity.
Using the limitations in claim 1 to illustrate, the claim recites a method for processing a transaction including a provision, by a user, of sensitive data; initializing the transaction requiring entry of the sensitive data; obtaining, after initialization, data representative of an environment of the user; analyzing the data previously obtained; and modifying an implementation context of the transaction in response to the analyzing of the data delivering a result representative of an unsecured environment; wherein the obtaining comprises obtaining at least one data representative of an environment; and the analyzing comprises, in response to the data representative of the environment being not in compliance with an expected value, providing the result representative of an unsecure environment.

The Specification at pages 6-7 states: “Among the devices often used to enter sensitive information, payment terminals are considered secure because they are used to process payments. The new terminals can also be used for other purposes, depending on the merchant's business analysis. Thus, the terminal may be used to execute value-added services or commercial applications on which personal information may be requested. Even if the data provided to the terminal is processed securely in the terminal itself, one cannot be sure that the user is not being spied on when entering their personal data. For example, entering a PIN code is a critical step during a payment transaction at a payment terminal. During this entry phase, the attacker looks over the user's shoulder to see what digits are entered when entering the PIN. Thanks to the implemented technique, we proactively signal to the user, the customer or even the merchant that the surrounding physical context is potentially risky, in order to allow them to make the decision whether or not to continue data entry.
For a payment, the processing of the transaction can be automatically stopped when a change in the environment surrounding the terminal (for example the payment terminal) is noted.”

The limitations, as drafted, recite a process that, under its broadest reasonable interpretation, covers fundamental economic principles or practices and commercial and legal interactions (certain methods of organizing human activity), but for the recitation of generic computer components. The claims as a whole recite a method of organizing human activity. The claimed invention allows for determining whether a user is in an insecure environment while transacting, which is a fundamental economic principle or practice of mitigating risk and a commercial and legal interaction including sales activities or behaviors. The mere nominal recitation of a digital terminal does not take the claim out of the methods of organizing human activity grouping. Thus, the claims recite an abstract idea.

Under Step 2A, Prong Two of the Patent Subject Matter Eligibility analysis (see MPEP 2106.04), the judicial exception is not integrated into a practical application.
In particular, the additional elements of a digital terminal; the digital terminal comprising at least one processor and at least one non-transitory computer readable medium comprising instructions stored thereon which when executed by the processor configure the digital terminal to perform claim functions; a non-transitory computer readable medium comprising a computer program stored therein comprising computer code instructions for implementing a method; an image pickup device connected to the terminal; multimedia data; sensors of the digital terminal; and an electric radio environment of the digital terminal are recited at a high level of generality (i.e., as a generic processor performing the generic computer functions of receiving sensitive data, initializing a transaction, obtaining multimedia data of the environment of the user, analyzing data, and modifying implementation of the transaction), such that they amount to no more than mere instructions to apply the exception using generic computer components (see MPEP 2106.05(f)). Accordingly, the combination of the additional elements does not integrate the abstract idea into a practical application because it does not impose any meaningful limits on practicing the abstract idea. The claims are directed to an abstract idea.

Under Step 2B of the Patent Subject Matter Eligibility analysis (see MPEP 2106.05), the claims do not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above with respect to integration of the abstract idea into a practical application, the additional elements in the claims amount to no more than mere instructions to apply the exception using generic computer components. Mere instructions to apply an exception using generic computer components cannot provide an inventive concept. The claims are not patent eligible.
The dependent claims have been given the full two-part analysis, including analyzing the additional limitations both individually and in combination. The dependent claims, when analyzed both individually and in combination, are also held to be patent ineligible under 35 U.S.C. 101 for the same reasoning as above, and the additional recited limitations fail to establish that the claims are not directed to an abstract idea. Dependent claim 3 simply further describes the technological environment. Dependent claims 2 and 4-10 simply help to define the abstract idea. The additional limitations of the dependent claims, when considered individually and as an ordered combination, do not amount to significantly more than the abstract idea. Viewing the claim limitations as an ordered combination does not add anything further than looking at the claim limitations individually. When viewed either individually or as an ordered combination, the additional limitations do not amount to a claim as a whole that is significantly more than the abstract idea. Accordingly, claims 1-2 and 4-12 are ineligible.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C.
103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claims 1-2, 4-5, 8, and 10-12 are rejected under 35 U.S.C. 103 as being unpatentable over US 20070150827 A1 (“Singh”) in view of US 20210303717 A1 (“Neves”).

Regarding claim 1, Singh discloses a method for processing a transaction including a provision, by a user, of sensitive data on a digital terminal, wherein the method is implemented by the digital terminal and comprises (see at least FIG. 3A-3B. Accessing sensitive data, see at least [0019] and [0037]. Login transaction, see at least [0023] and [0004]. Entry of username and password.): initializing, by the digital terminal, the transaction requiring entry of the sensitive data (User may log into an application on a device, see at least [0023] and [0004], and see at least [0016].
Logging in requires user to enter username and password, see at least [0039].); obtaining, after initialization, through an image pickup device connected to the terminal, multimedia data representative of an environment of the user (Identifying a person other than a user of user interface with respect to information on user interface. In one embodiment, image capture device may capture images of one or more persons. The captured images may be communicated to viewing position identifier. Viewing position identifier may include a person identifier function for identifying one or more persons in the captured images. For example, person may be a user of user interface, and person may be a person other than a user of user interface or an onlooker of information shown on user interface. Image capture device may capture images of persons and communicate the captured images to function. See at least [0022]. See also FIG. 3A, steps 300-306.); analyzing the multimedia data previously obtained (Viewing position identifier may include a person identifier function for identifying one or more persons in the captured images. For example, person may be a user of user interface, and person may be a person other than a user of user interface or an onlooker of information shown on user interface. Image capture device may capture images of persons and communicate the captured images to function. Based on the images, function may determine the positions of persons with respect to user interface. See at least [0022]. See also FIG. 3A, steps 300-306.); and modifying an implementation context of the transaction in response to the analyzing of the multimedia data delivering a result representative of an unsecured environment (Function may determine whether to perform an action or not to protect information on interface from the identified person.
The performance of an action may include obfuscating or reformatting information shown on user interface such that the information is more difficult for a person to perceive. For example, a display screen may be blurred in order to make text information more difficult to perceive. In yet another example, a display screen may be darkened. In another example, a display screen may be turned off in order to prevent an onlooker from perceiving information. In yet another example, colors on a display screen may be inversed. In another example, a zoom level of the information, be it text or images, may be decreased to prevent an onlooker from perceiving information. See at least [0047]-[0048]. See also FIG. 3B, steps 320-322.), wherein: the obtaining comprises obtaining, from sensors of the digital terminal, at least one data representative of an environment of the digital terminal (Identifying a person other than a user of user interface with respect to information on user interface. In one embodiment, image capture device may capture images of one or more persons. The captured images may be communicated to viewing position identifier. Viewing position identifier may include a person identifier function for identifying one or more persons in the captured images. For example, person may be a user of user interface, and person may be a person other than a user of user interface or an onlooker of information shown on user interface. Image capture device may capture images of persons and communicate the captured images to function. See at least [0022]. See also FIG. 3A, steps 300-306.); and the analyzing comprises, in response to the data representative of the environment being not in compliance with an expected value, providing the result representative of an unsecured environment (Viewing position identifier may include a person identifier function for identifying one or more persons in the captured images.
For example, person may be a user of user interface, and person may be a person other than a user of user interface or an onlooker of information shown on user interface. Image capture device may capture images of persons and communicate the captured images to function. Based on the images, function may determine the positions of persons with respect to user interface. See at least [0022]. See also FIG. 3A, steps 300-306. Function may determine whether to perform an action or not to protect information on interface from the identified person. The performance of an action may include obfuscating or reformatting information shown on user interface such that the information is more difficult for a person to perceive. For example, a display screen may be blurred in order to make text information more difficult to perceive. In yet another example, a display screen may be darkened. In another example, a display screen may be turned off in order to prevent an onlooker from perceiving information. In yet another example, colors on a display screen may be inversed. In another example, a zoom level of the information, be it text or images, may be decreased to prevent an onlooker from perceiving information. See at least [0047]-[0048]. See also FIG. 3B, steps 320-322.).

While Singh discloses an environment, Singh does not expressly disclose an electric radio environment. However, Neves discloses an electric radio environment (For example, if the software is transmitted from a website, server, or other remote source using a coaxial cable, fiber optic cable, twisted pair, digital subscriber line (DSL), or wireless technologies such as infrared, radio, and microwave, then the coaxial cable, fiber optic cable, twisted pair, DSL, or wireless technologies such as infrared, radio, and microwave are included in the definition of medium.
Disk and disc, as used herein, include CD, laser disc, optical disc, digital versatile disc (DVD), floppy disk and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. See at least [0119].). From the teaching of Neves, it would have been obvious to one having ordinary skill in the art before the effective filing date to modify the environment of Singh to be a radio environment, as taught by Neves, in order to prevent unintentional disclosure of data (see Neves at least at [0025]), and in order to improve protection of private or sensitive data (see Neves at least at [0002]-[0004]).

Regarding claim 2, the combination of Singh and Neves discloses the limitations of claim 1, as discussed above, and Singh further discloses the analyzing comprises searching, within the multimedia data, a set of at least one face (identifying a person other than a user of user interface. See at least [0022]. Analyzing face, see at least [0042].); in response to a number of faces being equal to one, searching, within the detected face, an orientation of the eyes of the detected face (identifying a person other than a user of user interface. See at least [0022]. Analyzing face, see at least [0042]. Function may detect the position of the eyes of persons in captured images. Based on the determined position of the eyes with respect to interface, function may determine whether a person is a user or a potential onlooker. For example, if the eyes are determined to be within a predetermined distance and/or positioned in front of interface, the image of the person corresponding with the eyes may be identified as a user. Images associated with other eyes may be identified as eyes corresponding with persons other than a user or as a potential onlooker. See at least [0028].
See also [0030].); and providing the result representative of an insecure environment (Function may determine whether to perform an action or not to protect information on interface from the identified person. The performance of an action may include obfuscating or reformatting information shown on user interface such that the information is more difficult for a person to perceive. For example, a display screen may be blurred in order to make text information more difficult to perceive. In yet another example, a display screen may be darkened. In another example, a display screen may be turned off in order to prevent an onlooker from perceiving information. In yet another example, colors on a display screen may be inversed. In another example, a zoom level of the information, be it text or images, may be decreased to prevent an onlooker from perceiving information. See at least [0047]-[0048]. See also FIG. 3B, steps 320-322.). While Singh discloses providing the result, Singh does not expressly disclose providing in response to the orientation of the eyes of the detected face indicating that the user is not looking at a screen of the terminal. However, Neves discloses providing in response to the orientation of the eyes of the detected face indicating that the user is not looking at a screen of the terminal (To prevent unintentional disclosure of data, techniques described herein may support concealment of data during a live data viewing session. Concealment may be triggered based on events indicating that a disclosure may occur. Such an event may include detecting that the user is not actively viewing the data. See at least [0025]. See also [0039], describing monitoring eye direction).
From the teaching of Neves, it would have been obvious to one having ordinary skill in the art before the effective filing date to modify the providing of Singh to be in response to the orientation of the eyes of the detected face indicating that the user is not looking at a screen of the terminal, as taught by Neves, in order to prevent unintentional disclosure of data (see Neves at least at [0025]), and in order to improve protection of private or sensitive data (see Neves at least at [0002]-[0004]).

Regarding claim 4, the combination of Singh and Neves discloses the limitations of claim 1, as discussed above, and Singh further discloses the analyzing comprises searching, within the multimedia data, a set of at least one face (image capture device 110 may capture images of one or more persons. The captured images may be communicated to viewing position identifier 112. Viewing position identifier 112 may include a person identifier function 116 for identifying one or more persons in the captured images. For example, person 118 may be a user of user interface 102, and person 108 may be a person other than a user of user interface 102 or an onlooker of information shown on user interface 102. Image capture device 110 may capture images of persons 108 and 118 and communicate the captured images to function 116. Based on the images, function 116 may determine the positions of persons 108 and 118 with respect to user interface 102. Further, function 116 can compare the determined positions and identify the person closest to user interface 102 as being a user of user interface 102 based on an assumption that the closest person to a user interface is the user of the user interface. In this example, person 118 may be determined to be closer to user interface 102 than person 108.
Therefore, person 118 may be identified as the user of user interface 102, and person 108 may be identified as a person other than the user or as a potential onlooker to information on user interface 102. See at least [0022].); in response to a number of faces being greater than or equal to two, providing the result representative of an insecure environment (The performance of an action may include obfuscating or reformatting information shown on user interface such that the information is more difficult for a person to perceive. For example, a display screen may be blurred in order to make text information more difficult to perceive. In yet another example, a display screen may be darkened. In another example, a display screen may be turned off in order to prevent an onlooker from perceiving information. In yet another example, colors on a display screen may be inversed. In another example, a zoom level of the information, be it text or images, may be decreased to prevent an onlooker from perceiving information. See at least [0048].).

Regarding claim 5, the combination of Singh and Neves discloses the limitations of claim 2, as discussed above, and Singh further discloses the analyzing further comprises: in response to the data representative of the environment being in compliance with said expected value, searching, within the multimedia data, a set of at least one face (Function may detect the position of the eyes of persons in captured images. Based on the determined position of the eyes with respect to interface, function may determine whether a person is a user or a potential onlooker. Function may determine that a person is an authorized user based on a person's face in a captured image. For example, function may be able to detect faces in captured images and discriminate between different faces. See at least [0028]-[0029].). While Singh discloses an environment, Singh does not expressly disclose an electric radio environment.
However, Neves discloses an electric radio environment (For example, if the software is transmitted from a website, server, or other remote source using a coaxial cable, fiber optic cable, twisted pair, digital subscriber line (DSL), or wireless technologies such as infrared, radio, and microwave, then the coaxial cable, fiber optic cable, twisted pair, DSL, or wireless technologies such as infrared, radio, and microwave are included in the definition of medium. Disk and disc, as used herein, include CD, laser disc, optical disc, digital versatile disc (DVD), floppy disk and Blu-ray disc where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. See at least [0119].).

From the teaching of Neves, it would have been obvious to one having ordinary skill in the art before the effective filing date to modify the environment of Singh to be a radio environment, as taught by Neves, in order to prevent unintentional disclosure of data (see Neves at least at [0025]), and in order to improve protection of private or sensitive data (see Neves at least at [0002]-[0004]).

Regarding claim 8, the combination of Singh and Neves discloses the limitations of claim 1, as discussed above, and Singh further discloses the modifying an implementation context of the transaction comprises: displaying, to the user, a warning message relating to the detection of the insecure environment; alerting a terminal manager (The performance of an action may include alerting a user of user interface to the presence of a potential onlooker. An alert may include indicating the position of a potential onlooker to a user of user interface. For example, output interface may display text indicating a potential onlooker, such as “Potential onlooker located 4 feet behind you, and 45 degrees to your left”. In another example, output interface may display an image indicating the potential onlooker.
In the displayed image, the onlooker can be indicated by a flag, circle, or otherwise differentiated in the image. See at least [0052]-[0053].); hiding at least one information displayed on a screen of the terminal; and/or blocking an entry area of the sensitive data (The performance of an action may include obfuscating or reformatting information shown on user interface such that the information is more difficult for a person to perceive. For example, a display screen may be blurred in order to make text information more difficult to perceive. In yet another example, a display screen may be darkened. In another example, a display screen may be turned off in order to prevent an onlooker from perceiving information. In yet another example, colors on a display screen may be inversed. In another example, a zoom level of the information, be it text or images, may be decreased to prevent an onlooker from perceiving information. See at least [0048].).

Regarding claim 10, the combination of Singh and Neves discloses the limitations of claim 8, as discussed above, and Singh further discloses subsequent to the displaying of the warning message relating to the detection of the insecure environment, the method comprises receiving, by the digital terminal, data representative of an acceptance, by the user, of a continuation of entry of the sensitive data (Once alerted to a potential onlooker, the user may act to prevent the potential onlooker from perceiving the information. For example, the user may stop entering information via input interface. In another example, the user may stop information from being displayed on output interface. In yet another example, the user may reposition user interface such that the potential onlooker cannot perceive the information. See at least [0054]. The Examiner interprets a user repositioning as representative of acceptance by the user of a continuation of entry of the sensitive data.).
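The face-count and eye-orientation logic the rejection attributes to claims 2 and 4 can be sketched as follows. This is a hypothetical illustration for the reader, not code from Singh, Neves, or the application as filed; the function name and its inputs are invented here.

```python
from enum import Enum, auto

class Result(Enum):
    SECURE = auto()
    INSECURE = auto()

def analyze_environment(num_faces: int, user_looking_at_screen: bool) -> Result:
    """Return a result representative of the environment's security.

    num_faces: faces found by searching the multimedia data.
    user_looking_at_screen: derived from the detected eye orientation.
    """
    # Claim 4 as characterized above: two or more detected faces
    # indicate a potential onlooker, so the environment is insecure.
    if num_faces >= 2:
        return Result.INSECURE
    # Claim 2 as characterized above: a single detected face whose eye
    # orientation indicates the user is not looking at the terminal
    # screen also yields a result representative of an insecure environment.
    if num_faces == 1 and not user_looking_at_screen:
        return Result.INSECURE
    return Result.SECURE
```

An insecure result would then trigger one of the protective actions Singh is cited for (blurring, darkening, or blanking the display, inverting colors, or decreasing zoom).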
Claim 11 recites limitations similar to those found in claim 1 above, and is therefore rejected under the same art and rationale. In addition, Singh discloses a digital terminal adapted for processing a transaction comprising a provision, by the user, of sensitive data on the digital terminal, wherein the digital terminal comprises: at least one processor; and at least one non-transitory computer readable medium comprising instructions stored thereon which when executed by the at least one processor configure the digital terminal to perform the claimed functions (see at least [0008] and [0016].).

Claim 12 recites limitations similar to those found in claim 1 above, and is therefore rejected under the same art and rationale. In addition, Singh discloses a non-transitory computer readable medium comprising a computer program product stored therein comprising program code instructions for implementing a method for processing a transaction including the claimed functions (see at least [0008] and [0016].).

Claims 6-7 and 9 are rejected under 35 U.S.C. 103 as being unpatentable over Singh in view of Neves, and in further view of US 20190005311 A1 (“Komatsu”).

Regarding claim 6, the combination of Singh and Neves discloses the limitations of claim 1, as discussed above, and Singh further discloses the modifying an implementation context of the transaction comprises, in response to at least two faces being detected within the multimedia data, calculating biometric characteristics of at least one among the at least two detected faces (Identifying a user and an onlooker, see at least [0022]. Detecting eye movement of a potential onlooker, see at least [0034]. See also [0036].). Singh does not expressly disclose updating, within the terminal itself, or a device connected to the terminal, a database of detected face characteristics. However, Komatsu discloses updating, within the terminal itself, or a device connected to the terminal, a database of detected face characteristics (Updating facial image database, see at least [0115].
See also [0151]-[0152].). From the teaching of Komatsu, it would have been obvious to one having ordinary skill in the art before the effective filing date to modify Singh to update within the terminal a database of detected face characteristics, as taught by Komatsu, in order to improve user authentication including improving speed and accuracy of user authentication (see Komatsu at least at [0003]-[0010]).

Regarding claim 7, the combination of Singh, Neves, and Komatsu discloses the limitations of claim 6, as discussed above, and Singh further discloses the updating the database of detected face characteristics comprises identifying composite identifiers of facial characteristics corresponding to the characteristics calculated for a current face (Analyzing image data to determine a person's eye or facial orientation to determine if the user is an onlooker, see at least [0034] and [0036]. Using eye/facial data to determine friendliness of a user, see at least [0040]-[0042].), and, when such data are already present in the database for the current face, incrementing a counter relating to these characteristics for this current face (The Examiner notes that this step is optional and is therefore given little patentable weight.). While Singh discloses identifying identifiers, Singh does not expressly disclose identifying via a search, within the database, for the identifiers. However, Komatsu discloses identifying via a search, within the database, for the identifiers (searching a facial image database, see at least [0125]. See also [0196].). From the teaching of Komatsu, it would have been obvious to one having ordinary skill in the art before the effective filing date to modify the identifying of Singh to be performed via a search within the database, as taught by Komatsu, in order to improve user authentication including improving speed and accuracy of user authentication (see Komatsu at least at [0003]-[0010]).
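The claim 6-7 database behavior described above (searching for composite identifiers of facial characteristics and incrementing a per-face counter when they are already present, with claim 9 tying an alert to a predetermined ceiling) can be sketched as follows. The class, method names, and default ceiling are hypothetical illustrations, not drawn from Singh, Neves, or Komatsu.

```python
from collections import defaultdict

class FaceCharacteristicsDatabase:
    """Hypothetical sketch of the claimed database of detected face
    characteristics, keyed by a composite identifier."""

    def __init__(self) -> None:
        self._counters: defaultdict[str, int] = defaultdict(int)

    def record(self, composite_id: str) -> int:
        # Search for the identifier in the database; if characteristics
        # are already present for this face, increment their counter
        # (claim 7); otherwise insert them with a count of one.
        self._counters[composite_id] += 1
        return self._counters[composite_id]

    def exceeds_ceiling(self, composite_id: str, ceiling: int = 3) -> bool:
        # Claim 9 as characterized: alert the terminal manager when the
        # counter for a current face exceeds a predetermined ceiling.
        return self._counters[composite_id] > ceiling
```

In this reading, a repeatedly detected bystander accumulates detections until the ceiling check fires, distinguishing a passerby from a persistent onlooker.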
Regarding claim 9, the combination of Singh, Neves, and Komatsu discloses the limitations of claim 7, as discussed above, and Singh further discloses the modifying an implementation context of the transaction comprises: displaying, to the user, a warning message relating to the detection of the insecure environment; alerting a terminal manager (The performance of an action may include alerting a user of user interface to the presence of a potential onlooker. In one example, an alert may include blinking a light emitting diode (LED) or other suitable light emitter on system to indicate to the user of the presence of a potential onlooker. In another example, an alert may include sounding an alarm. In yet another example, an alert may include indicating the position of a potential onlooker to a user of user interface. For example, output interface may display text indicating a potential onlooker, such as “Potential onlooker located 4 feet behind you, and 45 degrees to your left”. In another example, output interface may display an image indicating the potential onlooker. In the displayed image, the onlooker can be indicated by a flag, circle, or otherwise differentiated in the image. Once alerted to a potential onlooker, the user may act to prevent the potential onlooker from perceiving the information. See at least [0052]-[0054].); hiding at least one information displayed on a screen of the terminal; and/or blocking an entry area of the sensitive data (Function may determine whether to perform an action or not to protect information on interface from the identified person. The performance of an action may include obfuscating or reformatting information shown on user interface such that the information is more difficult for a person to perceive. For example, a display screen may be blurred in order to make text information more difficult to perceive. In yet another example, a display screen may be darkened.
In another example, a display screen may be turned off in order to prevent an onlooker from perceiving information. In yet another example, colors on a display screen may be inversed. In another example, a zoom level of the information, be it text or images, may be decreased to prevent an onlooker from perceiving information. See at least [0047]-[0048]. See also FIG. 3B, step 320-322.); and the modifying the implementation context of the transaction comprises: when the data representative of the electric radio environment is not in compliance with an expected value, or when the counter associated with the characteristics calculated for a current face exceeds a predetermined ceiling, performing the alerting of the terminal manager (The Examiner notes that this step is optional and is therefore given little patentable weight.); when the orientation of the eyes of the detected face indicates that the user is not looking at the terminal screen, or when the number of faces is greater than or equal to two, performing the hiding at least one information displayed on the terminal screen (The Examiner notes that this step is optional and is therefore given little patentable weight.); and when the number of faces is greater than or equal to two, performing the displaying, to the user, a warning message relating to the detection of the insecure environment and/or the blocking of an entry area of the sensitive data (The Examiner notes that this step is optional and is therefore given little patentable weight.).

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
US 20160098093 A1 (“Cheon”) discloses receiving eye gaze information of a user and user gesture information from an external apparatus; a display unit configured to display a preset screen; and a controlling unit configured to control the display unit so that a preset object is displayed on the preset screen if it is determined that an eye gaze of the user is directed toward the preset screen using the received eye gaze information and a user gesture corresponding to the received gesture information is sensed.

US 20190239069 A1 (“Toyota”) discloses identifying the user by a given authentication method of a plurality of authentication methods, the authentication methods being associated with actions allowed to users; receiving an instruction to execute an action from the user; making a determination whether the instruction to execute the action is allowed based on the given authentication method; and executing the action or restricting the action from being executed based on a result of the determination.

THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to RAVEN E YONO whose telephone number is (313) 446-6606.
The examiner can normally be reached Monday through Friday, 8:00 AM to 5:00 PM EST.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Bennett M Sigmond, can be reached at (303) 297-4411. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/RAVEN E YONO/
Primary Examiner, Art Unit 3694

Prosecution Timeline

Jul 12, 2024
Application Filed
Sep 16, 2025
Non-Final Rejection — §101, §103
Jan 20, 2026
Response Filed
Feb 05, 2026
Final Rejection — §101, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12548022
SYSTEMS AND METHODS FOR EXECUTING REAL-TIME ELECTRONIC TRANSACTIONS USING API CALLS
2y 5m to grant Granted Feb 10, 2026
Patent 12518276
SYSTEMS AND METHODS FOR SECURE TRANSACTION REVERSAL
2y 5m to grant Granted Jan 06, 2026
Patent 12511637
METHOD, APPARATUS, AND DEVICE FOR ACCESSING AGGREGATION CODE PAYMENT PAGE, AND MEDIUM
2y 5m to grant Granted Dec 30, 2025
Patent 12489647
SECURELY PROCESSING A CONTINGENT ACTION TOKEN
2y 5m to grant Granted Dec 02, 2025
Patent 12481992
AUTHENTICATING A TRANSACTION
2y 5m to grant Granted Nov 25, 2025
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

3-4
Expected OA Rounds
39%
Grant Probability
72%
With Interview (+32.5%)
2y 6m
Median Time to Grant
Moderate
PTA Risk
Based on 175 resolved cases by this examiner. Grant probability derived from career allow rate.
