Prosecution Insights
Last updated: April 19, 2026
Application No. 17/281,444

File Transfer Method and Electronic Device

Status: Non-Final OA (§103)

Filed: Mar 30, 2021
Examiner: SHALU, ZELALEM W
Art Unit: 2145
Tech Center: 2100 — Computer Architecture & Software
Assignee: Huawei Technologies Co., Ltd.
OA Round: 7 (Non-Final)

Grant Probability: 29% (At Risk); 48% with interview
Predicted OA Rounds: 7-8
Predicted Time to Grant: 3y 2m

Examiner Intelligence

Career Allow Rate: 29% (31 granted / 108 resolved; -26.3% vs Tech Center average)
Interview Lift: +19.0% (with vs. without interview, among resolved cases with an interview)
Typical Timeline: 3y 2m average prosecution
Career History: 142 total applications across all art units; 34 currently pending

Statute-Specific Performance

§101: 14.3% (-25.7% vs TC avg)
§103: 63.4% (+23.4% vs TC avg)
§102: 8.1% (-31.9% vs TC avg)
§112: 10.8% (-29.2% vs TC avg)

Deltas are versus the Tech Center average estimate • Based on career data from 108 resolved cases
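As a cross-check on the dashboard arithmetic, the headline figures above can be reproduced from the raw counts. This is an illustrative sketch: the formulas (simple rounding; delta = examiner rate minus Tech Center average) are assumptions about the vendor's methodology, which the report does not document.

```python
# Reproduce the dashboard's headline figures from its raw counts.
# Assumed formulas (vendor methodology not stated in the report).
granted, resolved = 31, 108

career_allow_rate = granted / resolved * 100      # 28.7%
assert round(career_allow_rate) == 29             # matches "29% Career Allow Rate"

# "-26.3% vs TC avg" then implies an estimated Tech Center average of ~55%.
tc_avg_estimate = career_allow_rate - (-26.3)
print(f"allow rate {career_allow_rate:.1f}%, implied TC avg ~ {tc_avg_estimate:.1f}%")

# The "+19.0% Interview Lift" likewise matches the prediction panel:
# 48% with interview minus the 29% baseline is 19 points.
assert 48 - 29 == 19
```

Under these assumed formulas the displayed percentages are internally consistent, which suggests the panel values are all derived from the same 108-case resolved set.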

Office Action

§103
DETAILED ACTION

Notice of Pre-AIA or AIA Status
1. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
2. This action is in response to the amendment filed on 12/19/2025. Claims 1-5, 7, 20-24, 26, 41, 43-46, and 48-50 are pending in this application.

Applicant Response
3. In Applicant’s response dated 12/19/2025, Applicant amended claims 1-2, 20-21, 41, and 43 and argued against all objections and rejections previously set forth in the Office Action dated 08/26/2025.

Continued Examination under 37 CFR 1.114
4. A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 12/19/2025 has been entered.

Examiner Comments
5. In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C.
103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1-5, 7, 20-24, 26, 41, 43-46, and 48-50 are rejected under 35 U.S.C. 103 as being unpatentable over Forutanpour (Pub. No.: US 20110083111 A1, Pub. Date: 2011-04-07) in view of MOOSAVI (Pub. No.: US 20120220221 A1, Pub. Date: 2012-08-30) and further in view of Jain (Pub. No.: US 20140280706 A1, Pub. Date: 2014-09-18).

Regarding independent Claim 1, Forutanpour teaches a file transfer method (see Forutanpour: Fig.1, [0070], “a process flow diagram of a method 100 for implementing file sharing functionality in response to intuitive user gestures according to the various aspects.”), wherein the file transfer method comprises: displaying a first interface, wherein the first interface is a first graphical user interface of a first application (see Forutanpour: Fig.3, [0090], “A user may open a file, such as drawings by Leonardo da Vinci, and touch the touchscreen 308 using a finger 306 to activate the file sharing functionality.”, i.e., the drawing is the first graphical user interface of a first application); obtaining a BLUETOOTH address of a second device when a distance between a first near-field communication (NFC) chip of a first device and an electronic tag of the second device or a second NFC chip of the second device is less than a preset threshold (see Forutanpour: Fig.1, [0072], “the computing device may discover other computing devices located at relatively
close distances. For example, a computing device configured with a Bluetooth.RTM. transceiver may be configured to discover the presence of like-equipped computing devices located within about 100 meters, depending on the Bluetooth power class employed. Bluetooth.RTM. is a standard communications protocol primarily designed for low power consumption, short range (power-class-dependent: 1 meter, 10 meters, 100 meters) communications between devices equipped with low-cost transceiver microchips. Bluetooth.RTM. makes it possible for these devices to communicate with each other when they are within range.”), wherein the electronic tag comprises device information of the second device, and wherein the device information comprises the BLUETOOTH address or a tag identification (ID) for obtaining the BLUETOOTH address (see Forutanpour Fig.1, [0077], “At block 114 the computing device may identify targeted particular nearby computing device to which the user intends to send the selected file based upon information regarding the file transfer gesture, such as its direction, distance, speed, etc., the orientation of computing device, such as its compass heading and tilt angle, and the relative location of nearby computing devices.” … [0079], a targeted computing device may simply provide the requesting computing device with communication access data, such as Bluetooth identification, email address, or telephone number which can be used to complete the file transfer.”) transferring, [based on gesture direction and proximity distance of the second device], a first file corresponding to the first interface to the second device through a communication connection established based on the BLUETOOTH address (see Forutanpour Fig.11, [0082], “Once the intended recipient computing device is identified, and transfer of the file is verified and/or authorized, at block 118 the computing device may transmit the selected file to the targeted device using identified access data and 
communication network.”).

As shown above, Forutanpour teaches a file sharing mechanism between computing devices by activating a file sharing functionality, discovering nearby computing devices, establishing a wireless link with the discovered computing devices, determining locations of the discovered nearby computing devices, detecting a file sharing gesture, identifying a targeted computing device based on the detected file sharing gesture, and transmitting a file sharing message to the targeted computing device (see [0004]).

Forutanpour does not teach the system wherein: storing a plurality of preset scenarios each comprising scenario information, wherein the scenario information for each preset scenario comprises application identification information, wherein the application identification information for the preset scenario indicates an identifiable application or is null for non-identifiable applications; obtaining first information indicating that the first device is in a first scenario, wherein the first scenario is a first preset scenario from the preset scenarios and the first information matches the scenario information of the first preset scenario; receiving, from the second device, second information indicating the second device is in a second scenario, wherein the second scenario is a second preset scenario from the preset scenarios and the second information matches the scenario information of the second preset scenario, wherein the first preset scenario comprises first application identification information of a preset identifiable application, and wherein the second preset scenario comprises second application identification information of a non-identifiable application; transferring, based on a first comparison result indicating that the preset identifiable application has a higher priority than the non-identifiable application, a first file corresponding to the first interface to the second device through a communication connection established
based on the BLUETOOTH address; and receiving, from the second device via the communication connection, based on a second comparison result indicating that the preset identifiable application has a lower priority than the non-identifiable application, a second file corresponding to a second interface, wherein the second interface is a second graphical user interface displayed by the second device.

However, MOOSAVI teaches the file transfer method wherein: storing a plurality of preset scenarios each comprising scenario information (see MOOSAVI: Fig.4, [0031], illustrating the stored orientation states of the computing devices used to determine the direction of file sharing; for example, 57 - Transfer digital content data from Device A to Device B, 60 - Transfer digital content data between Device A and Device B, and 59 - Transfer digital content data from Device B to Device A, which indicate three different preset file transfer scenarios (Blocks 57, 60, and 59) that are stored), wherein the scenario information for each preset scenario comprises application identification information (see MOOSAVI: for example Fig.5, [0032], “the mobile device 31' is initially displaying a contact "business card" ("Business Card A") on the display 70' prior to contact with the stationary mobile device 41'”, i.e., the photo gallery application is identified as the application identification information that indicates a preset scenario), wherein the application identification information for the preset scenario indicates an identifiable application or is null for non-identifiable applications (see MOOSAVI: for example Fig.5, [0032], “the mobile device 31' is initially displaying a contact "business card" ("Business Card A") on the display 70' prior to contact with the stationary mobile device 41'”, i.e.
the Business Card A is an identifiable application and device B represents null for non-identifiable applications because no application is displayed); obtaining first information indicating that the first device is in a first scenario (see MOOSAVI: Fig.5, [0032], mobile device 31' is facing upwardly relative to a mobile device 41' and displaying a contact "business card" ("Business Card A")), wherein the first scenario is a first preset scenario from the preset scenarios (see MOOSAVI: Fig.4, [0031], preset scenario 57 - Transfer digital content data from Device A to Device B) and the first information matches the scenario information of the first preset scenario (see MOOSAVI: Fig.5, [0032], “mobile device 31' is initially displaying a contact "business card" ("Business Card A") on the display 70' prior to contact with the stationary mobile device 41'. That is, displaying a digital content item (e.g., contact, image, etc.) on the display may designate that item for transfer upon movement of the mobile device 31' into contact with the mobile device 41', although in other embodiments content may be pre-selected for transfer via other approaches.”); receiving, from the second device, second information indicating the second device is in a second scenario (see MOOSAVI: Fig.3, [0022], “beginning at Block 50, the controller 34 may be configured to communicate with the electronic device 41 via the NFC device 33 when in proximity therewith.
The controller 34 is further configured to determine an orientation of the mobile device 31, at Block 51, and to determine a direction of communication with respect to the electronic device 41 based upon the determined orientation.”, i.e., the second device (the mobile device 41) communicates orientation information indicating that its orientation is downward, which is considered the second scenario), wherein the second scenario is a second preset scenario from the preset scenarios and the second information matches the scenario information of the second preset scenario (see MOOSAVI: Fig.7, [0033], “In the case where both mobile devices 31', 41' are facing upwardly just prior to the bump (i.e., they bumped each other), then the controllers 33', 43' exchange respective digital content data, at Block 60'. An example implementation of this case is shown in FIGS. 7 and 8. As seen in FIG. 7, each of the mobile devices 31', 41' is displaying a business card ("Business Card A", "Business Card B") on its respective display 70', 80' and are schematically illustrated as facing upwardly relative to one another. After the bump occurs, the digital business cards are exchanged between the mobile devices 31', 41', as seen in FIG. 8”), wherein the first preset scenario comprises first application identification information of a preset identifiable application (see MOOSAVI: Fig.5, [0032], “The mobile device 31' is initially displaying a contact "business card" ("Business Card A") on the display 70' prior to contact with the stationary mobile device 41'.”, i.e., the business card is identifiable application information from the photo gallery application), and wherein the second preset scenario comprises second application identification information of a non-identifiable application (see MOOSAVI: Fig.5, [0025], the mobile device 31 (the transmitting device) is oriented generally upward, and the mobile device 41 (the receiving device) is oriented generally downward (a non-identifiable application).
See also MOOSAVI: Fig.4, [0025], describing that the mobile device 31 (the transmitting device) is oriented generally upward (a first priority of the first preset scenario), and the mobile device 41 (the receiving device) is oriented generally downward (a second priority of the second preset scenario); see also [0032], stating that mobile device 31' is initially displaying a contact "business card" ("Business Card A") on the display 70' (a first priority of the first preset scenario) while mobile device 41' displays no information and faces downward (a second priority of the second preset scenario)), transferring, [… ], the first file corresponding to the first interface to the second device through a communication connection established based on the BLUETOOTH address (see MOOSAVI: Fig.4, [0031], “the mobile device 31' has an upward orientation prior to the bump but the mobile device 41' does not, the controller 34' conveys its digital content data to the controller 44' via the NFC circuit 33', 43', at Blocks 56', 57'.”, i.e., the first device 31' orientation is up, indicating the first priority of the first scenario; the up orientation has the higher priority, resulting in the transfer of the file from the first device, i.e., the transfer of digital content data from Device A to Device B (device 31' to device 41')). See also Fig.6, [0032], stating “Once the bump occurs, the controller 34' transmits the business card to the controller 44', as seen in FIG. 6.
This transfer may be accomplished by the NFC circuits 33', 43'”) (see MOOSAVI: Fig.4, [0031], illustrating that the first mobile device's upward orientation has a higher priority than the second mobile device's downward orientation), and receiving, from the second device via the communication connection, based on a second comparison result indicating that the first priority is lower than the second priority, a second file corresponding to a second interface (see MOOSAVI: Fig.4, [0031], “the mobile device 31' does not have an upward orientation prior to the bump but the mobile device 41' does, then the controller 44' conveys its digital content data to the controller 34' via the NFC circuits 33', 43', at Block 59'.”, i.e., 59 - Transfer digital content data from Device B to Device A: the first device 31' orientation is down, indicating that the first scenario's priority is lower than the second scenario's, resulting in the transfer of the file from the second device 41' to the first device 31'), wherein the second interface is a second graphical user interface displayed by the second device (see MOOSAVI: [0033], “each of the mobile devices 31', 41' is displaying a business card ("Business Card A", "Business Card B") on its respective display 70', 80' and are schematically illustrated as facing upwardly relative to one another. After the bump occurs, the digital business cards are exchanged between the mobile devices 31', 41', as seen in FIG. 8.”).

Because Forutanpour and MOOSAVI are in the same or a similar field of endeavor of file sharing between computing devices, it would have been obvious to a person of ordinary skill in the art, before the effective filing date of the invention, to modify the teaching of Forutanpour to include a system that determines different scenarios between different devices and identifies the file transfer priority based on the devices' scenarios or conditions to transfer files between devices, as taught by MOOSAVI.
One would have been motivated to make such a combination in order to provide automatic file transmission that is fast, efficient, and conveniently completed using an efficient communication medium, improving the user experience.

As shown above, Forutanpour teaches a device-to-device file transfer or file sharing mechanism using a gesture-based user interface that applies drag-and-drop, flick, or other gestures to initiate a file transfer between devices. MOOSAVI discloses a mechanism for directional file sharing, a file transfer decision-making system based on mobile device orientation cues and NFC communication. MOOSAVI teaches transferring content based on identifiable and non-identifiable digital content data, such as contacts (e.g., digital business cards), audio, video, images, addresses or appointments, documents, applications, financial information, security information, etc., which can be transferred between devices based upon the respective determined orientations. The mobile device is configured to determine the orientation of the mobile communications device and the direction of communication prior to establishing NFC communications with the at least one electronic device.

Forutanpour and MOOSAVI do not teach the system wherein: transferring, based on a first comparison result indicating that the preset identifiable application has a higher priority than the non-identifiable application, a first file corresponding to the first interface to the second device through a communication connection established based on the BLUETOOTH address.
However, Jain teaches the file transfer system wherein: transferring (see Jain: Fig.4, [0040], “If it is determined in block 416 that the content should be sent immediately, the content is sent over whatever network may be available in block 418 and the process ends.”), based on a first comparison result indicating that the preset identifiable application has a higher priority than the non-identifiable application (see Jain: Fig.2, [0027], “a capability matrix that can be maintained by, be accessible to, and communicated by the devices of FIG. 1. In an embodiment a capability matrix 202 (i.e. preset identifiable application) can be maintained in each device in storage elements 113 and 143, or can be located in the database 116 and may be accessible to a sending device 102 and a receiving device 132. A capability matrix may include an identification table 202 that includes information relating to addresses and capabilities of particular devices”), a first file corresponding to the first interface to the second device through a communication connection established based on the BLUETOOTH address (see Jain: Fig.1, [0016], “the system and method for prioritizing file transfer can be implemented in a mobile device that operates over RF frequencies referred to as the "Bluetooth" communication band, RF frequencies identified by the IEEE 802.11b/g/n standard, in a mobile device that operates over cellular communication frequencies, and can be implemented in mobile devices that operate on any radio frequency on any type of network.”). In other words, Jain discloses a file transfer system that stores preset file transfer priority settings, evaluates conditions associated with the priority settings, and transmits a file when the priority condition is satisfied.
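Jain's priority-gated transfer, as characterized in the paragraph above, reduces to a rule lookup before sending. The sketch below is hypothetical: the setting names and rule values are invented for illustration and do not appear in Jain.

```python
# Hypothetical sketch of a priority-gated send in the style the Office Action
# attributes to Jain: consult stored transfer-priority settings, then send
# only when the associated condition is satisfied. All names are illustrative.
PRIORITY_SETTINGS = {
    "contact_card": "immediate",   # small content: send over any network
    "video": "wifi_only",          # bulky content: defer until Wi-Fi
}

def should_send_now(content_type: str, network: str) -> bool:
    rule = PRIORITY_SETTINGS.get(content_type, "immediate")
    if rule == "immediate":
        return True                # send over whatever network is available
    return network == "wifi"      # otherwise wait for the preferred medium

print(should_send_now("contact_card", "bluetooth"))  # True
print(should_send_now("video", "bluetooth"))         # False
```

The point of the mapping is only that a stored priority setting, not the user, decides whether the transfer proceeds immediately.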
Because Forutanpour, MOOSAVI, and Jain are in the same or a similar field of endeavor of file sharing between computing devices, it would have been obvious to a person of ordinary skill in the art, before the effective filing date of the invention, to modify the teaching of Forutanpour to include priority-based file transfer control that stores a preset file transfer priority setting based on a first comparison result indicating that the first priority is higher than the second priority, as taught by Jain. One would have been motivated to make such a combination in order to improve the user experience by preventing unintended transfers, reducing user interaction, and providing automatic file transmission that is fast, efficient, and conveniently completed using an efficient communication medium.

Regarding Claim 2, Forutanpour, MOOSAVI, and Jain teach all the limitations of Claim 1. MOOSAVI further teaches the method further comprising: displaying, based on a third comparison result indicating that the preset identifiable application has a same priority as the non-identifiable application, a third interface, wherein the third interface prompts a user to select a file transfer direction between the first device and the second device, and wherein the third interface comprises a first direction option for the first device to transfer the first file to the second device, and a second direction option for the second device to transfer the second file to the first device (see MOOSAVI: Fig.4, [0033], “In the case where both mobile devices 31', 41' are facing upwardly just prior to the bump (i.e., they bumped each other), then the controllers 33', 43' exchange respective digital content data, at Block 60'.”); receiving a first operation from a user in the third interface, wherein the first operation indicates a first selection of the first direction option (see MOOSAVI: Fig.4, [0033], “An example implementation of this case is shown in
FIGS. 7 and 8. As seen in FIG. 7, each of the mobile devices 31', 41' is displaying a business card ("Business Card A", "Business Card B") on its respective display 70', 80' and are schematically illustrated as facing upwardly relative to one another. After the bump occurs, the digital business cards are exchanged between the mobile devices 31', 41', as seen in FIG. 8.”); and transferring, in response to the first operation, the first file to the second device via the communication connection (see MOOSAVI: Fig.4, [0033], “In the case where both mobile devices 31', 41' are facing upwardly just prior to the bump (i.e., they bumped each other), then the controllers 33', 43' exchange respective digital content data, at Block 60'.”). It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the invention, to modify the teaching of Forutanpour to include a system that determines different scenarios between different devices and identifies the file transfer priority based on the devices' scenarios or conditions to transfer files between devices, as taught by MOOSAVI. One would have been motivated to make such a combination in order to provide automatic file transmission that is fast, efficient, and conveniently completed using an efficient communication medium, improving the user experience.

Regarding Claim 3, Forutanpour, MOOSAVI, and Jain teach all the limitations of Claim 2. Forutanpour further teaches the method further comprising: receiving a second operation from the user in the third interface, wherein the second operation indicates the second selection of the first direction option (see Forutanpour: Fig.28, [0120], “the requesting computing device 10a may be configured to identify and request file sharing simultaneously from many different targeted computing devices 10b-10d by recognizing a closing hand gesture.
In this aspect, a user may place all fingers of one hand near the outside borders on the touchscreen 308 and bring the fingers toward the launch pad 304. The requesting computing device 10a may be configured to recognize such a closing hand gesture, and in response transmit a file sharing request to all linked computing devices 10b-10d. Upon receiving such a file sharing request, the other computing devices 10b-10d may transmit files to the requesting computing device 10a according to access data included in the file sharing request”) sending, a notification message to the second device in response to the second operation, wherein the notification message indicates the second device to transfer the second file to the first device (see Forutanpour: Fig.30, [0122], “For example, if the picture and name of the owner of the targeted computing device is available, such as in the contact database or transmitted by the targeted computing device, the requesting computing device 10a may display a verification prompt pane 3300 including the profile picture and the name of the owner of the targeted computing device 10 along with a prompt for the user to verify the transmission of the requests to the targeted computing device 10. For example, the user may press a "Yes" or "No" soft key to verify the transmission of the request message.”), and receiving, the second file from the second device via the communication connection ( see Forutanpour: Fig.30, [0122], “if the picture and name of the owner of the targeted computing device is available, such as in the contact database or transmitted by the targeted computing device, the requesting computing device 10a may display a verification prompt pane 3300 including the profile picture and the name of the owner of the targeted computing device 10 along with a prompt for the user to verify the transmission of the requests to the targeted computing device 10. 
For example, the user may press a "Yes" or "No" soft key to verify the transmission of the request message.”).

Regarding Claim 4, Forutanpour, MOOSAVI, and Jain teach all the limitations of Claim 1. MOOSAVI further teaches the method further comprising: wherein the preset scenarios comprise the first scenario and the second scenario (see MOOSAVI: for example Fig.5, [0032], “the mobile device 31' is initially displaying a contact "business card" ("Business Card A") on the display 70' prior to contact with the stationary mobile device 41'” (i.e., the first application information is, for example, a picture of the business card in the photo gallery application, and the first user operation information is the moment at which the business card is displayed); “That is, displaying a digital content item (e.g., contact, image, etc.) on the display may designate that item for transfer upon movement of the mobile device 31' into contact with the mobile device 41', although in other embodiments content may be pre-selected for transfer via other approaches, as discussed above. Once the bump occurs, the controller 34' transmits the business card to the controller 44', as seen in FIG. 6. This transfer may be accomplished by the NFC circuits 33', 43', or in some embodiments the transfer may be performed via a separate communications path, such as via the wireless transceivers 36', 46' (e.g., Bluetooth, etc.).”; see also Fig.4, [0031], illustrating the preset scenarios). It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the invention, to modify the teaching of Forutanpour to include a system that determines different scenarios between different devices and identifies the file transfer priority based on the devices' scenarios or conditions to transfer files between devices, as taught by MOOSAVI.
One would have been motivated to make such a combination in order to provide automatic file transmission that is fast, efficient, and conveniently completed using an efficient communication medium, improving the user experience.

Regarding Claim 5, Forutanpour, MOOSAVI, and Jain teach all the limitations of Claim 4. Forutanpour further teaches the first application identification information (see for example Forutanpour: Fig.3-6, [0090], “A user may open a file, such as drawings by Leonardo da Vinci, and touch the touchscreen 308 using a finger 306 to activate the file sharing functionality.”), and wherein the file transfer method further comprises: obtaining third application identification information of the first application to which the first interface belongs (see Forutanpour: for example, Fig.3-6, [0090], “The user may be required to touch the touchscreen 308 for a set period of time and/or at a specific location to activate the file sharing functionality. For example, the user may be required to touch the touchscreen 308 (second application identification information) and hold the touch for at least 5 seconds before the file sharing function is activated. As a further example, the file sharing functionality may be activated by voice command, such as "share this file," similar to how other mobile device functions can be activated by speaking a particular command. Alternatively, the user may be required to touch a specific location on the touchscreen 308, such as the center of the screen, to activate the file sharing function.
In a further aspect, a certain amount of pressure applied to the touchscreen display may activate the file sharing functionality.”); and associating the first device with the first scenario when the third application identification information matches fourth application identification information in the first scenario (see Forutanpour: Fig.4A-4D, [0091], “The functions key 402 may be specifically designated for activating the file sharing functionality, or may be designed for multiple functions. If other functions are associated with the function key 402, a GUI menu 404 may be presented to allow the user to select a particular desired function, as illustrated in FIG. 4B. As illustrated in FIG. 4C, the user may choose the file sharing functionality by touching the word "Share" as it appears on the menu 404 using a finger 306. As illustrated in FIG. 4D, activation of the file sharing functionality may be identified by the display of a launch pad 304.”).

Regarding Claim 7, Forutanpour, MOOSAVI, and Jain teach all the limitations of Claim 4. MOOSAVI further teaches the file transfer method wherein the scenario information (see for example Fig.4, [0031], illustrating scenario information) comprises: the first user operation information (see MOOSAVI: Fig.4, [0029], “upon detection of the bump, at Block 55', the NFC circuits 33', 43' may begin communicating (e.g., they may be turned on or otherwise enabled for communication), at Block 51', and the controllers 34', 44' may advantageously determine whether their respective mobile devices 31', 41' are facing upward, for example, at the time of the bump, at Block 56'.
In the case where the NFC circuits 33', 43' are powered on or awakened from a sleep mode by the physical contact of the mobile devices 31', 41', then these circuits may optionally be turned off or returned to a power-saving or sleep mode once NFC communications are complete, at Block 62'.”), and wherein the file transfer method further comprises: obtaining first user operation information corresponding to the first interface, wherein the first user operation information comprises a first moment or a second moment, wherein the first moment is when the first device starts to display the first interface (see MOOSAVI: Fig.5, [0032], “upon detection of the bump, at Block 55', the NFC circuits 33', 43' may begin communicating (e.g., they may be turned on or otherwise enabled for communication), at Block 51', and the controllers 34', 44' may advantageously determine whether their respective mobile devices 31', 41' are facing upward, for example, at the time of the bump, at Block 56'. In the case where the NFC circuits 33', 43' are powered on or awakened from a sleep mode by the physical contact of the mobile devices 31', 41', then these circuits may optionally be turned off or returned to a power-saving or sleep mode once NFC communications are complete, at Block 62'.”), wherein the second moment is when the first device starts to display the first interface and either a first user operation or a second user operation (see MOOSAVI: Fig.4, [0032], “the mobile device 31' is schematically illustrated as facing upwardly relative to a mobile device 41' (see also FIG. 10).
In this example embodiment, the mobile device 31' further includes a touch screen display 70', input buttons 71', and an audio output transducer 75', all of which are carried by the housing 32' and coupled to the controller 34'”), wherein the first device receives the first user operation after the first device displays the first interface (see MOOSAVI: Fig.4, [0029], “upon detection of the bump, at Block 55', the NFC circuits 33', 43' may begin communicating (e.g., they may be turned on or otherwise enabled for communication), at Block 51', and the controllers 34', 44' may advantageously determine whether their respective mobile devices 31', 41' are facing upward, for example, at the time of the bump, at Block 56'.”), wherein the first device receives the second user operation within a third preset time period (see MOOSAVI: Fig.4, [0031], “In the case wherein the mobile device 31' has an upward orientation prior to the bump but the mobile device 41' does not, the controller 34' conveys its digital content data to the controller 44' via the NFC circuit 33', 43', at Blocks 56', 57'. 
In the reverse case, i.e., wherein the mobile device 31' does not have an upward orientation prior to the bump but the mobile device 41' does, then the controller 44' conveys its digital content data to the controller 34' via the NFC circuits 33', 43', at Block 59'.”), and wherein an end moment of the third preset time period is a current moment (see MOOSAVI: Fig.4, [0029], “In the case where the NFC circuits 33', 43' are powered on or awakened from a sleep mode by the physical contact of the mobile devices 31', 41', then these circuits may optionally be turned off or returned to a power-saving or sleep mode once NFC communications are complete, at Block 62'.”); and associating the first device with the first scenario when the first user operation information matches second user operation information in the first scenario (see MOOSAVI: Fig.4, [0031], “In the case wherein the mobile device 31' has an upward orientation prior to the bump but the mobile device 41' does not, the controller 34' conveys its digital content data to the controller 44' via the NFC circuit 33', 43', at Blocks 56', 57'”).

It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the invention, to modify the teaching of Forutanpour to include the system that determines different scenarios between different devices and identifies the file transfer priority based on the devices' scenarios or conditions to transfer files between devices, as taught by MOOSAVI. One would have been motivated to make such a combination in order to provide automatic file transmission that is fast, efficient, and conveniently completed using an efficient communication medium to enhance the user experience.

Regarding independent Claims 20 and 41, Claim 20 is directed to a device claim and Claim 41 is directed to a non-transitory computer-readable medium claim; both claims have similar/same claim limitations as Claim 1 and are rejected with the same rationale. 
Regarding Claims 21-24 and 26, Claims 21-24 and 26 are directed to device claims and have similar/same claim limitations as Claims 2-5 and 7 respectively and are rejected with the same rationale.

Regarding Claims 43-46, Claims 43-46 are directed to device claims and have similar/same claim limitations as Claims 2-5 respectively and are rejected with the same rationale.

Regarding Claim 48, Forutanpour, MOOSAVI, and JAIN teach all the limitations of Claim 1. MOOSAVI further teaches the file transfer method further comprising: obtaining third application identification information of the first application (see MOOSAVI: Fig.4, [0030], “wherein the mobile device 31' has an upward orientation prior to the bump but the mobile device 41' does not, the controller 34' conveys its digital content data to the controller 44' via the NFC circuit 33', 43', at Blocks 56', 57'. In the reverse case, i.e., wherein the mobile device 31' does not have an upward orientation prior to the bump but the mobile device 41' does, then the controller 44' conveys its digital content data to the controller 34' via the NFC circuits 33', 43', at Block 59'.”); associating, when the third application identification information matches the first application identification information indicating an identifiable application, the first device with the first preset scenario, wherein the first scenario is the first preset scenario (see MOOSAVI: Fig.8, [0033], “In the case where both mobile devices 31', 41' are facing upwardly just prior to the bump (i.e., they bumped each other), then the controllers 33', 43' exchange respective digital content data, at Block 60'. An example implementation of this case is shown in FIGS. 7 and 8. As seen in FIG. 7, each of the mobile devices 31', 41' is displaying a business card ("Business Card A", "Business Card B") on its respective display 70', 80' and are schematically illustrated as facing upwardly relative to one another. 
After the bump occurs, the digital business cards are exchanged between the mobile devices 31', 41', as seen in FIG. 8.”); and associating, when the third application identification information matches the second application identification information indicating non-identifiable applications, the first device with the second preset scenario, wherein the first scenario is the second preset scenario (see MOOSAVI: for example Fig.5, [0032], “the mobile device 31' is initially displaying a contact "business card" ("Business Card A") on the display 70' prior to contact with the stationary mobile device 41'” (i.e., the Business Card A is an identifiable application and device B represents null for non-identifiable applications because no application is displayed)).

It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the invention, to modify the teaching of Forutanpour to include the system that determines different scenarios between different devices and identifies the file transfer priority based on the devices' scenarios or conditions to transfer files between devices, as taught by MOOSAVI. One would have been motivated to make such a combination in order to provide automatic file transmission that is fast, efficient, and conveniently completed using an efficient communication medium to enhance the user experience.

Regarding Claim 49, Forutanpour, MOOSAVI, and JAIN teach all the limitations of Claim 1. MOOSAVI further teaches the file transfer method further comprising: obtaining third application identification information of the first application (see MOOSAVI: [0033], “In the case where both mobile devices 31', 41' are facing upwardly just prior to the bump (i.e., they bumped each other), then the controllers 33', 43' exchange respective digital content data, at Block 60'. An example implementation of this case is shown in FIGS. 7 and 8. As seen in FIG. 
7, each of the mobile devices 31', 41' is displaying a business card ("Business Card A", "Business Card B") on its respective display 70', 80' and are schematically illustrated as facing upwardly relative to one another. After the bump occurs, the digital business cards are exchanged between the mobile devices 31', 41', as seen in FIG. 8.”); and matching, based on the third application identification information, the first application to an identifiable application or a non-identifiable application (see MOOSAVI: [0033], “In the examples of FIGS. 5-8, both of the mobile devices 31', 41' have orientation detection capabilities (i.e., the orientation sensors 35', 45'). However, referring again to FIG. 1, in some embodiments the electronic device 41 may be stationary or not have orientation detection capabilities. For example, the electronic device 41 may be a personal computer, etc., with NFC capabilities (e.g., an NFC peripheral) but no orientation detection capabilities. Yet, it may still be desirable to utilize the motion of the mobile device 31 to not only transmit digital content data to the electronic device 41, but also to receive digital content data therefrom (or exchange digital content data therewith). These different operations may be invoked by using only the detected orientation of the mobile device 31 to signify a respective data transfer operation.”).

It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the invention, to modify the teaching of Forutanpour to include the system that determines different scenarios between different devices and identifies the file transfer priority based on the devices' scenarios or conditions to transfer files between devices, as taught by MOOSAVI. One would have been motivated to make such a combination in order to provide automatic file transmission that is fast, efficient, and conveniently completed using an efficient communication medium to enhance the user experience. 
Regarding Claim 50, Forutanpour, MOOSAVI, and JAIN teach all the limitations of Claim 1. MOOSAVI further teaches the file transfer method wherein: a first possibility that the first device sends the first file to the second device when the first scenario is the first preset scenario is higher than a second possibility that the first device sends the first file to the second device when the first scenario is the second preset scenario (see MOOSAVI: Fig.4, [0031], “In the case wherein the mobile device 31' has an upward orientation prior to the bump but the mobile device 41' does not, the controller 34' conveys its digital content data to the controller 44' via the NFC circuit 33', 43', at Blocks 56', 57'. In the reverse case, i.e., wherein the mobile device 31' does not have an upward orientation prior to the bump but the mobile device 41' does, then the controller 44' conveys its digital content data to the controller 34' via the NFC circuits 33', 43', at Block 59'.”).

It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the invention, to modify the teaching of Forutanpour to include the system that determines different scenarios between different devices and identifies the file transfer priority based on the devices' scenarios or conditions to transfer files between devices, as taught by MOOSAVI. One would have been motivated to make such a combination in order to provide automatic file transmission that is fast, efficient, and conveniently completed using an efficient communication medium to enhance the user experience.

Response to Arguments

Applicant’s arguments with respect to the claims have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument. 
Applicant argues the newly added claim limitations against the cited references Forutanpour and MOOSAVI; however, JAIN has been cited to teach said limitations.

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure:

US 20130325949 A1 (VIRANI; Barket), "System And Method For Sharing Items Between Electronic Devices": systems and methods for sharing items between electronic devices.
US 20140181686 A1 (Shin; Jungeun), "ELECTRONIC DEVICE AND CONTROL METHOD THEREOF": an electronic device providing a user interface (UI) for sharing items through short-range communication (or near field communication), and a control method thereof.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to ZELALEM W SHALU, whose telephone number is (571) 272-3003. The examiner can normally be reached M-F, 8:00 am - 5:00 pm. Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Cesar Paula, can be reached at (571) 272-4128. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. 
Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000. /Zelalem Shalu/Examiner, Art Unit 2145 /CESAR B PAULA/Supervisory Patent Examiner, Art Unit 2145

Prosecution Timeline

Mar 30, 2021
Application Filed
Mar 09, 2023
Non-Final Rejection — §103
Jun 15, 2023
Response Filed
Sep 17, 2023
Final Rejection — §103
Jan 16, 2024
Response after Non-Final Action
Feb 20, 2024
Request for Continued Examination
Feb 23, 2024
Response after Non-Final Action
Mar 18, 2024
Non-Final Rejection — §103
Jun 20, 2024
Response Filed
Sep 26, 2024
Final Rejection — §103
Dec 20, 2024
Response after Non-Final Action
Jan 14, 2025
Request for Continued Examination
Jan 22, 2025
Response after Non-Final Action
Feb 08, 2025
Non-Final Rejection — §103
May 13, 2025
Response Filed
Aug 21, 2025
Final Rejection — §103
Nov 24, 2025
Response after Non-Final Action
Dec 19, 2025
Request for Continued Examination
Dec 24, 2025
Response after Non-Final Action
Jan 08, 2026
Non-Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12477016
AUTOMATION OF VISUAL INDICATORS FOR DISTINGUISHING ACTIVE SPEAKERS OF USERS DISPLAYED AS THREE-DIMENSIONAL REPRESENTATIONS
2y 5m to grant Granted Nov 18, 2025
Patent 12468969
METHODS FOR CORRELATED HISTOGRAM CLUSTERING FOR MACHINE LEARNING
2y 5m to grant Granted Nov 11, 2025
Patent 12419611
PATIENT MONITOR, PHYSIOLOGICAL INFORMATION MEASUREMENT SYSTEM, PROGRAM TO BE USED IN PATIENT MONITOR, AND NON-TRANSITORY COMPUTER READABLE MEDIUM IN WHICH PROGRAM TO BE USED IN PATIENT MONITOR IS STORED
2y 5m to grant Granted Sep 23, 2025
Patent 12153783
User Interfaces and Methods for Generating a New Artifact Based on Existing Artifacts
2y 5m to grant Granted Nov 26, 2024
Patent 12120422
SYSTEMS AND METHODS FOR CAPTURING AND DISPLAYING MEDIA DURING AN EVENT
2y 5m to grant Granted Oct 15, 2024
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

7-8
Expected OA Rounds
29%
Grant Probability
48%
With Interview (+19.0%)
3y 2m
Median Time to Grant
High
PTA Risk
Based on 108 resolved cases by this examiner. Grant probability derived from career allow rate.
