Prosecution Insights
Last updated: April 19, 2026
Application No. 18/374,344

DEVICE INPUT CONTROL BASED ON SPATIAL ALIGNMENT OF DISPLAYS

Final Rejection (§103, §112)

Filed: Sep 28, 2023
Examiner: ZHOU, HONG
Art Unit: 2629
Tech Center: 2600 — Communications
Assignee: Motorola Mobility LLC
OA Round: 4 (Final)

Grant Probability: 77% (Favorable)
Predicted OA Rounds: 5-6
Predicted Time to Grant: 2y 5m
Grant Probability With Interview: 94%

Examiner Intelligence

Career Allow Rate: 77% (674 granted / 876 resolved; +14.9% vs TC avg), above average
Interview Lift: +17.5% higher allow rate in resolved cases with an interview (a strong lift, roughly +18%)
Typical Timeline: 2y 5m average prosecution; 16 applications currently pending
Career History: 892 total applications across all art units

Statute-Specific Performance

§101: 1.9% (-38.1% vs TC avg)
§102: 23.8% (-16.2% vs TC avg)
§103: 54.6% (+14.6% vs TC avg)
§112: 12.0% (-28.0% vs TC avg)

Deltas are measured against a Tech Center average estimate; based on career data from 876 resolved cases.
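The headline figures above follow from the raw career counts by simple division. A minimal Python sanity check (the implied Tech Center baseline is derived here from the stated +14.9% delta and is an estimate, not a published number):

```python
# Sanity-check the examiner statistics shown above from the raw counts.
granted = 674          # career grants (from the page)
resolved = 876         # career resolved applications (from the page)

career_allow_rate = granted / resolved            # 0.769..., shown as 77%
implied_tc_average = career_allow_rate - 0.149    # from the "+14.9% vs TC avg" delta

print(f"Career allow rate: {career_allow_rate:.1%}")    # 76.9%
print(f"Implied TC average: {implied_tc_average:.1%}")  # 62.0%
```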

Office Action

DETAILED ACTION

Notice of Pre-AIA or AIA Status

1. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Amendment

2. Applicant's amendment filed on August 29, 2025 has been entered. Claims 1, 9, 11 and 16 have been amended. Claims 15 and 19 have been cancelled. Claim 22 has been added. Claims 1-14, 16-18 and 20-22 are pending in this application.

Response to Arguments

3. Applicant's arguments, see pages 10-11 of the Remarks filed August 29, 2025, with respect to the 112 rejections have been considered but they are not persuasive. On pages 11-12 of the response, the Applicant argues that the term "proximity data" has been changed to "data" and that withdrawal of the rejection is respectfully requested; however, the amendments to the claims do not overcome the rejection, as explained below. Therefore, the rejection is maintained.

Applicant's arguments, see pages 13-14 of the response, with respect to the rejection of the claims under 35 U.S.C. 103 in view of the amendments have been fully considered but they are not persuasive. The Examiner respectfully submits that Bian discloses the amended limitation of "determining a location of the cursor on the second electronic device based on a difference between dimensions of the first electronic device to dimensions of the second electronic device; and performing an action indicated by the location of the cursor on the second electronic device". For example, Figs. 5-6 illustrate a difference between resolution dimensions of the PC 502 and the mobile phone 601, and paragraph [0168] of Bian clearly discloses that "the mobile phone may further determine, based on x1 and resolution (for example, a height B) of the PC, a proportion of the location at which the cursor slides out of the display of the PC to a height of the mobile phone.
Based on the proportion and resolution of the mobile phone, a specific location at which the cursor is displayed on the right edge may be determined", which means that a specific location of the cursor 602 displayed on the right edge of the mobile phone 601 is determined based on a difference between a height B of the PC and a height of the mobile phone. Therefore, the rejection is maintained.

Claim Rejections - 35 USC § 112

4. The following is a quotation of the first paragraph of 35 U.S.C. 112(a):

(a) IN GENERAL.—The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor or joint inventor of carrying out the invention.

The following is a quotation of the first paragraph of pre-AIA 35 U.S.C. 112:

The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor of carrying out his invention.

5. Claims 1-14, 16-18 and 20-22 are rejected under 35 U.S.C. 112(a) or 35 U.S.C. 112 (pre-AIA), first paragraph, as failing to comply with the written description requirement. The claim(s) contains subject matter which was not described in the specification in such a way as to reasonably convey to one skilled in the relevant art that the inventor or a joint inventor, or for applications subject to pre-AIA 35 U.S.C. 112, the inventor(s), at the time the application was filed, had possession of the claimed invention.
Regarding claims 1, 11 and 16, the specification does not provide support for "the data received from a proximity sensor of the first electronic device or the second electronic device" and "receiving, from a sensor of the first electronic device or the second electronic device, orientation data indicating an angle of the first electronic device relative to the second electronic device and indicating a location of the second electronic device relative to an edge of the first electronic device." Paragraph [0054] of the PGPub describes detecting a location of the second electronic device relative to the first electronic device (e.g., whether the electronic device 102 is to the left of the electronic device 104, the electronic device 102 is to the right of the electronic device 104, and so forth) by receiving data from sensors or microphones. Paragraphs [0033]-[0034] of the PGPub further describe receiving orientation data indicating an angle of the first electronic device relative to a surface or an angle of the second electronic device relative to a surface. However, nowhere is it described that the data is received from a proximity sensor and that the orientation data indicates an angle of the first electronic device relative to the second electronic device and indicates a location of the second electronic device relative to an edge of the first electronic device.

Claims 2-10, 12-14, 17-18 and 20-22 are rejected due to their dependency from claims 1, 11 and 16.

Claim Rejections - 35 USC § 103

6. The following is a quotation of 35 U.S.C.
103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

7. Claim(s) 1-2, 5, 8-12, 16 and 20-22 are rejected under 35 U.S.C. 103 as being unpatentable over Bian et al. (US 2023/0273812) in view of Kline et al. (US 2019/0302972), and further in view of Marcus et al. (US 2017/0322761).

Regarding claim 1, Bian discloses a method, implemented in a first electronic device (Fig. 1; [0087]-[0095], e.g., a PC 101), the method comprising: receiving, from a sensor of a first electronic device or a second electronic device, data indicating a location of the second electronic device relative to an edge of the first electronic device (Fig.
1; [0161], e.g., a sensor of the PC 101 receives an ultrasonic wave from a mobile phone 102; the ultrasonic wave is used to identify that the mobile phone 102 is at the right edge of the PC 101); automatically enabling, based at least in part on the first electronic device being connected to the second electronic device, a universal input control on the first electronic device ([0135], [0140], e.g., a keyboard and mouse sharing mode is automatically enabled after the PC 101 establishes a connection to the mobile phone 102, and the keyboard and the mouse of the PC 101 can be used to control both the PC 101 and the mobile phone 102); receiving, while the universal input control is enabled, an indication of user input including movement of a cursor from the first electronic device to the second electronic device (Figs. 5-6; [0164]-[0166], e.g., the keyboard and mouse module of the PC receives a mouse movement input including moving the cursor 503 to slide over the right edge of the display of the PC 101 to the mobile phone 102); determining a location of the cursor on the second electronic device based on a difference between dimensions of the first electronic device to dimensions of the second electronic device (Figs. 5-6; [0168], e.g., determine a specific location at which the cursor 602 is displayed on the left edge of the mobile phone based on a difference between a height B of the PC and a height of the mobile phone (e.g., based on a proportion and a height of the PC and a proportion of a height of the mobile phone)); performing an action indicated by the location of the cursor on the second electronic device (Fig. 6; [0168], e.g., display the cursor 602 in the specific location at the left edge of the mobile phone).
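The proportion-based cursor mapping the examiner attributes to Bian (para. [0168]) reduces to a one-line scaling computation. A minimal Python sketch, with hypothetical function and parameter names (not from Bian or the claims):

```python
def map_cursor_to_second_display(y_exit: float,
                                 height_first: float,
                                 height_second: float) -> float:
    """Map the cursor's exit position on the first display's edge to an
    entry position on the second display's opposite edge, scaling by the
    ratio of the two display heights (the proportion-based approach the
    Office Action attributes to Bian, para. [0168])."""
    proportion = y_exit / height_first   # fraction of the way down the exit edge
    return proportion * height_second    # same fraction of the second display's height

# Example: cursor leaves a 1080-px-tall PC display at y = 540 (halfway down);
# it appears halfway down a 2400-px-tall phone display.
print(map_cursor_to_second_display(540, 1080, 2400))  # → 1200.0
```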
Bian does not disclose the method comprising: receiving data related to a proximity of the first electronic device in relation to a second electronic device that is separate from the first electronic device, the data received from a proximity sensor of the first electronic device or the second electronic device; receiving, from a sensor of the first electronic device or the second electronic device, orientation data indicating an angle of the first electronic device relative to the second electronic device; automatically detecting that the display of the first electronic device is spatially aligned with the display of a second electronic device, including the display of the first electronic device facing approximately a same direction of the display of the second electronic device and located within a proximity of the second electronic device, based on the data and the orientation data; automatically enabling, based at least in part on a display of the first electronic device being spatially aligned with a display of the second electronic device, the universal input control. However, Kline discloses a method comprising: receiving, from a sensor of a first electronic device or a second electronic device, orientation data indicating an angle of the first electronic device relative to the second electronic device ([0018], [0020]-[0021], e.g., the differences in the orientation of the displays must be within an alignment threshold, such as five-seven degrees in each direction. The orientations can be determined and compared by obtaining measurements from accelerometers, gyroscopes and cameras); detecting that a display of the first electronic device is spatially aligned with a display of the second electronic device, including the display of the first electronic device facing approximately a same direction of the display of the second electronic device based on the orientation data (Fig. 
3; [0018], [0021], e.g., the determination that the display of the first electronic device is facing approximately a same direction of the display of the second electronic device is made by analyzing images captured by cameras that are normal to the displays of the display devices), and automatically enabling, based at least in part on the display of the first electronic device being spatially aligned with the display of the second electronic device (Figs. 1-3; [0020]-[0022], e.g., detecting a display of a mobile device 110 is spatially aligned with a display of a wearable device 120) and on the first electronic device being connected with the second electronic device (e.g., pair the mobile device 110 with the wearable device 120), controlling the display of the first electronic device with the second electronic device (see [0020], [0022], e.g., switching an active application on the first display device responsive to receiving a selection of one of the open applications on the second display device).

It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to use the teachings of Kline in the invention of Bian for automatically enabling universal input control on a first electronic device based at least in part on a display of the first electronic device being spatially aligned with a display of a second electronic device because when displays of two devices are spatially aligned, the user's interaction with both display devices will be more effective.
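The combined condition the rejection assembles from Kline (orientation difference within an alignment threshold, e.g., five to seven degrees) and Marcus (devices within a proximity threshold) can be sketched as a single predicate. The function name and the specific threshold values below are illustrative assumptions, not taken from the cited references:

```python
def displays_spatially_aligned(angle_first_deg: float,
                               angle_second_deg: float,
                               distance_cm: float,
                               angle_threshold_deg: float = 7.0,
                               proximity_threshold_cm: float = 30.0) -> bool:
    """Return True when the two displays face approximately the same
    direction (orientation delta within the alignment threshold, per the
    Kline citation) and sit within a proximity threshold of each other
    (per the Marcus citation). Both thresholds are illustrative."""
    facing_same_direction = abs(angle_first_deg - angle_second_deg) <= angle_threshold_deg
    within_proximity = distance_cm <= proximity_threshold_cm
    return facing_same_direction and within_proximity

# 5° orientation difference at 20 cm: aligned; 15° difference: not aligned.
print(displays_spatially_aligned(0.0, 5.0, 20.0))   # → True
print(displays_spatially_aligned(0.0, 15.0, 20.0))  # → False
```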
Bian in view of Kline does not specifically disclose the method comprising: receiving data related to a proximity of the first electronic device in relation to a second electronic device that is separate from the first electronic device, the data received from a proximity sensor of the first electronic device or the second electronic device; detecting that the display of the first electronic device is spatially aligned with the display of the second electronic device including the display of the first electronic device located within a proximity of the second electronic device based on the data. However, Marcus discloses a method comprising: receiving proximity data related to a first electronic device in relation to a second electronic device that is separate from the first electronic device, the proximity data received from a proximity sensor of the first electronic device or the second electronic device (Figs 3A-3B and 4; [0026]-[0027], [0031], [0036], e.g., receiving proximity data indicating a distance between the first electronic display device 20 and the second electronic display device 42); and detecting that a first display of the first electronic device is spatially aligned with a second display of the second electronic device including the first display of the first electronic device facing approximately a same direction of the second display of the second electronic device and located within a proximity of the second electronic device, based on the proximity data and orientation data (Fig. 7; [0042]-[0043], [0049], e.g., the arrangement of the display group may be determined based upon the proximity data and/or the orientation data associated with each of the body-worn electronic display devices and the arrangement 148 is a straight line where each wearer of a body-worn electronic display device is facing the same direction and standing shoulder to shoulder). 
It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to use the teachings of Marcus in the invention of Bian in view of Kline for detecting a distance between a first electronic device and a second electronic device based on received proximity data and detecting that a display of the first electronic device is spatially aligned with a display of the second electronic device if the distance between the first electronic device and the second electronic device is within a proximity threshold because it would allow a controller to determine a distributed image for display on the first and second displays based on the proximity data ([0018] of Marcus).

Regarding claim 2, Bian further discloses the method of claim 1, further comprising communicating, to the second electronic device, that universal input control for the first electronic device and the second electronic device has been enabled ([0137], e.g., by creating the virtual input device responding to a mouse event, the mobile phone communicates to the PC that the keyboard and mouse sharing mode has been enabled).

Regarding claim 5, Bian in view of Kline and Marcus further discloses the method of claim 1, wherein the detecting that the display of the first electronic device is spatially aligned with the display of the second electronic device is in response to the display of the first electronic device being positioned at an angle within a threshold number of degrees of an angle at which the display of the second electronic device is positioned (Kline, [0021], e.g., the differences in the orientation of the displays must be within an alignment threshold, such as five-seven degrees).

Regarding claim 8, Bian further discloses the method of claim 1, wherein the first electronic device comprises a mobile phone and the second electronic device comprises a laptop computer (Fig.
1, [0127], e.g., the first electronic device 102 is a mobile phone and the second electronic device is a laptop computer).

Regarding claim 9, Bian further discloses the method of claim 1, wherein the cursor is controlled by an input control device, including a pointer control device ([0127], e.g., the cursor is controlled by an input control device 101-1).

Regarding claim 10, Bian further discloses the method of claim 1, wherein a user interface displayed on the first electronic device is different than a user interface displayed on the second electronic device, and the user interface displayed on the first electronic device is not displayed on the second electronic device (Fig. 7, e.g., a user interface 704 displayed on the first electronic device is different from a second user interface 703 displayed on the second electronic device).

Regarding claim 11, Bian discloses a first electronic device (Fig. 1; e.g., a PC 101), comprising: a processor implemented in hardware (Fig. 2A; e.g., processor 110); and a computer-readable storage memory having stored thereon multiple instructions that, responsive to execution by the processor ([0102], e.g., a memory stores instructions), cause the first electronic device to: receive, from a sensor of the first electronic device or a second electronic device, data indicating a location of the second electronic device relative to an edge of the first electronic device (Fig.
1; [0161], e.g., a sensor of the PC 101 receives an ultrasonic wave from a mobile phone 102; the ultrasonic wave is used to identify that the mobile phone 102 is at the right edge of the PC 101); automatically activate, on the first electronic device being connected to a second electronic device, a universal input control on the first electronic device configured to interact with a display of the first electronic device and the display of a second electronic device (Figs. 5-6; [0135], [0140], e.g., a keyboard and mouse sharing mode is automatically enabled after the mobile phone 102 establishes a connection to a PC 101, and the keyboard and the mouse of the PC 101 can be used to interact with the display of the mobile phone 102 and the display of the PC 101); receive, while the universal input control is enabled, an indication of user input including movement of a cursor from the first electronic device to the second electronic device (Figs. 5-6; [0164]-[0166], e.g., the keyboard and mouse module of the PC receives a mouse movement input including moving the cursor 503 to slide over the right edge of the display of the PC 101 to the mobile phone 102); determine a location of the cursor on the second electronic device based on a difference between dimensions of the first electronic device to dimensions of the second electronic device (Figs. 5-6; [0168], e.g., determine a specific location at which the cursor 602 is displayed on the left edge of the mobile phone based on a difference between a height B of the PC and a height of the mobile phone (e.g., based on a proportion and a height of the PC and a proportion of a height of the mobile phone)); perform an action indicated by the location of the cursor on the second electronic device (Fig. 6; [0168], e.g., display the cursor 602 in the specific location at the left edge of the mobile phone).
Bian does not disclose the processor causes the first electronic device to: receiving data related to a proximity of the first electronic device in relation to the second electronic device that is separate from the first electronic device, the data received from a proximity sensor of the first electronic device or the second electronic device; receiving, from a sensor of the first electronic device or the second electronic device, orientation data indicating an angle of the first electronic device relative to the second electronic device; automatically detect that the display of the first electronic device is spatially aligned with the display of a second electronic device, including the display of the first electronic device facing approximately a same direction of the display of the second electronic device and located within a proximity of the second electronic device, based on the data and the orientation data; and automatically activate, based at least in part on the display of the first electronic device being spatially aligned with the display of the second electronic device, the universal input control on the first electronic device. However, Kline discloses a method comprising: receiving, from a sensor of a first electronic device or a second electronic device, orientation data indicating an angle of the first electronic device relative to the second electronic device ([0018], [0020]-[0021], e.g., the differences in the orientation of the displays must be within an alignment threshold, such as five-seven degrees in each direction. The orientations can be determined and compared by obtaining measurements from accelerometers, gyroscopes and cameras); automatically detecting that a display of the first electronic device is spatially aligned with a display of the second electronic device, including the display of the first electronic device facing approximately a same direction of the display of the second electronic device based on the orientation data (Fig. 
3; [0018], [0021], e.g., the determination that the display of the first electronic device is facing approximately a same direction of the display of the second electronic device is made by analyzing images captured by cameras that are normal to the displays of the display devices), and automatically enabling, based at least in part on the display of the first electronic device being spatially aligned with the display of the second electronic device (Figs. 1-3; [0020]-[0022], e.g., detecting a display of a mobile device 110 is spatially aligned with a display of a wearable device 120) and on the first electronic device being connected with the second electronic device (e.g., pair the mobile device 110 with the wearable device 120), controlling the display of the first electronic device with the second electronic device (see [0020], [0022], e.g., switching an active application on the first display device responsive to receiving a selection of one of the open applications on the second display device).

It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to use the teachings of Kline in the invention of Bian for automatically enabling universal input control on a first electronic device based at least in part on a display of the first electronic device being spatially aligned with a display of a second electronic device because when displays of two display devices are spatially aligned, the user's interaction with both display devices will be more effective.
Bian in view of Kline does not specifically disclose: receiving proximity data related to the first electronic device in relation to a second electronic device that is separate from the first electronic device, the proximity data received from a proximity sensor of the first electronic device or the second electronic device; detecting that the display of the first electronic device is spatially aligned with the display of the second electronic device including the display of the first electronic device located within a proximity of the second electronic device based on the proximity data. However, Marcus discloses a method comprising: receiving proximity data related to a first electronic device in relation to a second electronic device that is separate from the first electronic device, the proximity data received from a proximity sensor of the first electronic device or the second electronic device (Figs 3A-3B and 4; [0026]-[0027], [0031], [0036], e.g., receiving proximity data indicating a distance between the first electronic display device 20 and the second electronic display device 42); and detecting that a first display of the first electronic device is spatially aligned with a second display of the second electronic device including the first display of the first electronic device facing approximately a same direction of the second display of the second electronic device and located within a proximity of the second electronic device, based on the proximity data and orientation data (Fig. 7; [0042]-[0043], [0049], e.g., the arrangement of the display group may be determined based upon the proximity data and/or the orientation data associated with each of the body-worn electronic display devices and the arrangement 148 is a straight line where each wearer of a body-worn electronic display device is facing the same direction and standing shoulder to shoulder). 
It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to use the teachings of Marcus in the invention of Bian in view of Kline for detecting a distance between a first electronic device and a second electronic device based on received proximity data and detecting that a display of the first electronic device is spatially aligned with a display of the second electronic device if the distance between the first electronic device and the second electronic device is within a proximity threshold because it would allow a controller to determine a distributed image for display on the first and second displays based on the proximity data ([0018] of Marcus).

Regarding claim 12, Bian further discloses the first electronic device of claim 11, wherein the processor causes the first electronic device to communicate, to the second electronic device, that universal input control for the first electronic device and the second electronic device has been enabled ([0137], e.g., by creating the virtual input device responding to a mouse event, the mobile phone communicates to the PC that the keyboard and mouse sharing mode has been enabled).

Regarding claim 16, Bian discloses a first electronic device (Fig. 1; [0087], e.g., a PC 101), comprising: a display (e.g., display 101-2); a connection determination system, implemented at least in part in hardware (Fig. 2A; [0090], [0110], e.g., module 160), configured to: receive, from a sensor of the first electronic device or a second electronic device, data indicating a location of the second electronic device relative to an edge of the first electronic device (Fig.
1; [0161], e.g., a sensor of the PC 101 receives an ultrasonic wave from a mobile phone 102; the ultrasonic wave is used to identify that the mobile phone 102 is at the right edge of the PC 101); automatically enable, on the first electronic device being connected to the second electronic device, a universal input control on the first electronic device configured to interact with the display of the first electronic device and the display of the second electronic device (Figs. 5-6; [0135], [0140], e.g., a keyboard and mouse sharing mode is automatically enabled after the mobile phone 102 establishes a connection to a PC 101, and the keyboard and the mouse of the PC 101 can be used to interact with the display of the mobile phone 102 and the PC 101); and an input control module (Fig. 2A; e.g., processor 110), implemented at least in part in hardware, configured to receive, while the universal input control is enabled, an indication of user input including movement of a cursor from the first electronic device to the second electronic device (Figs. 5-6; [0164]-[0166], e.g., the keyboard and mouse module of the PC receives a mouse movement input including moving the cursor 503 to slide over the right edge of the display of the PC 101 to the mobile phone 102); determine a location of the cursor on the second electronic device based on a difference between dimensions of the first electronic device to dimensions of the second electronic device (Figs. 5-6; [0168], e.g., determine a specific location at which the cursor 602 is displayed on the left edge of the mobile phone based on a difference between a height B of the PC and a height of the mobile phone (e.g., based on a proportion and a height of the PC and a proportion of a height of the mobile phone)); perform an action indicated by the location of the cursor on the second electronic device (Fig. 6; [0168], e.g., display the cursor 602 in the specific location at the left edge of the mobile phone).
Bian does not specifically disclose the first electronic device comprising: a spatial alignment determination system, implemented at least in part in hardware, configured to: receive data related to a proximity of the first electronic device in relation to the second electronic device that is separate from the first electronic device, the proximity data received from a proximity sensor of the first electronic device or the second electronic device; receive, from a sensor of the first electronic device or the second electronic device, orientation data indicating an angle of the first electronic device relative to the second electronic device; automatically detect that the display of the first electronic device is spatially aligned with the display of a second electronic device, including the display of the first electronic device facing approximately a same direction of the display of the second electronic device and located within a proximity of the second electronic device, based on the proximity data and the orientation data; and automatically enable, based at least in part on the display of the first electronic device being spatially aligned with the display of the second electronic device, the universal input control on the first electronic device.

However, Kline discloses a first electronic device (Fig. 1; e.g., a mobile device) comprising: a spatial alignment determination system ([0017]-[0018], e.g., the sensors 113 and the camera 114), implemented at least in part in hardware, configured to: receive, from a sensor of the first electronic device or a second electronic device, orientation data indicating an angle of the first electronic device relative to the second electronic device ([0018], [0020]-[0021], e.g., the differences in the orientation of the displays must be within an alignment threshold, such as five-seven degrees in each direction.
The orientations can be determined and compared by obtaining measurements from accelerometers, gyroscopes and cameras); automatically detect that a display of the first electronic device is spatially aligned with a display of the second electronic device, including the display of the first electronic device facing approximately a same direction of the display of the second electronic device based on the orientation data (Fig. 3; [0018], [0021], e.g., the determination that the display of the first electronic device is facing approximately a same direction of the display of the second electronic device is made by analyzing images captured by cameras that are normal to the displays of the display devices), and automatically enable, based at least in part on the display of the first electronic device being spatially aligned with the display of the second electronic device (Figs. 1-3; [0020]-[0022], e.g., detecting a display of a mobile device 110 is spatially aligned with a display of a wearable device 120) and on the first electronic device being connected with the second electronic device (e.g., pair the mobile device 110 with the wearable device 120), controlling the display of the first electronic device with the second electronic device (see [0020], [0022], e.g., switching an active application on the first display device responsive to receiving a selection of one of the open applications on the second display device).

It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to use the teachings of Kline in the invention of Bian for automatically enabling universal input control on a first electronic device based at least in part on a display of the first electronic device being spatially aligned with a display of a second electronic device because when displays of two display devices are spatially aligned, the user's interaction with both display devices will be more effective.
Bian in view of Kline does not specifically disclose: receiving proximity data related to the first electronic device in relation to a second electronic device that is separate from the first electronic device, the proximity data received from a proximity sensor of the first electronic device or the second electronic device; detecting that the display of the first electronic device is spatially aligned with the display of the second electronic device including the display of the first electronic device located within a proximity of the second electronic device based on the proximity data.

However, Marcus discloses a method comprising: receiving proximity data related to a first electronic device in relation to a second electronic device that is separate from the first electronic device, the proximity data received from a proximity sensor of the first electronic device or the second electronic device (Figs 3A-3B and 4; [0026]-[0027], [0031], [0036], e.g., receiving proximity data indicating a distance between the first electronic display device 20 and the second electronic display device 42); and detecting that a first display of the first electronic device is spatially aligned with a second display of the second electronic device including the first display of the first electronic device facing approximately a same direction of the second display of the second electronic device and located within a proximity of the second electronic device, based on the proximity data and orientation data (Fig. 7; [0042]-[0043], [0049], e.g., the arrangement of the display group may be determined based upon the proximity data and/or the orientation data associated with each of the body-worn electronic display devices and the arrangement 148 is a straight line where each wearer of a body-worn electronic display device is facing the same direction and standing shoulder to shoulder).
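The claimed combination of proximity data and orientation data gating the universal input control can be illustrated with a minimal sketch. The thresholds, parameter names, and the string return values below are assumptions for illustration, not details taken from the claims or the cited references.

```python
# Hedged sketch: gate "universal input control" on both a proximity
# check and an orientation check, as the claim recites. Thresholds and
# names are illustrative assumptions.

PROXIMITY_THRESHOLD_M = 0.5  # assumed "within a proximity" cutoff
ANGLE_THRESHOLD_DEG = 7.0    # assumed "approximately same direction" cutoff

def spatially_aligned(distance_m, angle_between_normals_deg):
    """distance_m: reading from a proximity sensor on either device;
    angle: angle between the two display normals, from orientation data."""
    within_proximity = distance_m <= PROXIMITY_THRESHOLD_M
    facing_same_direction = angle_between_normals_deg <= ANGLE_THRESHOLD_DEG
    return within_proximity and facing_same_direction

def update_universal_input_control(distance_m, angle_deg):
    # Enable while aligned; disable once alignment is lost (cf. Pasquero).
    return "enabled" if spatially_aligned(distance_m, angle_deg) else "disabled"

print(update_universal_input_control(0.3, 4.0))   # enabled
print(update_universal_input_control(0.3, 30.0))  # disabled (angle too large)
print(update_universal_input_control(2.0, 4.0))   # disabled (too far apart)
```

Note that both conditions must hold at once, mirroring the claim's conjunction of "facing approximately a same direction" and "located within a proximity."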
It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to use the teachings of Marcus in the invention of Bian in view of Kline for detecting a distance between a first electronic device and a second electronic device based on received proximity data and detecting that a display of the first electronic device is spatially aligned with a display of the second electronic device if the distance between the first electronic device and the second electronic device is within a proximity threshold because it would allow a controller to determine a distributed image for display on the first and second displays based on the proximity data ([0018] of Marcus).

Regarding claim 20, Bian further discloses the first electronic device of claim 16, wherein a user interface displayed on the first electronic device is different than a user interface displayed on the second electronic device, and the user interface displayed on the first electronic device is not displayed on the second electronic device (Fig. 7, e.g., a user interface 704 displayed on the first electronic device is different from a second user interface 703 displayed on the second electronic device).

Regarding claim 21, Bian further discloses the method of claim 1, wherein the sensor includes at least one of an accelerometer or a magnetometer ([0018], e.g., accelerometer 113).

Regarding claim 22, Bian further discloses the method of claim 1, further comprising a different action based on detecting that the location of the second electronic device has changed to a different edge of the first electronic device (Figs 5-6; [0161], e.g., when it is determined that the mobile phone has changed to the left edge of the PC, the cursor slides out of the left edge of the display of the PC and displays in a specific location on the right edge of the mobile phone).

8. Claim(s) 3-4 and 13-14 are rejected under 35 U.S.C. 103 as being unpatentable over Bian et al. (US 2023/0273812) in view of Kline et al. (US 2019/0302972) and Marcus et al. (US 2017/0322761), and further in view of Chun et al. (US 2017/0075640).

Regarding claim 3, Bian in view of Kline and Marcus further discloses the method of claim 1, wherein the detecting that the display of the first electronic device is spatially aligned with the display of the second electronic device is based on determining that both the display of the first electronic device and the display of the second electronic device facing a same direction (Kline, Fig. 3, e.g., the displays of the display devices are facing a same direction). Bian in view of Kline and Marcus does not specifically disclose wherein the detecting that the display of the first electronic device is spatially aligned with the display of the second electronic device is based on determining that both the display of the first electronic device and the display of the second electronic device being positioned approximately vertically.

However, Chun discloses a method comprising: detecting that a first display of an electronic device is spatially aligned with a second display of the electronic device is based on determining that both the display of the first electronic device and the display of the second electronic device being positioned approximately vertically and both the display of the first electronic device and the display of the second electronic device facing a same direction (Figs 14A and 16A, [0111]-[0115], [0141], e.g., both the display devices 410 and 420 are being positioned approximately vertically or horizontally and the electronic device 400 can display contents of the first display 410 and the second display 420 in the same direction).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to use the teachings of Chun in the invention of Bian in view of Kline and Marcus for detecting that a display of a first electronic device is spatially aligned with a display of a second electronic device based on determining that both the display of the first electronic device and the display of the second electronic device being positioned approximately vertically and both the display of the first electronic device and the display of the second electronic device facing a same direction so that images can be displayed adjacent to each other on two vertically arranged displays.

Regarding claim 4, Bian in view of Kline and Marcus further discloses the method of claim 1, wherein the detecting that the display of the first electronic device is spatially aligned with the display of the second electronic device is based on determining that both the display of the first electronic device and the display of the second electronic device facing a same direction (Kline, Fig. 3, e.g., the displays of the display devices are facing a same direction).

Bian in view of Kline and Marcus does not specifically disclose wherein the detecting that the display of the first electronic device is spatially aligned with the display of the second electronic device is based on determining that both the display of the first electronic device and the display of the second electronic device being positioned approximately horizontally.
However, Chun discloses a method comprising: detecting that a first display of an electronic device is spatially aligned with a second display of the electronic device is based on determining that both the display of the first electronic device and the display of the second electronic device being positioned approximately horizontally and both the display of the first electronic device and the display of the second electronic device facing a same direction (Figs 14A and 16A, [0111]-[0115], [0141], e.g., both the display devices 410 and 420 are being positioned approximately vertically or horizontally and the electronic device 400 can display contents of the first display 410 and the second display 420 in the same direction).

It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to use the teachings of Chun in the invention of Bian in view of Kline and Marcus for detecting that a display of a first electronic device is spatially aligned with a display of a second electronic device based on determining that both the display of the first electronic device and the display of the second electronic device being positioned approximately horizontally and both the display of the first electronic device and the display of the second electronic device facing a same direction so that images can be displayed adjacent to each other on two horizontally arranged displays.

Regarding claim 13, Bian in view of Kline and Marcus further discloses the first electronic device of claim 11, wherein the processor causes the first electronic device to automatically detect that the display of the first electronic device is spatially aligned with the display of the second electronic device in response to both the display of the first electronic device and the display of the second electronic device facing a same direction (Kline, Fig. 3, e.g., the displays of the display devices are facing the same direction).
Bian in view of Kline and Marcus does not specifically disclose wherein the detecting that the display of the first electronic device is spatially aligned with the display of the second electronic device is based on determining that both the display of the first electronic device and the display of the second electronic device being positioned approximately vertically.

However, Chun discloses a method comprising: detecting that a first display of an electronic device is spatially aligned with a second display of the electronic device is based on determining that both the display of the first electronic device and the display of the second electronic device being positioned approximately vertically and both the display of the first electronic device and the display of the second electronic device facing a same direction (Figs 14A and 16A, [0111]-[0115], [0141], e.g., both the display devices 410 and 420 are being positioned approximately vertically or horizontally and the electronic device 400 can display contents of the first display 410 and the second display 420 in the same direction).

It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to use the teachings of Chun in the invention of Bian in view of Kline and Marcus for detecting that a display of a first electronic device is spatially aligned with a display of a second electronic device based on determining that both the display of the first electronic device and the display of the second electronic device being positioned approximately vertically and both the display of the first electronic device and the display of the second electronic device facing a same direction so that images can be displayed adjacent to each other on two vertically arranged displays.
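Chun's "approximately vertically" / "approximately horizontally" determination could be sketched as follows, assuming each device can report its display's tilt from vertical via accelerometer gravity readings. The tolerance value and all names are illustrative assumptions, not details from Chun.

```python
# Hedged sketch: classify each display's posture as "vertical" or
# "horizontal" from its tilt relative to gravity, then require both
# displays to share that posture, in the spirit of the Chun mapping.

VERTICAL_TOLERANCE_DEG = 10.0  # assumed tolerance for "approximately"

def posture(tilt_from_vertical_deg):
    """tilt_from_vertical_deg: angle between the display plane and
    vertical, e.g. derived from accelerometer gravity readings."""
    if abs(tilt_from_vertical_deg) <= VERTICAL_TOLERANCE_DEG:
        return "vertical"
    if abs(tilt_from_vertical_deg - 90) <= VERTICAL_TOLERANCE_DEG:
        return "horizontal"
    return "oblique"

def postures_match(tilt_a, tilt_b):
    # Both displays must be approximately vertical, or both approximately
    # horizontal; an oblique display never matches.
    return posture(tilt_a) == posture(tilt_b) and posture(tilt_a) != "oblique"

print(postures_match(3.0, 6.0))   # True  (both approximately vertical)
print(postures_match(88.0, 4.0))  # False (horizontal vs. vertical)
```

A laptop lid and a propped-up phone would both classify as roughly vertical here, while a phone lying flat on a desk would classify as horizontal and therefore not match.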
Regarding claim 14, Bian in view of Kline and Marcus further discloses the first electronic device of claim 11, wherein the processor causes the first electronic device to automatically detect that the display of the first electronic device is spatially aligned with the display of the second electronic device in response to both the display of the first electronic device and the display of the second electronic device facing a same direction (Kline, Fig. 3, e.g., the displays of the display devices are facing the same direction).

Bian in view of Kline and Marcus does not specifically disclose wherein the detecting that the display of the first electronic device is spatially aligned with the display of the second electronic device is based on determining that both the display of the first electronic device and the display of the second electronic device being positioned approximately horizontally.

However, Chun discloses a method comprising: detecting that a first display of an electronic device is spatially aligned with a second display of the electronic device is based on determining that both the display of the first electronic device and the display of the second electronic device being positioned approximately horizontally and both the display of the first electronic device and the display of the second electronic device facing a same direction (Figs 14A and 16A, [0111]-[0115], [0141], e.g., both the display devices 410 and 420 are being positioned approximately vertically or horizontally and the electronic device 400 can display contents of the first display 410 and the second display 420 in the same direction).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to use the teachings of Chun in the invention of Bian in view of Kline and Marcus for detecting that a display of a first electronic device is spatially aligned with a display of a second electronic device based on determining that both the display of the first electronic device and the display of the second electronic device being positioned approximately horizontally and both the display of the first electronic device and the display of the second electronic device facing a same direction so that images can be displayed adjacent to each other on two horizontally arranged displays.

9. Claim(s) 6 and 17 are rejected under 35 U.S.C. 103 as being unpatentable over Bian et al. (US 2023/0273812) in view of Kline et al. (US 2019/0302972) and Marcus et al. (US 2017/0322761), and further in view of Pasquero et al. (US 2013/0111370).

Regarding claim 6, Bian in view of Kline and Marcus does not specifically disclose the method of claim 1, further comprising disabling, based at least in part on the display of the first electronic device no longer being spatially aligned with the display of the second electronic device, the universal input control.

However, Pasquero discloses a method comprising: automatically enabling, based at least in part on a display of a first electronic device being spatially aligned with a display of a second electronic device (Figs 9-10; [0161]-[0167], e.g., detecting a display of an electronic device 201 is spatially aligned with a display of an electronic device 2 if the first electronic device 201 and the second electronic device 201 are oriented along a common plane) and on the first electronic device being connected to the second electronic device ([0160], e.g., the first electronic device 201 and the second electronic device 201 communicate with each other, see Fig. 4, step 402), a common UI mode and a universal input control on the first electronic device ([0117], [0170], e.g., in the common UI mode, one or more input interface 206 associated with one of the electronic devices 201 may be used to provide input to another one of the electronic devices 201).

Pasquero further discloses the method further comprising disabling, based at least in part on the display of the first electronic device no longer being spatially aligned with the display of the second electronic device, the universal input control ([0117], [0189], e.g., if the first electronic device and the second electronic device become oriented along different planes, then the common UI mode and the universal input control may be ended).

It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to use the teachings of Pasquero in the invention of Bian in view of Kline and Marcus for disabling universal input control based at least in part on the display of the first electronic device no longer being spatially aligned with the display of the second electronic device so that orientation data from the first electronic device and the second electronic device can be used to determine whether to end the universal input control.

Regarding claim 17, Bian in view of Kline and Marcus does not specifically disclose the first electronic device of claim 16, wherein the spatial alignment determination system is configured to disable, based at least in part on the display of the first electronic device no longer being spatially aligned with the display of the second electronic device, the universal input control.

However, Pasquero discloses a first electronic device (Figs 1 and 11; [0157]-[0158], e.g., a first electronic device 201) comprising: a spatial alignment determination system (Fig. 1; e.g., processor 240 and orientation system 249), implemented at least in part in hardware, configured to automatically enable, based at least in part on a display of the first electronic device being spatially aligned with a display of a second electronic device (Figs 9-10; [0161]-[0167], e.g., detecting a display of the electronic device 201 is spatially aligned with a display of an electronic device 2 if the first electronic device 201 and the second electronic device 201 satisfy one or more predetermined orientation criterion) and on the first electronic device being connected to the second electronic device ([0160], e.g., the first electronic device 201 and the second electronic device 201 communicate with each other, see Fig. 4, step 402), a common UI mode and a universal input control on the first electronic device ([0117], [0170], e.g., in the common UI mode, one or more input interface 206 associated with one of the electronic devices 201 may be used to provide input to another one of the electronic devices 201).

Pasquero further discloses the first electronic device, wherein the spatial alignment determination system is configured to disable, based at least in part on the display of the first electronic device no longer being spatially aligned with the display of the second electronic device, the universal input control ([0117], [0189], e.g., if the first electronic device and the second electronic device become oriented along different planes, then the common UI mode and the universal input control may be ended).

It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to use the teachings

Prosecution Timeline

Sep 28, 2023: Application Filed
Jan 24, 2025: Non-Final Rejection — §103, §112
Mar 26, 2025: Applicant Interview (Telephonic)
Mar 26, 2025: Examiner Interview Summary
Mar 28, 2025: Response Filed
May 09, 2025: Final Rejection — §103, §112
Jun 26, 2025: Examiner Interview Summary
Jun 26, 2025: Applicant Interview (Telephonic)
Jun 28, 2025: Request for Continued Examination
Jun 30, 2025: Response after Non-Final Action
Jul 25, 2025: Non-Final Rejection — §103, §112
Aug 22, 2025: Examiner Interview Summary
Aug 22, 2025: Applicant Interview (Telephonic)
Aug 29, 2025: Response Filed
Oct 24, 2025: Final Rejection — §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602133: TOUCH DRIVER, TOUCH DEVICE, AND DISPLAY DEVICE INCLUDING THE SAME (granted Apr 14, 2026; 2y 5m to grant)
Patent 12585364: INPUT DEVICE WITH PROGRAMMABLE STRIPS FOR PERFORMING OPERATIONS ON A DISPLAY OF AN ELECTRONIC DEVICE (granted Mar 24, 2026; 2y 5m to grant)
Patent 12579926: DISPLAY DEVICE (granted Mar 17, 2026; 2y 5m to grant)
Patent 12578819: TOUCH PANEL WITH ISO-RESISTANCE COMPENSATION PATTERN (granted Mar 17, 2026; 2y 5m to grant)
Patent 12572242: TOUCH DISPLAY PANEL AND MANUFACTURING METHOD THEREFOR (granted Mar 10, 2026; 2y 5m to grant)
Based on this examiner's 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 5-6
Grant Probability: 77%
With Interview: 94% (+17.5%)
Median Time to Grant: 2y 5m
PTA Risk: High
Based on 876 resolved cases by this examiner. Grant probability derived from career allow rate.
