Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
DETAILED ACTION
Examiner cites particular columns or paragraphs, and line numbers in the references as applied to the claims below for the convenience of the applicant. Although the specified citations are representative of the teachings in the art and are applied to the specific limitations within the individual claim, other passages and figures may apply as well. It is respectfully requested that, in preparing responses, the applicant fully consider the references in their entirety as potentially teaching all or part of the claimed invention, as well as the context of the passage as taught by the prior art or disclosed by the examiner.
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
In reply to the Non-Final Office Action mailed on 10/22/2025, the applicant filed a response on 1/15/2026 amending claims 1, 4, 6, 10, 13, 15 and 19. No claims have been added or cancelled. Claims 1-20 are pending in this application.
The previous claim objection is withdrawn in view of applicant’s amendments filed on 1/15/2026.
Claim Objections
Claim 1 is objected to because of the following informalities: the claim recites “a sensor system of a aircraft” in lines 2-3, which appears intended to read “a sensor system of an aircraft”. Appropriate correction is required.
Claims 2-9 are objected to based on their dependence from claim 1.
Claim 10 is objected to because of the following informalities: the claim recites “one or more sensors of a aircraft” in line 2, which appears intended to read “one or more sensors of an aircraft”. Appropriate correction is required.
Claims 11-18 are objected to based on their dependence from claim 10.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-5, 7, 10-14, 16 and 19-20 are rejected under 35 U.S.C. 103 as being unpatentable over Meyer et al. (US 2023/0086516), in view of Manfred et al. (US 2005/0278120).
Regarding claim 1, Meyer discloses a method (see Figs.8-9), comprising:
receiving, by one or more processors, sensor data from a sensor system (para[0038]-para[0041]; para[0072]; para[0087]; para[0090]; para[0100]; para[0105]; para[0111]; para[0113]; para[0120]-para[0123]; see processors 1002 and sensing system 1024, in Fig. 10 receiving data for “detecting motion (e.g., absolute motion) of the electronic device (e.g., using a motion sensing system of the electronic device)” (see in 806 in Fig. 8; see also 906 in Fig. 9); see e.g. detected motion 113 of device 100, as shown in Figs. 1A-1B, and other examples as shown in Figs. 2A-7B and corresponding paragraphs; note that “the device may detect a usage context corresponding to bicycle travel, walking, jogging, train travel, airplane travel, or the like”);
processing, by the one or more processors, the sensor data to detect and characterize a perturbance (para[0030]; para[0038]-para[0043]; para[0095]-para[0096]; para[0098]; para[0106]-para[0107]; para[0111]; para[0113]; para[0120]-para[0121]; e.g. according to sensed movements/perturbances data, “certain motion characteristics of a detected motion may indicate that a subsequently detected relative motion (or component thereof) was unintentional” in order to “be effectively ignored or isolated from intentional… motion”);
receiving, by the one or more processors, user input data indicative of user input provided by a user via a touchscreen display (para[0089]; para[0104]; para[0111]; para[0113]; para[0122]; in 804 in Fig. 8 a touch is detected by touch sensor 1020 (in Fig. 10) of a touchscreen display of the electronic device, and corresponding input data is received by processor 1002 (in Fig. 10); see also 904 in Fig. 9; see touch inputs in Figs. 1B, 2B, 4B, 6B and 7A);
analyzing, by the one or more processors, the perturbance and the user input to determine whether the user input is likely to be invalid due to the perturbance by comparing the perturbance to one or more preprogrammed perturbance criteria (para[0030]; para[0095]-para[0096]; para[0098]; para[0106]-para[0107]; para[0111]; “Information about these movements may then be used to determine where the user was intending to (or is intending to) touch”; see 808 and 810 in Fig. 8; “in accordance with a determination that a characteristic of the relative motion does not satisfy a threshold condition, a first input location is determined based on a location of the contact”; “Thus, if the relative motion does not satisfy the threshold condition (e.g., a distance moved by the user's finger along a direction parallel to an input surface is less than the threshold distance), the input may be assumed not to have been erroneous, and the input location is determined to be the location of the contact (e.g., no compensation or correction is applied)”; “if the relative motion does satisfy the threshold condition (e.g., a distance moved by the user's finger along a direction parallel to an input surface is equal to or greater than the threshold distance), the input may be assumed to have been erroneous, and a second input location is determined”; see also 908 and 910 in Fig. 9); and
initiating, by the one or more processors, a corrective action in response to a determination that the user input is likely to be invalid due to the perturbance, wherein the corrective action includes at least one of confirming the user input, invalidating the user input, or modifying the user input (para[0030]; para[0084]-para[0086]; para[0096]-para[0099]; para[0107]-para[0111]; “Information about… movements may… be used to determine where the user was intending to (or is intending to) touch, and adjust the location of the touch input accordingly (e.g., by applying a distance offset to a detected touch location)”, as shown in Figs. 2B, 4B, 6B and 7A (claimed ‘modifying the user input’); see 810 in Fig. 8 and 910 in Fig. 9; “if the relative motion does satisfy the threshold condition (e.g., a distance moved by the user's finger along a direction parallel to an input surface is equal to or greater than the threshold distance), the input may be assumed to have been erroneous, and a second input location is determined”; “The second input location may be determined by subtracting a component of absolute motion of the device from the relative motion between the input member and the device”; “input-location correction techniques may be applied if an absolute motion of a device satisfies a threshold condition, or if a relative motion satisfies a threshold condition, or both”; “Determining the corrected input location may include providing the motion of the electronic device and the location of the contact to the selected input-location correction model, and receiving, as output from the selected input-location correction model, the corrected input location”, “by applying a distance offset to the location of a touch contact”; e.g. regarding Figs. 7A-7B, “the device 700 determines that the contact location 708 may not reflect the intended touch target and applies a correction or compensation (e.g., based on relative motion of the device 700 and the input member, as illustrated by arrows 710, 712) to produce the input location 706”; “In this case, because the input location 706 resulted from a compensation technique (e.g., the device motion was large enough to cause the device to apply an offset to the contact location), and the input location 706 corresponds to an end-call button 704, the device 700 displays a confirmatory user interface 714 that includes confirmatory buttons 716 and 718”, “such that the user can more easily select the desired confirmatory button despite any motion of the device 700 and/or the input member (which initially caused the potentially inaccurate input, and which may still be affecting the device 700 and/or the input member)” (claimed ‘confirming the user input’ if button 718 is selected); “The user is therefore afforded an opportunity to confirm or reject the selection of the critical input (e.g., the end-call button)” (claimed ‘invalidating the user input’ if button 716 is selected)).
However, Meyer does not appear to expressly disclose receiving, by one or more processors, sensor data from a sensor system of an aircraft; processing, by the one or more processors, the sensor data to detect and characterize a perturbance of the aircraft, wherein characterizing the perturbance includes determining a frequency and a magnitude of the perturbance; receiving, by the one or more processors, user input data indicative of user input provided by a user via a touchscreen display in communication with the aircraft; and analyzing, by the one or more processors, the perturbance and the user input to determine whether the user input is likely to be invalid due to the perturbance of the aircraft by comparing the frequency and the magnitude of the perturbance to one or more preprogrammed perturbance criteria.
Manfred discloses receiving, by one or more processors, sensor data from a sensor system of an aircraft (para[0023]-para[0025]; para[0028]; para[0036]; see Figs. 1-3; see e.g. a microprocessor or CPU on an Air Data Inertial Reference System ("ADIRS") of an aircraft, “to measure conditions such as… normal accelerations” and “recording and transmitting primary and secondary measurements at an in-flight aircraft”; “Data collection would come from the Inertial Reference Systems flying in commercial airliners and business jets”; “An IRS calculates a normal acceleration component”); processing, by the one or more processors, the sensor data to detect and characterize a perturbance of the aircraft, wherein characterizing the perturbance includes determining a frequency and a magnitude of the perturbance (para[0023]-para[0025]; para[0028]; para[0033]; para[0036]; see Figs. 1-3; “Based on frequency and amplitude, a normal acceleration algorithm could be created to interpret acceleration as turbulence”); receiving, by the one or more processors, user input data indicative of user input provided by a user via a touch device in communication with the aircraft (para[0023]-para[0025]; para[0028]; para[0033]; para[0036]; para[0038]; see Figs. 1-3; “additional information may be provided to the system by a flightcrew”; “For example, a keypad may be provided for a crewmember” and “the keypad would allow one-touch activation”, which “would be considered as primary measurements by the system such as those shown as element(s) 208 in FIG. 2”); and analyzing, by the one or more processors, the perturbance of the aircraft by comparing the frequency and the magnitude of the perturbance to one or more preprogrammed perturbance criteria (para[0023]-para[0025]; para[0028]; para[0033]; para[0036]; para[0038]; see Figs. 1-3; since “turbulence predictions would come from previously experienced turbulence data”, and “Based on frequency and amplitude, a normal acceleration algorithm could be created to interpret acceleration as turbulence”, it is clear that the measured perturbance of the aircraft is analyzed by comparing the frequency and amplitude of the perturbance to one or more sets of previously experienced turbulence data).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Meyer’s invention with the teachings of Manfred’s invention, to have receiving, by one or more processors, sensor data from a sensor system of an aircraft; processing, by the one or more processors, the sensor data to detect and characterize a perturbance of the aircraft, wherein characterizing the perturbance includes determining a frequency and a magnitude of the perturbance; receiving, by the one or more processors, user input data indicative of user input provided by a user via a touchscreen display in communication with the aircraft; and analyzing, by the one or more processors, the perturbance and the user input to determine whether the user input is likely to be invalid due to the perturbance of the aircraft by comparing the frequency and the magnitude of the perturbance to one or more preprogrammed perturbance criteria, for the advantage of an improved system and method for monitoring conditions aboard in-flight aircraft (Manfred, para[0006]).
Regarding claim 2, Meyer and Manfred disclose all the claim limitations as applied above (see claim 1). In addition, Meyer discloses performing the corrective action includes generating, by the one or more processors, a visual prompt on the touchscreen display that requests confirmation of the user input (para[0084]-para[0086]; para[0099]; e.g. regarding Figs. 7A-7B, “the device 700 determines that the contact location 708 may not reflect the intended touch target and applies a correction or compensation (e.g., based on relative motion of the device 700 and the input member, as illustrated by arrows 710, 712) to produce the input location 706”; “In this case, because the input location 706 resulted from a compensation technique (e.g., the device motion was large enough to cause the device to apply an offset to the contact location), and the input location 706 corresponds to an end-call button 704, the device 700 displays a confirmatory user interface 714 that includes confirmatory buttons 716 and 718” (claimed ‘visual prompt’), “such that the user can more easily select the desired confirmatory button despite any motion of the device 700 and/or the input member (which initially caused the potentially inaccurate input, and which may still be affecting the device 700 and/or the input member)”; “The user is therefore afforded an opportunity to confirm or reject the selection of the critical input (e.g., the end-call button)”).
Regarding claim 3, Meyer and Manfred disclose all the claim limitations as applied above (see claim 1). In addition, Meyer discloses performing the corrective action includes modifying, by the one or more processors, the user input based on the perturbance and programmed criteria (para[0030]; para[0096]-para[0098]; para[0107]-para[0110]; “Information about… movements may… be used to determine where the user was intending to (or is intending to) touch, and adjust the location of the touch input accordingly (e.g., by applying a distance offset to a detected touch location)”, as shown in Figs. 2B, 4B, 6B and 7A (claimed ‘modifying the user input’); see 810 in Fig. 8 and 910 in Fig. 9; “if the relative motion does satisfy the threshold condition (e.g., a distance moved by the user's finger along a direction parallel to an input surface is equal to or greater than the threshold distance), the input may be assumed to have been erroneous, and a second input location is determined”; “The second input location may be determined by subtracting a component of absolute motion of the device from the relative motion between the input member and the device”; “input-location correction techniques may be applied if an absolute motion of a device satisfies a threshold condition, or if a relative motion satisfies a threshold condition, or both”; “Determining the corrected input location may include providing the motion of the electronic device and the location of the contact to the selected input-location correction model, and receiving, as output from the selected input-location correction model, the corrected input location”).
Regarding claim 4, Meyer and Manfred disclose all the claim limitations as applied above (see claim 3). In addition, Meyer discloses that modifying the user input is based, at least in part, on a timing of the perturbance relative to the user input (para[0033]; para[0044]; para[0089]-para[0090]; “if the device detected a large motion just prior to detecting a contact (e.g., within a time window before the contact), the device may apply an offset (e.g., a correction) to the detected input location to determine the intended input location”).
Regarding claim 5, Meyer and Manfred disclose all the claim limitations as applied above (see claim 3). In addition, Meyer discloses modifying the user input includes:
processing, by the one or more processors, the user input data to determine an actual location of contact on the touchscreen display (para[0089]; para[0104]; para[0113]; para[0122]; in 804 in Fig. 8 actual touch is detected by touch sensor 1020 (in Fig. 10) of a touchscreen display of the electronic device, and corresponding input data is received by processor 1002 (in Fig. 10); see also 904 in Fig. 9; see touch inputs in Figs. 1B, 2B, 4B, 6B and 7A);
determining, by the one or more processors, an intended location of contact on the touchscreen display based on the perturbance (para[0030]; para[0032]-para[0033]; para[0044]; “Information about the detected motion may then be used to correct the touch input to more accurately capture the user's intended touch target”); and
adjusting, by the one or more processors, the user input from the actual location of contact to the intended location of contact (para[0030]; para[0096]-para[0098]; para[0107]-para[0110]; “Information about… movements may… be used to determine where the user was intending to (or is intending to) touch, and adjust the location of the touch input accordingly (e.g., by applying a distance offset to a detected touch location)”, as shown in Figs. 2B, 4B, 6B and 7A; see 810 in Fig. 8 and 910 in Fig. 9; “if the relative motion does satisfy the threshold condition (e.g., a distance moved by the user's finger along a direction parallel to an input surface is equal to or greater than the threshold distance), the input may be assumed to have been erroneous, and a second input location is determined”; “The second input location may be determined by subtracting a component of absolute motion of the device from the relative motion between the input member and the device”; “input-location correction techniques may be applied if an absolute motion of a device satisfies a threshold condition, or if a relative motion satisfies a threshold condition, or both”; “Determining the corrected input location may include providing the motion of the electronic device and the location of the contact to the selected input-location correction model, and receiving, as output from the selected input-location correction model, the corrected input location”, that is the intended location).
Regarding claim 7, Meyer and Manfred disclose all the claim limitations as applied above (see claim 1). In addition, Meyer discloses processing, by the one or more processors, the user input data to determine that the user input includes a first type of operation or command (para[0036]; para[0045]; “a user may contact (e.g., touch) a touchscreen of a device to provide inputs to the device, and the touchscreen may detect the contact and the location of the contact and take action accordingly”);
determining, by the one or more processors, that the user intended a second type of operation or command based on the perturbance (para[0030]; para[0045]; “conditions such as abrupt movements of the user and/or the device prior to or during detection of a touch input may suggest that the touch input was compromised”, that is, that the user intended a second type of action; “Information about… movements may then be used to determine where the user was intending to (or is intending to) touch, and adjust the location of the touch input accordingly (e.g., by applying a distance offset to a detected touch location)”; e.g. “The device may… determine whether or not to apply an input location correction”); and
changing, by the one or more processors, the user input from the first type of operation or command to the second type of operation or command (para[0005]; para[0009]; para[0030]; para[0044]-para[0045]; e.g. “The device may… determine whether the absolute and/or relative motion that is detected prior to the touch input satisfies a threshold condition to determine whether or not to apply an input location correction”, that is, whether the user intended a second type of action at the second (intended) input location different from the actual touch location; “If the threshold condition is satisfied, the device may apply an input location correction based on the detected absolute and/or relative motion of the device and/or the input member (e.g., because the motion is likely to have caused an erroneous touch location)”; “In accordance with a determination that the second input location corresponds to a location of a virtual button on a display associated with the input surface, the electronic device may perform an action associated with the virtual button”; “The device 100 may then take an action based on the corrected touch location, rather than the location of the contact”).
Regarding claim 10, Meyer discloses a system (see Figs. 1A-7B and 10; para[0030]), comprising:
one or more sensors configured to sense measurable conditions internal or external (see touch sensor 1020, force sensor 1022 and sensing systems 1024 in Fig. 10; note that “the device may detect a usage context corresponding to bicycle travel, walking, jogging, train travel, airplane travel, or the like”; para[0072]; para[0111]; para[0120]; para[0122]-para[0123]); and
a controller in operable communication with the one or more sensors and with a touchscreen display having a graphic user interface that includes selectable icons (see processing unit 1002 in Fig. 10 which may include one or more computer processors or microcontrollers, in communication with touch sensor 1020 e.g. in a touchscreen display (see Figs. 1A-7B), force sensor 1022 and sensing systems 1024; para[0089]; para[0104]; para[0111]; para[0113]; para[0122]), the controller configured to, by one or more processors:
receive sensor data from the one or more sensors (para[0038]-para[0041]; para[0087]; para[0090]; para[0100]; para[0105]; para[0111]; para[0113]; para[0120]-para[0123]; see processors 1002 and sensing system 1024, in Fig. 10 receiving data for “detecting motion (e.g., absolute motion) of the electronic device (e.g., using a motion sensing system of the electronic device)” (see in 806 in Fig. 8; see also 906 in Fig. 9); see e.g. detected motion 113 of device 100, as shown in Figs. 1A-1B, and other examples as shown in Figs. 2A-7B and corresponding paragraphs);
process the sensor data to detect and characterize a perturbance (para[0030]; para[0038]-para[0043]; para[0095]-para[0096]; para[0098]; para[0106]-para[0107]; para[0111]; para[0113]; para[0120]-para[0121]; e.g. according to sensed movements/perturbances data, “certain motion characteristics of a detected motion may indicate that a subsequently detected relative motion (or component thereof) was unintentional” in order to “be effectively ignored or isolated from intentional… motion”);
receive user input data indicative of user input provided by a user via the touchscreen display (para[0089]; para[0104]; para[0111]; para[0113]; para[0122]; in 804 in Fig. 8 a touch is detected by touch sensor 1020 (in Fig. 10) of a touchscreen display of the electronic device, and corresponding input data is received by processor 1002 (in Fig. 10); see also 904 in Fig. 9; see touch inputs in Figs. 1B, 2B, 4B, 6B and 7A);
analyze the perturbance and the user input to determine whether the user input is likely to be invalid due to the perturbance (para[0030]; para[0095]-para[0096]; para[0098]; para[0106]-para[0107]; para[0111]; “Information about these movements may then be used to determine where the user was intending to (or is intending to) touch”; see 808 and 810 in Fig. 8; “in accordance with a determination that a characteristic of the relative motion does not satisfy a threshold condition, a first input location is determined based on a location of the contact”; “Thus, if the relative motion does not satisfy the threshold condition (e.g., a distance moved by the user's finger along a direction parallel to an input surface is less than the threshold distance), the input may be assumed not to have been erroneous, and the input location is determined to be the location of the contact (e.g., no compensation or correction is applied)”; “if the relative motion does satisfy the threshold condition (e.g., a distance moved by the user's finger along a direction parallel to an input surface is equal to or greater than the threshold distance), the input may be assumed to have been erroneous, and a second input location is determined”; see also 908 and 910 in Fig. 9); and
initiate a corrective action in response to a determination that the user input is likely to be invalid due to the perturbance, wherein the corrective action includes at least one of confirming the user input, invalidating the user input, or modifying the user input (para[0030]; para[0084]-para[0086]; para[0096]-para[0099]; para[0107]-para[0111]; “Information about… movements may… be used to determine where the user was intending to (or is intending to) touch, and adjust the location of the touch input accordingly (e.g., by applying a distance offset to a detected touch location)”, as shown in Figs. 2B, 4B, 6B and 7A (claimed ‘modifying the user input’); see 810 in Fig. 8 and 910 in Fig. 9; “if the relative motion does satisfy the threshold condition (e.g., a distance moved by the user's finger along a direction parallel to an input surface is equal to or greater than the threshold distance), the input may be assumed to have been erroneous, and a second input location is determined”; “The second input location may be determined by subtracting a component of absolute motion of the device from the relative motion between the input member and the device”; “input-location correction techniques may be applied if an absolute motion of a device satisfies a threshold condition, or if a relative motion satisfies a threshold condition, or both”; “Determining the corrected input location may include providing the motion of the electronic device and the location of the contact to the selected input-location correction model, and receiving, as output from the selected input-location correction model, the corrected input location”, “by applying a distance offset to the location of a touch contact”; e.g. regarding Figs. 7A-7B, “the device 700 determines that the contact location 708 may not reflect the intended touch target and applies a correction or compensation (e.g., based on relative motion of the device 700 and the input member, as illustrated by arrows 710, 712) to produce the input location 706”; “In this case, because the input location 706 resulted from a compensation technique (e.g., the device motion was large enough to cause the device to apply an offset to the contact location), and the input location 706 corresponds to an end-call button 704, the device 700 displays a confirmatory user interface 714 that includes confirmatory buttons 716 and 718”, “such that the user can more easily select the desired confirmatory button despite any motion of the device 700 and/or the input member (which initially caused the potentially inaccurate input, and which may still be affecting the device 700 and/or the input member)” (claimed ‘confirming the user input’ if button 718 is selected); “The user is therefore afforded an opportunity to confirm or reject the selection of the critical input (e.g., the end-call button)” (claimed ‘invalidating the user input’ if button 716 is selected)).
However, Meyer does not appear to expressly disclose one or more sensors of an aircraft configured to sense measurable conditions internal or external to the aircraft; the controller configured to process the sensor data to detect and characterize a perturbance of the aircraft, wherein characterizing the perturbance includes determining a frequency and a magnitude of the perturbance; and analyze the perturbance and the user input to determine whether the user input is likely to be invalid due to the perturbance of the aircraft by comparing the frequency and magnitude of the perturbance to one or more preprogrammed criteria.
Manfred discloses one or more sensors of an aircraft configured to sense measurable conditions internal or external to the aircraft (para[0023]-para[0025]; para[0028]; para[0036]; see Figs. 1-3; see e.g. a microprocessor or CPU on an Air Data Inertial Reference System ("ADIRS") of an aircraft, “to measure conditions such as… normal accelerations” and “recording and transmitting primary and secondary measurements at an in-flight aircraft”; “the various measuring devices may include a static air temperature gauges, total air temperature probes, air data modules, wind-direction measurement devices, total pressure gauges, static pressure gauges, a relative humidity gauge, and orthogonally positioned accelerometers”; “Data collection would come from the Inertial Reference Systems flying in commercial airliners and business jets”; “An IRS calculates a normal acceleration component”); a controller configured to process sensor data to detect and characterize a perturbance of the aircraft, wherein characterizing the perturbance includes determining a frequency and a magnitude of the perturbance (para[0023]-para[0025]; para[0028]; para[0033]; para[0036]; see Figs. 1-3; the microprocessor or CPU on an Air Data Inertial Reference System ("ADIRS") of the aircraft processes measured conditions and “Based on frequency and amplitude, a normal acceleration algorithm could be created to interpret acceleration as turbulence”); and analyze the perturbance of the aircraft by comparing the frequency and magnitude of the perturbance to one or more preprogrammed criteria (para[0023]-para[0025]; para[0028]; para[0033]; para[0036]; see Figs. 1-3; since “turbulence predictions would come from previously experienced turbulence data”, and “Based on frequency and amplitude, a normal acceleration algorithm could be created to interpret acceleration as turbulence”, it is clear that the measured perturbance of the aircraft is analyzed by comparing the frequency and amplitude of the perturbance to one or more sets of previously experienced turbulence data).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Meyer’s invention with the teachings of Manfred’s invention, to have one or more sensors of an aircraft configured to sense measurable conditions internal or external to the aircraft; the controller configured to process the sensor data to detect and characterize a perturbance of the aircraft, wherein characterizing the perturbance includes determining a frequency and a magnitude of the perturbance; and analyze the perturbance and the user input to determine whether the user input is likely to be invalid due to the perturbance of the aircraft by comparing the frequency and magnitude of the perturbance to one or more preprogrammed criteria, for the advantage of an improved system and method for monitoring conditions aboard in-flight aircraft (Manfred, para[0006]).
Regarding claims 11-14 and 16, these claims are analogous to claims 2-5 and 7, and therefore are rejected for the same reasons as claims 2-5 and 7 above.
Regarding claim 19, Meyer discloses a vehicle (e.g. a vehicle comprising the systems in Figs. 1A-7B and 10; para[0011]; para[0029]-para[0030]; para[0050]; para[0052]; para[0072]; para[0076]; para[0102]; para[0111]), comprising:
one or more sensors configured to sense measurable conditions internal or external to the vehicle (see touch sensor 1020, force sensor 1022 and sensing systems 1024 in Fig. 10; para[0050]-para[0052]; para[0111]; para[0120]; para[0122]-para[0123]);
a controller in operable communication with the one or more sensors and with a touchscreen display having a graphic user interface that includes selectable icons (see processing unit 1002 in Fig. 10 which may include one or more computer processors or microcontrollers, in communication with touch sensor 1020 e.g. in a touchscreen display (see Figs. 1A-7B), force sensor 1022 and sensing systems 1024; para[0089]; para[0104]; para[0111]; para[0113]; para[0122]), the controller configured to, by one or more processors:
receive sensor data from the one or more sensors (para[0038]-para[0041]; para[0087]; para[0090]; para[0100]; para[0105]; para[0111]; para[0113]; para[0120]-para[0123]; see processors 1002 and sensing system 1024, in Fig. 10 receiving data for “detecting motion (e.g., absolute motion) of the electronic device (e.g., using a motion sensing system of the electronic device)” (see in 806 in Fig. 8; see also 906 in Fig. 9); see e.g. detected motion 113 of device 100, as shown in Figs. 1A-1B, and other examples as shown in Figs. 2A-7B and corresponding paragraphs);
process the sensor data to detect and characterize a perturbance of the vehicle (para[0030]; para[0038]-para[0043]; para[0095]-para[0096]; para[0098]; para[0106]-para[0107]; para[0111]; para[0113]; para[0120]-para[0121]; e.g. according to sensed movement/perturbance data, “certain motion characteristics of a detected motion may indicate that a subsequently detected relative motion (or component thereof) was unintentional” in order to “be effectively ignored or isolated from intentional… motion”);
receive user input data indicative of user input performed by a user via the touchscreen display (para[0089]; para[0104]; para[0111]; para[0113]; para[0122]; in 804 in Fig. 8 a touch is detected by touch sensor 1020 (in Fig. 10) of a touchscreen display of the electronic device, and corresponding input data is received by processor 1002 (in Fig. 10); see also 904 in Fig. 9; see touch inputs in Figs. 1B, 2B, 4B, 6B and 7A);
process the user input data to determine whether the user input is likely to be invalid due to the perturbance (para[0030]; para[0095]-para[0096]; para[0098]; para[0106]-para[0107]; para[0111]; “Information about these movements may then be used to determine where the user was intending to (or is intending to) touch”; see 808 and 810 in Fig. 8; “in accordance with a determination that a characteristic of the relative motion does not satisfy a threshold condition, a first input location is determined based on a location of the contact”; “Thus, if the relative motion does not satisfy the threshold condition (e.g., a distance moved by the user's finger along a direction parallel to an input surface is less than the threshold distance), the input may be assumed not to have been erroneous, and the input location is determined to be the location of the contact (e.g., no compensation or correction is applied)”; “if the relative motion does satisfy the threshold condition (e.g., a distance moved by the user's finger along a direction parallel to an input surface is equal to or greater than the threshold distance), the input may be assumed to have been erroneous, and a second input location is determined”; see also 908 and 910 in Fig. 9); and
initiate a corrective action in response to a determination that the user input is likely to be invalid due to the perturbance, wherein the corrective action includes at least one of confirming the user input, invalidating the user input, or modifying the user input (para[0030]; para[0084]-para[0086]; para[0096]-para[0099]; para[0107]-para[0111]; “Information about… movements may… be used to determine where the user was intending to (or is intending to) touch, and adjust the location of the touch input accordingly (e.g., by applying a distance offset to a detected touch location)”, as shown in Figs. 2B, 4B, 6B and 7A (claimed ‘modifying the user input’); see 810 in Fig. 8 and 910 in Fig. 9; “if the relative motion does satisfy the threshold condition (e.g., a distance moved by the user's finger along a direction parallel to an input surface is equal to or greater than the threshold distance), the input may be assumed to have been erroneous, and a second input location is determined”; “The second input location may be determined by subtracting a component of absolute motion of the device from the relative motion between the input member and the device”; “input-location correction techniques may be applied if an absolute motion of a device satisfies a threshold condition, or if a relative motion satisfies a threshold condition, or both”; “Determining the corrected input location may include providing the motion of the electronic device and the location of the contact to the selected input-location correction model, and receiving, as output from the selected input-location correction model, the corrected input location”, “by applying a distance offset to the location of a touch contact”; e.g. regarding Figs. 
7A-7B, “the device 700 determines that the contact location 708 may not reflect the intended touch target and applies a correction or compensation (e.g., based on relative motion of the device 700 and the input member, as illustrated by arrows 710, 712) to produce the input location 706”; “In this case, because the input location 706 resulted from a compensation technique (e.g., the device motion was large enough to cause the device to apply an offset to the contact location), and the input location 706 corresponds to an end-call button 704, the device 700 displays a confirmatory user interface 714 that includes confirmatory buttons 716 and 718”, “such that the user can more easily select the desired confirmatory button despite any motion of the device 700 and/or the input member (which initially caused the potentially inaccurate input, and which may still be affecting the device 700 and/or the input member)” (claimed ‘confirming the user input’ if button 718 is selected); “The user is therefore afforded an opportunity to confirm or reject the selection of the critical input (e.g., the end-call button)” (claimed ‘invalidating the user input’ if button 716 is selected)).
However, Meyer does not appear to expressly disclose characterizing the perturbance includes determining a frequency and a magnitude of the perturbance; and the controller configured to process the user input data to determine whether the user input is likely to be invalid due to the perturbance by comparing the frequency and magnitude of the perturbance to one or more preprogrammed perturbance criteria.
Manfred discloses a controller configured to process sensor data to detect and characterize a perturbance of a vehicle, wherein characterizing the perturbance includes determining a frequency and a magnitude of the perturbance (para[0023]-para[0025]; para[0028]; para[0033]; para[0036]; see Figs. 1-3; the microprocessor or CPU on an Air Data Inertial Reference System ("ADIRS") of the aircraft processes measured conditions and “Based on frequency and amplitude, a normal acceleration algorithm could be created to interpret acceleration as turbulence”); and configured to compare the frequency and magnitude of the perturbance to one or more preprogrammed perturbance criteria (para[0023]-para[0025]; para[0028]; para[0033]; para[0036]; see Figs. 1-3; since “turbulence predictions would come from previously experienced turbulence data”, and “Based on frequency and amplitude, a normal acceleration algorithm could be created to interpret acceleration as turbulence”, it is clear that measured perturbance of the aircraft is analyzed by comparing the frequency and amplitude of the perturbance to one or more previously experienced turbulence data).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention, to combine the teachings in Meyer’s invention, with the teachings in Manfred’s invention, to have characterizing the perturbance includes determining a frequency and a magnitude of the perturbance; and the controller configured to process the user input data to determine whether the user input is likely to be invalid due to the perturbance by comparing the frequency and magnitude of the perturbance to one or more preprogrammed perturbance criteria, for the advantage of an improved system and method for monitoring conditions aboard in-flight aircraft (para[0006]).
Regarding claim 20, Meyer and Manfred disclose all the claim limitations as applied above (see claim 19). In addition, Meyer discloses the controller is configured to, by the one or more processors, perform the corrective action by either:
generating a visual prompt on the touchscreen display that requests confirmation of the user input (para[0084]-para[0086]; para[0099]; e.g. regarding Figs. 7A-7B, “the device 700 determines that the contact location 708 may not reflect the intended touch target and applies a correction or compensation (e.g., based on relative motion of the device 700 and the input member, as illustrated by arrows 710, 712) to produce the input location 706”; “In this case, because the input location 706 resulted from a compensation technique (e.g., the device motion was large enough to cause the device to apply an offset to the contact location), and the input location 706 corresponds to an end-call button 704, the device 700 displays a confirmatory user interface 714 that includes confirmatory buttons 716 and 718” (claimed ‘visual prompt’), “such that the user can more easily select the desired confirmatory button despite any motion of the device 700 and/or the input member (which initially caused the potentially inaccurate input, and which may still be affecting the device 700 and/or the input member)”; “The user is therefore afforded an opportunity to confirm or reject the selection of the critical input (e.g., the end-call button)”); or
processing the user input data to determine an actual location of contact on the touchscreen display (para[0089]; para[0104]; para[0113]; para[0122]; in 804 in Fig. 8 an actual touch is detected by touch sensor 1020 (in Fig. 10) of a touchscreen display of the electronic device, and corresponding input data is received by processor 1002 (in Fig. 10); see also 904 in Fig. 9; see touch inputs in Figs. 1B, 2B, 4B, 6B and 7A);
determining an intended location of contact on the touchscreen display based on the perturbance (para[0030]; para[0032]-para[0033]; para[0044]; “Information about the detected motion may then be used to correct the touch input to more accurately capture the user's intended touch target”); and
adjusting the user input from the actual location of contact to the intended location of contact (para[0030]; para[0096]-para[0098]; para[0107]-para[0110]; “Information about… movements may… be used to determine where the user was intending to (or is intending to) touch, and adjust the location of the touch input accordingly (e.g., by applying a distance offset to a detected touch location)”, as shown in Figs. 2B, 4B, 6B and 7A; see 810 in Fig. 8 and 910 in Fig. 9; “if the relative motion does satisfy the threshold condition (e.g., a distance moved by the user's finger along a direction parallel to an input surface is equal to or greater than the threshold distance), the input may be assumed to have been erroneous, and a second input location is determined”; “The second input location may be determined by subtracting a component of absolute motion of the device from the relative motion between the input member and the device”; “input-location correction techniques may be applied if an absolute motion of a device satisfies a threshold condition, or if a relative motion satisfies a threshold condition, or both”; “Determining the corrected input location may include providing the motion of the electronic device and the location of the contact to the selected input-location correction model, and receiving, as output from the selected input-location correction model, the corrected input location”, that is the intended location).
Claim(s) 6 and 15 is/are rejected under 35 U.S.C. 103 as being unpatentable over Meyer et al. (US 2023/0086516), in view of Manfred et al. (US 2005/0278120), and further in view of Fagan et al. (US 2020/0148366).
Regarding claim 6, Meyer and Manfred disclose all the claim limitations as applied above (see claim 3). In addition, Meyer discloses the programmed criteria is configured to consider a reach distance between a fixed physical location of the touchscreen display and the user (para[0057]-para[0059]; para[0107]-para[0110]; e.g. as shown in Fig. 4A, “The touch sensor may also be configured to detect the presence of the input member when it is in proximity to but not in contact with the touch-sensitive input surface 402”; “Parameters such as the magnitude of the change in capacitance may be used to determine factors such as the location of the input member 404, the distance of the input member 404 from the touch-sensitive input surface 402, and the like”; “as shown in FIG. 4A, the intended touch location 408 may be the location on the touch-sensitive input surface 402 that is perpendicularly below the input member at a certain time before a contact is detected and/or when the input member 404 is a certain distance away from the touch-sensitive input surface 402”, the intended touch location 408 being the claimed fixed physical location, based on the broadest reasonable interpretation of the claimed limitations; “FIG. 4B illustrates how the actual location of the contact 410 may differ from the intended touch location 408 due to motion of the device 400 and/or the input member 404”; “Determining the corrected input location may include providing the motion of the electronic device and the location of the contact to the selected input-location correction model, and receiving, as output from the selected input-location correction model, the corrected input location” at the intended touch location based on the distance of the input member from the touch-sensitive input surface, as discussed in regard to Figs. 4A-4B).
However, Meyer and Manfred do not appear to expressly disclose detecting a seat location of the user with the sensor system, wherein the reach distance is a predetermined distance based on the seat location of the user.
Fagan discloses detecting a seat location of a user with a sensor system (para[0027]; para[0029]; para[0039]; regarding Figs. 1 and 11-12, “Seat 100 includes a network of proximity and position sensors 158, which includes a plurality of sensors used to determine a position of seat 100”; “The user may be an occupant of seat 100 (e.g., a passenger or crew member), maintenance personnel, or other aircraft operator/manager”), wherein a reach distance between a fixed physical location and the user is a predetermined distance based on the seat location of the user (para[0027]; para[0029]; para[0039]; regarding Figs. 11-12, “Seat 100 includes a network of proximity and position sensors 158, which includes a plurality of sensors used to determine a position of seat 100 and to determine proximity of seat 100 to nearby components of the aircraft”; accordingly, it is clear that the proximity between nearby components of the aircraft and the passenger or crew member is determined based on the seat location of the passenger or crew member).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention, to combine the teachings in Meyer’s and Manfred’s combination, with the teachings in Fagan’s invention, to have detecting a seat location of the user with the sensor system, wherein the reach distance is a predetermined distance based on the seat location of the user, for the advantage of determining locations of obstructions, both fixed and dynamic, and to ensure that seat movements commanded by the user will not result in unwanted contact between the seat and another object (para[0039]).
Regarding claim 15, it is analogous to claim 6, and therefore it is rejected for the same reasons as claim 6 above.
Claim(s) 8 and 17 is/are rejected under 35 U.S.C. 103 as being unpatentable over Meyer et al. (US 2023/0086516), in view of Manfred et al. (US 2005/0278120), and further in view of Whitlow et al. (US 2011/0187651).
Regarding claim 8, Meyer and Manfred disclose all the claim limitations as applied above (see claim 1). However, Meyer and Manfred do not appear to expressly disclose processing, by the one or more processors, the user input data to determine that the user input includes a swipe operation or command or a double-tap operation or command; determining, by the one or more processors, that the user intended a single-tap operation or command based on the perturbance; and changing, by the one or more processors, the user input from the swipe operation or command or the double-tap operation or command to the single-tap operation or command.
Whitlow discloses processing, by one or more processors, user input data to determine that a user input includes a swipe operation or command or a double-tap operation or command (para[0019]; para[0026]-para[0027]; para[0030]; para[0032]; see Figs. 1-4; “Vibrations may cause a flight crew member to inadvertently double touch (tap) a desired region 204”); determining, by the one or more processors, that the user intended a single-tap operation or command based on a perturbance (para[0019]; para[0026]-para[0027]; para[0030]; see Figs. 1-4; “Vibrations may cause a flight crew member to inadvertently double touch (tap) a desired region 204, when only a single touch is desired”); and changing, by the one or more processors, the user input from the swipe operation or command or the double-tap operation or command to the single-tap operation or command (para[0019]; para[0026]-para[0027]; para[0030]; see Figs. 1-4; “when movement greater than a threshold is sensed by the motion sensing device 120, routines in the processor 104 may adjust the period of time between touches for registering the touch as an input”; accordingly, “Such inadvertent double taps are prevented by increasing the time between which valid inputs are registered by the touch panel”, in order to register the intended single touch).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention, to combine the teachings in Meyer’s and Manfred’s combination, with the teachings in Whitlow’s invention, to have processing, by the one or more processors, the user input data to determine that the user input includes a swipe operation or command or a double-tap operation or command; determining, by the one or more processors, that the user intended a single-tap operation or command based on the perturbance; and changing, by the one or more processors, the user input from the swipe operation or command or the double-tap operation or command to the single-tap operation or command, for the advantage of providing a touch screen whose input is adaptive to movement caused by turbulence, G forces, and/or equipment vibrations, to improve input accuracy (para[0004]-para[0007]; para[0034]).
Regarding claim 17, it is analogous to claim 8, and therefore it is rejected for the same reasons as claim 8 above.
Claims 9 and 18 are rejected under 35 U.S.C. 103 as being unpatentable over Meyer et al. (US 2023/0086516), in view of Manfred et al. (US 2005/0278120), and further in view of Kawalkar (US 2013/0194193).
Regarding claim 9, Meyer and Manfred disclose all the claim limitations as applied above (see claim 1). However, Meyer and Manfred do not appear to expressly disclose determining, by the one or more processors, that the user input includes a first dragging operation wherein the user performed a sliding contact across the touchscreen display; receiving, by the one or more processors, additional user input data subsequent to receiving the user input data that is indicative of a second dragging operation; determining, by the one or more processors, that an unintended interruption occurred between receiving the user input data and receiving the additional user input data based on the perturbance; and combining, by the one or more processors, the user input data and the additional user input data to generate a continuous, uninterrupted dragging operation.
Kawalkar discloses determining, by one or more processors, that user input includes a first dragging operation wherein the user performed a sliding contact across a touchscreen display (para[0007]; para[0028]-para[0035]; para[0037]-para[0038]; para[0043]-para[0045]; see Figs. 1, 4 and 12-15; e.g. “adaptive gesture corrector 112 receives… input gesture signals” corresponding e.g. to a first drag/slide action across touch sensitive region 108 represented by e.g. a first line segment, as shown in Fig. 13); receiving, by the one or more processors, additional user input data subsequent to receiving user input data that is indicative of a second dragging operation (para[0007]; para[0028]-para[0035]; para[0037]-para[0038]; para[0043]-para[0045]; see Figs. 1, 4 and 12-15; see e.g. “consecutive gesture segments… received within an acceptable interval of time”, indicative of a second drag/slide action across touch sensitive region 108, as shown in Fig. 13); determining, by the one or more processors, that an unintended interruption occurred between receiving the user input data and receiving the additional user input data based on perturbance (para[0007]; para[0028]-para[0035]; para[0037]-para[0038]; para[0043]-para[0045]; see Figs. 1, 4 and 12-15; “instabilities in the vehicle platform can cause missing gesture profile segments”; “consecutive gesture segments are received within an acceptable interval of time for these segments to be valid portions of an input gesture profile (1202)”, thus assuming that an unintended interruption occurred between the input of the two segments, based on instability of the touch screen display 102 as detected by instability detector 106, due to e.g. “turbulence, rough roads, or rough seas”); and combining, by the one or more processors, the user input data and the additional user input data to generate a continuous, uninterrupted dragging operation (para[0007]; para[0028]-para[0035]; para[0037]-para[0038]; para[0043]-para[0045]; see Figs. 
1, 4 and 12-15; “individual line segments of the input gesture are extended in both directions (1212), the intersection points of the extended line segments are computed (1214), and the line segments beyond these intersection points are clipped (1216)”; “The resultant gesture profile is thus generated (1218), and is continuous with all missing segments filled in with the extrapolated gesture segments”).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention, to combine the teachings in Meyer’s and Manfred’s combination, with the teachings in Kawalkar’s invention, to have determining, by the one or more processors, that the user input includes a first dragging operation wherein the user performed a sliding contact across the touchscreen display; receiving, by the one or more processors, additional user input data subsequent to receiving the user input data that is indicative of a second dragging operation; determining, by the one or more processors, that an unintended interruption occurred between receiving the user input data and receiving the additional user input data based on the perturbance; and combining, by the one or more processors, the user input data and the additional user input data to generate a continuous, uninterrupted dragging operation, for the advantage of correcting irregularities and discontinuities in gesture-based input commands supplied to a gesture-based touch screen user interface that does not rely on replicating abnormal environments and operating situations and/or training based on numerous permutations and combinations (para[0006]).
Regarding claim 18, it is analogous to claim 9, and therefore it is rejected for the same reasons as claim 9 above.
Response to Arguments
Applicant’s arguments with respect to claim(s) 1, 10 and 19 have been considered but are moot because the new ground of rejection does not rely on any reference as applied in the prior rejection of record for any teaching or matter specifically challenged in the argument. Newly added limitations have now been treated on the merits, and the above rejection has been amended in the same fashion as the amended claims.
Conclusion
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to GLORYVID FIGUEROA-GIBSON whose telephone number is (571)272-5506. The examiner can normally be reached 9am-5pm, Monday-Friday, Eastern Time.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Nitin Patel can be reached on 571-272-7677. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/GLORYVID FIGUEROA-GIBSON/Patent Examiner, Art Unit 2628
/NITIN PATEL/Supervisory Patent Examiner, Art Unit 2628