Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Continued Examination Under 37 CFR 1.114
A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 01/08/2026 has been entered.
Claim Status
Claims 1, 3-6, and 8-10 are pending; claims 1 and 6 are independent. Claims 2 and 7 have been cancelled.
Response to Arguments
Applicant's arguments filed 01/08/2026 have been fully considered but they are not persuasive.
In response to applicant’s argument that Takada, Pieree, and Kim, taken individually or in combination, fail to teach or suggest the inventive features of the present claims, including “aligning, by the processor, the at least one widget or menu icon based on a manipulating position of an object or a detected position of the object, in response to the object approaching the display screen”, as called for by claims 1 and 6:
The examiner respectfully disagrees. Takada teaches, in Para 0035, that the vehicle-mounted device control unit 102 stores a data table that defines the relation among each of these distances, the voltage value output from the sensing unit 103, and the type of the infrared light distance sensor that detects the user's hand. Based on this data table and the voltage actually output from the sensing unit 103, the vehicle-mounted device control unit 102 identifies in which region, region 1 to region 3, the user's hand is present. Further, in figs 3-7, S302 and Para 0045, the sensing unit 103 monitors whether the user's hand is detected in region 1, such as the one shown in FIG. 4. If the user's hand is detected in region 1 (S302: “Is hand detected in region 1?” Yes), as shown in FIG. 5, then, in Para 0046, the vehicle-mounted device control unit 102 performs control for the display unit 112 to move a predetermined icon, displayed by the display unit 112, to the right side, that is, to the driver's side, in such a way that the NAVI button shown in FIG. 5 is moved (S304: “Move predetermined icon to predetermined position”) based on the detection of the user's hand in region 1.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1, 3-6, and 8-10 are rejected under 35 U.S.C. 103 as being unpatentable over Takada (US 2018/0232057), in view of Pieree (US 2017/0090695), in view of Kim (US 2014/0019910), hereinafter Kim, and further in view of Kim (US 2009/0237372), hereinafter “Kim72”.
Regarding claim 1, Takada teaches a display method using a position detection apparatus, (fig. 3 and fig. 10), the method comprising:
displaying, by a processor, at least one widget or menu icon to a length corresponding to at least a portion of an area of a display screen in one direction (fig. 4 and Para 0044, wherein the NAVI icon and the AV icon are displayed on a display unit 112);
aligning, by the processor, the at least one widget or menu icon based on a manipulating position of an object or a detected position of the object, in response to the object approaching the display screen (fig. 3, S302 and Para 0045, wherein the sensing unit 103 monitors whether the user's hand is detected in region 1 such as the one shown in FIG. 4; if the user's hand is detected in region 1 (S302: “Is hand detected in region 1?” Yes) as shown in FIG. 5, then, in Para 0046, the vehicle-mounted device control unit 102 performs control for the display unit 112 to move a predetermined icon, displayed by the display unit 112, to the right side, that is, to the driver's side, in such a way that the NAVI button shown in FIG. 5 is moved (S304: “Move predetermined icon to predetermined position”)); and
activating at least one of the at least one widget or menu icon based on the detected position of the object (fig. 3, S308, S309, S310 and Paras 0048-0049, wherein, if the sensing unit 103 detects that the user's hand is present in region 1 (S305: “Is user's hand present in region 1?” Yes) and that the user's hand is present in region 2 in FIG. 6 (S308: “Is user's hand present in region 2 or region 3?” region 2), the vehicle-mounted device control unit 102 performs control for the display unit 112 to display the lower-level menu of the NAVI icon in a fan-like manner (S309: “Expand menu of predetermined icon”) as shown in FIG. 6);
wherein the displaying includes displaying, by the processor, detailed information corresponding to an activated menu icon on the display screen, in response to an approach distance of the object to the display screen being within a critical distance (fig. 3, S309 and Para 0048, wherein, if the sensing unit 103 detects that the user's hand is present in region 1 (S305: “Is user's hand present in region 1?” Yes) and that the user's hand is present in region 2 in FIG. 6 (S308: “Is user's hand present in region 2 or region 3?” region 2), the vehicle-mounted device control unit 102 performs control for the display unit 112 to display the lower-level menu of the NAVI icon in a fan-like manner (S309: “Expand menu of predetermined icon”) as shown in FIGS. 6 and 7).
Takada apparently does not explicitly teach the italicized portions of:
displaying, by a processor, at least one widget or menu icon to a length corresponding to at least a portion of an area of a display screen in one direction based on information about a size of the display screen.
However, Pieree discloses “displaying, by a processor, at least one widget or menu icon… based on information about a size of the display screen”, see Para 0026.
It would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to have modified the display method of Takada by applying the teaching of Pieree to adjust the displayed content based on a size of the display screen, as a known technique to yield a predictable result.
Takada in view of Pieree does not expressly disclose wherein the manipulating position of the object is determined based on a shape of the object.
However, Kim discloses wherein the manipulating position of the object is determined based on a shape of the object, see fig. 6 and Paras 0087-0090.
It would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to have modified the display method of Takada in view of Pieree by applying the teaching of Kim, wherein a decision whether to move to a higher- or lower-level folder, or to a previous or next folder on the same level, can be determined according to the direction (horizontal or vertical) of the gesture or the shape of the gesture (e.g., a shape of the hand indicating a certain number), as a known technique to yield a predictable result.
Takada in view of Pieree and in view of Kim does not expressly disclose wherein the alignment of the at least one widget or menu icon is dynamically adjusted based on the position of the object within a predefined region of the display screen.
However, “Kim72” discloses “wherein the alignment of the at least one widget or menu icon is dynamically adjusted based on the position of the object within a predefined region of the display screen”, see figs 10a-10e and Paras 0100-0105.
It would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to have modified the display method of Takada in view of Pieree and in view of Kim by applying the teaching of “Kim72”, wherein, when an object or a finger approaches a preset position on the touch screen displaying a web page, the touch screen senses the proximity touch input by the finger and transmits an input signal to the controller in response to the proximity touch input; if the proximity touch operation continues for a preset period of time, the controller determines that the input signal is a proximity touch operation for inputting a user command and displays the control window at a predetermined area on the main screen of the touch screen, as shown in FIG. 10c; and once the finger deviates from the touch recognition effective distance, the upper area of the web page is fixedly displayed again on the touch screen as displayed initially, as a known technique to yield a predictable result.
Regarding claim 3, Takada in view of Pieree, in view of Kim, and in view of “Kim72” teaches the display method according to claim 1, further comprising realigning, by the processor, the at least one widget based on the detected position of the object, in response to the object deviating from a region corresponding to a widget display range of the display screen (fig. 3, S310 and Para 0049, wherein, if the sensing unit 103 detects that the user's hand is present in region 3 in FIG. 7 (S308: “Is user's hand present in region 2 or region 3?” region 3), the vehicle-mounted device control unit 102 performs control for the display unit 112 to display the lower-level menu of the AV icon in a fan-like manner (S310: “Expand menu of predetermined icon”) as shown in FIG. 7, Takada).
Regarding claim 4, Takada in view of Pieree, in view of Kim, and in view of “Kim72” teaches the display method according to claim 1, wherein the manipulating position of the object is set in advance in a memory or a storage device, is further determined based on a position at which the object is first detected, or is determined based on a stretching direction of the object (Para 0035, Takada).
Regarding claim 5, Takada in view of Pieree, in view of Kim, and in view of “Kim72” teaches the display method according to claim 1, wherein, in the displaying of the at least one widget or menu icon, the at least one widget or menu icon is displayed based on the size of the display screen and information about a screen mode, the size of the display screen is one of sizes corresponding to a small-sized screen, a middle-sized screen, and a large-sized screen, and the screen mode is one of a full screen mode or a split screen mode (Para 0026, Pieree).
Regarding claim 6, Takada teaches a position detection apparatus (fig. 1) comprising:
a display configured to display information on a display screen (figs 1-4, a display unit 112 and Para 0042, wherein the display unit 112 is a device that presents video information to the user and includes a display such as an LCD);
a linear infrared (IR) sensor configured to detect an object configured to approach the display (fig. 2 and Paras 0031-0032); and
a processor configured to:
display at least one widget or menu icon to a length corresponding to at least a portion of an area of the display screen in one direction (fig. 4 and Para 0044, wherein the NAVI icon and the AV icon are displayed on a display unit 112),
align the at least one widget or menu icon based on a manipulating position of the object or a detected position of the object, based on that the object approaches the display screen (fig. 3, S302 and Para 0045, wherein the sensing unit 103 monitors whether the user's hand is detected in region 1 such as the one shown in FIG. 4; if the user's hand is detected in region 1 (S302: “Is hand detected in region 1?” Yes) as shown in FIG. 5, then, in Para 0046, the vehicle-mounted device control unit 102 performs control for the display unit 112 to move a predetermined icon, displayed by the display unit 112, to the right side, that is, to the driver's side, in such a way that the NAVI button shown in FIG. 5 is moved (S304: “Move predetermined icon to predetermined position”)), and
activate at least one of the at least one widget or menu icon based on the detected position of the object (fig. 3, S308, S309, S310 and Paras 0048-0049, wherein, if the sensing unit 103 detects that the user's hand is present in region 1 (S305: “Is user's hand present in region 1?” Yes) and that the user's hand is present in region 2 in FIG. 6 (S308: “Is user's hand present in region 2 or region 3?” region 2), the vehicle-mounted device control unit 102 performs control for the display unit 112 to display the lower-level menu of the NAVI icon in a fan-like manner (S309: “Expand menu of predetermined icon”) as shown in FIG. 6);
wherein the processor further displays detailed information corresponding to an activated menu icon on the display screen, based on that an approach distance of the object to the display screen is within a critical distance (fig. 3, S309 and Para 0048, wherein, if the sensing unit 103 detects that the user's hand is present in region 1 (S305: “Is user's hand present in region 1?” Yes) and that the user's hand is present in region 2 in FIG. 6 (S308: “Is user's hand present in region 2 or region 3?” region 2), the vehicle-mounted device control unit 102 performs control for the display unit 112 to display the lower-level menu of the NAVI icon in a fan-like manner (S309: “Expand menu of predetermined icon”) as shown in FIGS. 6 and 7).
Takada apparently does not explicitly teach the italicized portions of:
display at least one widget or menu icon to a length corresponding to at least a portion of an area of the display screen in one direction based on information about a size of the display screen.
However, Pieree discloses “displaying, by a processor, at least one widget or menu icon… based on information about a size of the display screen”, see Para 0026.
It would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to have modified the position detection apparatus of Takada by applying the teaching of Pieree to adjust the displayed content based on a size of the display screen, as a known technique to yield a predictable result.
Takada in view of Pieree does not expressly disclose wherein the manipulating position of the object is determined based on a shape of the object.
However, Kim discloses wherein the manipulating position of the object is determined based on a shape of the object, see fig. 6 and Paras 0087-0090.
It would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to have modified the position detection apparatus of Takada in view of Pieree by applying the teaching of Kim, wherein a decision whether to move to a higher- or lower-level folder, or to a previous or next folder on the same level, can be determined according to the direction (horizontal or vertical) of the gesture or the shape of the gesture (e.g., a shape of the hand indicating a certain number), as a known technique to yield a predictable result.
Takada in view of Pieree and in view of Kim does not expressly disclose wherein the alignment of the at least one widget or menu icon is dynamically adjusted based on the position of the object within a predefined region of the display screen.
However, “Kim72” discloses “wherein the alignment of the at least one widget or menu icon is dynamically adjusted based on the position of the object within a predefined region of the display screen”, see figs 10a-10e and Paras 0100-0105.
It would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to have modified the position detection apparatus of Takada in view of Pieree and in view of Kim by applying the teaching of “Kim72”, wherein, when an object or a finger approaches a preset position on the touch screen displaying a web page, the touch screen senses the proximity touch input by the finger and transmits an input signal to the controller in response to the proximity touch input; if the proximity touch operation continues for a preset period of time, the controller determines that the input signal is a proximity touch operation for inputting a user command and displays the control window at a predetermined area on the main screen of the touch screen, as shown in FIG. 10c; and once the finger deviates from the touch recognition effective distance, the upper area of the web page is fixedly displayed again on the touch screen as displayed initially, as a known technique to yield a predictable result.
Regarding claim 8, Takada in view of Pieree, in view of Kim, and in view of “Kim72” teaches the position detection apparatus according to claim 6, wherein the processor realigns the at least one widget based on the detected position of the object, when the object deviates from a region corresponding to a widget display range of the display screen (fig. 3, S310 and Para 0049, wherein, if the sensing unit 103 detects that the user's hand is present in region 3 in FIG. 7 (S308: “Is user's hand present in region 2 or region 3?” region 3), the vehicle-mounted device control unit 102 performs control for the display unit 112 to display the lower-level menu of the AV icon in a fan-like manner (S310: “Expand menu of predetermined icon”) as shown in FIG. 7, Takada).
Regarding claim 9, Takada in view of Pieree, in view of Kim, and in view of “Kim72” teaches the position detection apparatus according to claim 6, wherein the manipulating position of the object is set in advance in a memory or a storage device, is further determined based on a position at which the object is first detected, or is determined based on a stretching direction of the object (Para 0035, Takada).
Regarding claim 10, Takada in view of Pieree, in view of Kim, and in view of “Kim72” teaches the position detection apparatus according to claim 6, wherein the processor displays the at least one widget or menu icon based on the size of the display screen and information about a screen mode, the size of the display screen is one of sizes corresponding to a small-sized screen, a middle-sized screen, and a large-sized screen, and the screen mode is one of a full screen mode or a split screen mode (Para 0026, Pieree).
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
Cho (US 2018/0217670), relates to a vehicle display device and a vehicle including the vehicle display device.
Ward (US 2016/0202871), relates to an apparatus and method for controlling zooming of a display, and particularly a touch screen display, by using proximity detection.
Hsu (US 2012/0192111), relates to a user interface of an electronic device, and more particularly to a method for varying the icon sizes of menu icons displayed in the user interface, and the electronic device using the same.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to SAIFELDIN E ELNAFIA whose telephone number is (571) 270-5852. The examiner can normally be reached 9-5.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, WILLIAM BODDIE can be reached at (571) 272-0666. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/S.E.E/Examiner, Art Unit 2625 2/25/2026
/WILLIAM BODDIE/Supervisory Patent Examiner, Art Unit 2625