DETAILED ACTION
This Office Action is in response to Applicant's Application filed on 11/12/2024.
Claims 1-8 are pending for examination.
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Information Disclosure Statement
The information disclosure statements (IDS) submitted on 11/12/2024 and 1/15/2026 are in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statements are being considered by the examiner.
Claim Rejections - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claims 1-8 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or, for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention.
The claims are generally narrative and indefinite, failing to conform with current U.S. practice. They appear to be a literal translation into English from a foreign document and are replete with grammatical and idiomatic errors.
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claims 1-3 and 8 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by YUICHI (JP2010032546A).
Regarding claim 1, YUICHI teaches an information processing device, comprising:
a detection unit that detects a touch operation on a screen (YUICHI: Para 24 “the CPU 20 of the PND 1 receives a touch operation by the user's fingertip on the touch panel 25 of the monitor 3 through the input information processing unit 20A, and calculates the display position of the map image on the LCD 24 according to the touch operation”);
an operation area display control unit that displays a first operation area on the screen when a first touch operation of sliding a touch position on the screen is detected by the detection unit (YUICHI: Fig. 5 NG4; Para 33 “when the CPU 20 of the PND 1 recognizes that a drag scroll operation has been performed in which an arbitrary point PO2 on the navigation map image NG3 displayed on the LCD 24 is touched with a fingertip and then traced across the screen, the CPU 20 is able to execute a drag scroll process in which the map on the LCD 24 is gradually moved while being displayed by sequentially displaying navigation map images NG4 and NG5 in which the map display area has been moved in a direction along the trajectory of the drag scroll operation along with the drag scroll operation”); and
an object display control unit that, when a second touch operation in the first operation area is detected by the detection unit, moves an object to be displayed on the screen and that is subject to display control by sliding a predetermined distance toward a starting point side of the touch position of the first touch operation (YUICHI: Fig. 5 NG5; Para 32 “Drag Scroll Processing As shown in FIG. 5, when a navigation map image NG3 is displayed on the LCD 24 of the monitor 3, and the user touches an arbitrary point PO2 on the screen of the LCD 24 with their fingertip, and then traces the fingertip to points PO3 and PO4, the CPU 20 of the PND 1 displays navigation map images NG4 and NG5 in which the map is sequentially moved from point PO2 to point PO4 in accordance with the trajectory of the drag scroll operation.”; Para 33 “when the CPU 20 of the PND 1 recognizes that a drag scroll operation has been performed in which an arbitrary point PO2 on the navigation map image NG3 displayed on the LCD 24 is touched with a fingertip and then traced across the screen, the CPU 20 is able to execute a drag scroll process in which the map on the LCD 24 is gradually moved while being displayed by sequentially displaying navigation map images NG4 and NG5 in which the map display area has been moved in a direction along the trajectory of the drag scroll operation along with the drag scroll operation”), wherein
the operation area display control unit displays the first operation area at a position between the starting point and one side of the screen intersecting with a straight line extending in a sliding direction of the first touch operation from the starting point, the position being away from the one side (YUICHI: Fig. 5 NG5; Para 32 “Drag Scroll Processing As shown in FIG. 5, when a navigation map image NG3 is displayed on the LCD 24 of the monitor 3, and the user touches an arbitrary point PO2 on the screen of the LCD 24 with their fingertip, and then traces the fingertip to points PO3 and PO4, the CPU 20 of the PND 1 displays navigation map images NG4 and NG5 in which the map is sequentially moved from point PO2 to point PO4 in accordance with the trajectory of the drag scroll operation.”; Para 33 “when the CPU 20 of the PND 1 recognizes that a drag scroll operation has been performed in which an arbitrary point PO2 on the navigation map image NG3 displayed on the LCD 24 is touched with a fingertip and then traced across the screen, the CPU 20 is able to execute a drag scroll process in which the map on the LCD 24 is gradually moved while being displayed by sequentially displaying navigation map images NG4 and NG5 in which the map display area has been moved in a direction along the trajectory of the drag scroll operation along with the drag scroll operation”).
Regarding claim 2, YUICHI teaches the information processing device according to claim 1, wherein the detection unit detects an operation of sliding a touch position in the first touch operation into the first operation area as the second touch operation (YUICHI: Fig. 5 NG5; Para 32 “Drag Scroll Processing As shown in FIG. 5, when a navigation map image NG3 is displayed on the LCD 24 of the monitor 3, and the user touches an arbitrary point PO2 on the screen of the LCD 24 with their fingertip, and then traces the fingertip to points PO3 and PO4, the CPU 20 of the PND 1 displays navigation map images NG4 and NG5 in which the map is sequentially moved from point PO2 to point PO4 in accordance with the trajectory of the drag scroll operation”).
Regarding claim 3, YUICHI teaches the information processing device according to claim 1, wherein the operation area display control unit displays the first operation area on the starting point side of a midpoint between the starting point and the one side of the screen (YUICHI: Fig. 5 NG3; Para 32 “Drag Scroll Processing As shown in FIG. 5, when a navigation map image NG3 is displayed on the LCD 24 of the monitor 3, and the user touches an arbitrary point PO2 on the screen of the LCD 24 with their fingertip, and then traces the fingertip to points PO3 and PO4, the CPU 20 of the PND 1 displays navigation map images NG4 and NG5 in which the map is sequentially moved from point PO2 to point PO4 in accordance with the trajectory of the drag scroll operation”; i.e., the left side of the screen would encompass the first operation area).
Regarding claim 8, it recites a non-transitory, computer-readable recording medium having stored thereon a computer program that, when executed by an electronic processor of an information processing device, configures the information processing device to execute a process having limitations similar to those of claim 1, and it is therefore rejected on the same basis. YUICHI teaches a non-transitory, computer-readable recording medium (YUICHI: Para 14 “The PND 1 also reads out basic programs and various application programs stored in a nonvolatile memory 21 and executes them on a RAM (Random Access Memory) 22, thereby realizing normal navigation functions and a map scrolling processing function described later.”).
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 4-5 are rejected under 35 U.S.C. 103 as being unpatentable over YUICHI (JP2010032546A) in view of ENDO (US20120206481A1).
Regarding claim 4, YUICHI teaches the information processing device according to claim 1, wherein the operation area display control unit displays a second operation area in a position adjacent to the first operation area (YUICHI: Fig. 5 NG5; Para 32 “Drag Scroll Processing As shown in FIG. 5, when a navigation map image NG3 is displayed on the LCD 24 of the monitor 3, and the user touches an arbitrary point PO2 on the screen of the LCD 24 with their fingertip, and then traces the fingertip to points PO3 and PO4, the CPU 20 of the PND 1 displays navigation map images NG4 and NG5 in which the map is sequentially moved from point PO2 to point PO4 in accordance with the trajectory of the drag scroll operation”; i.e., the right side of the screen would encompass the second operation area).
YUICHI does not explicitly teach that the object display control unit enlarges a display of the object when the detection unit detects a third touch operation in the second operation area.
However, in the same field of endeavor, ENDO teaches the object display control unit enlarging a display of the object when the detection unit detects a third touch operation in the second operation area (ENDO: Fig. 3A-3D; Para 78 “The dragging operation is an operation wherein the operator touches the display unit 5 with a finger, and slides this touched finger in the upper direction while in the state of touching the display unit 5. The control unit 15 detects the distance between the position wherein the finger of the operator is touching the display unit 5 which serves as a touch panel and the position wherein the finger that has been sliding is removed from the display unit 5 at the touching position (the distance of the series of dragging operations), and subjects the still image displayed on the display unit 5 to enlarging operation according to the dragging operation distance and displays this”).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the information processing device of YUICHI with the feature wherein the object display control unit enlarges a display of the object when the detection unit detects a third touch operation in the second operation area, as disclosed by ENDO. One would be motivated to do so for the benefit that “desired objects can be speedily searched from a great number of objects” (ENDO: Para 15).
Regarding claim 5, the combination of YUICHI and ENDO teaches the information processing device according to claim 4, and YUICHI further teaches wherein the detection unit detects an operation of sliding a touch position in the second touch operation from the first operation area to the second operation area as the third touch operation (YUICHI: Fig. 5 NG5; Para 32 “Drag Scroll Processing As shown in FIG. 5, when a navigation map image NG3 is displayed on the LCD 24 of the monitor 3, and the user touches an arbitrary point PO2 on the screen of the LCD 24 with their fingertip, and then traces the fingertip to points PO3 and PO4, the CPU 20 of the PND 1 displays navigation map images NG4 and NG5 in which the map is sequentially moved from point PO2 to point PO4 in accordance with the trajectory of the drag scroll operation”; i.e., the right side of the screen would encompass the second operation area).
Claims 6-7 are rejected under 35 U.S.C. 103 as being unpatentable over YUICHI (JP2010032546A) in view of Nakamura (US20110164062A1).
Regarding claim 6, YUICHI teaches the information processing device according to claim 1.
YUICHI does not explicitly teach wherein the operation area display control unit displays a third operation area on an opposite side of the starting point to the first operation area, and
when a fourth touch operation in the third operation area is detected by the detection unit, the object display control unit returns a display state of the object to a display state before the object was moved by the predetermined distance by sliding.
However, in the same field of endeavor, Nakamura teaches wherein the operation area display control unit displays a third operation area on an opposite side of the starting point to the first operation area (Nakamura: Para 97 “the current location button is a button for returning the screen to the initial navigation screen. The current location button is provided separately from the buttons which are displayed by software, such that there is an additional meaning to allow the user to feel comfortable that, if the button is only depressed, the screen returns to the initial navigation screen, that is, the screen in which the position of the host vehicle is displayed on the map”), and
when a fourth touch operation in the third operation area is detected by the detection unit, the object display control unit returns a display state of the object to a display state before the object was moved by the predetermined distance by sliding (Nakamura: Para 97 “the current location button is a button for returning the screen to the initial navigation screen. The current location button is provided separately from the buttons which are displayed by software, such that there is an additional meaning to allow the user to feel comfortable that, if the button is only depressed, the screen returns to the initial navigation screen, that is, the screen in which the position of the host vehicle is displayed on the map”).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the information processing device of YUICHI with the feature wherein the operation area display control unit displays a third operation area on an opposite side of the starting point to the first operation area and, when a fourth touch operation in the third operation area is detected by the detection unit, the object display control unit returns a display state of the object to a display state before the object was moved by the predetermined distance by sliding, as disclosed by Nakamura. One would be motivated to do so for the benefit of “improving operational performance of a button operation on the touch panel” (Nakamura: Para 22).
Regarding claim 7, the combination of YUICHI and Nakamura teaches the information processing device according to claim 1, and Nakamura further teaches the information processing device installed in a vehicle, wherein
a seat row including a first seat and a second seat aligned in a first direction is installed in the vehicle (Nakamura: Para 76 “The navigation device 1 is used in a state of being installed around the center of the dashboard where a passenger at a driver's seat or a front passenger's seat easily reaches, and includes a main unit 2 and a display unit 3”), and
the screen is positioned in front of the seat row and is formed to extend in the first direction (Nakamura: Para 76 “The navigation device 1 is used in a state of being installed around the center of the dashboard where a passenger at a driver's seat or a front passenger's seat easily reaches, and includes a main unit 2 and a display unit 3”). The examiner applies the same rationale for the combination of YUICHI and Nakamura as set forth in claim 6 above.
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
Takagi (JP2011141340A) discloses map display technology in an in-vehicle navigation system, and more particularly a map display device with improved map enlargement/reduction processing operation, and a navigation system using the map display device.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to WENYUAN YANG whose telephone number is (571)272-5455. The examiner can normally be reached Monday - Thursday 9:00AM-5:00PM EST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Hitesh Patel, can be reached at (571) 270-5442. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/W.Y./Examiner, Art Unit 3667
/Hitesh Patel/Supervisory Patent Examiner, Art Unit 3667
2/23/26