DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Arguments
Applicant's arguments with respect to the rejections of claims 1-6, 8-16, and 18-20 under 35 U.S.C. § 103 (Applicant’s Remarks, filed 09/23/2025, pp. 6-8) have been fully considered but they are not persuasive for at least the following reasons:
Concerning the rejections of independent Claims 1 and 11 under 35 U.S.C. § 103, Applicant argues Ragavan (as well as Hayashi and Nagamatsu, either alone or in any proper combination) fails to teach the amended limitation “identify[ing] the braille character based on the first input and the second input received by the touch panel, the second input (i) confirming that the dot configuring the braille character corresponds to the dotted protrusion when the second operation is received after the first operation, or (ii) indicating that the dot configuring the braille character corresponds to a flat surface when the second operation is received without the first operation” (Applicant’s Remarks, filed 09/23/2025, pp. 6-8). Examiner respectfully disagrees. Under the broadest reasonable interpretation of the claims, the second input must confirm at least one of the following: (i) that the dot corresponds to a dotted protrusion when the second operation is received after the first operation or (ii) that the dot corresponds to a flat surface when the second operation is received without the first operation. Therefore, Examiner asserts Ragavan teaches the second option of this limitation: “the second input… (ii) indicat[es] that the dot configuring the braille character corresponds to a flat surface when the second operation is received without the first operation.” Examiner directs Applicant’s attention to paragraph 0014 of Ragavan’s specification, in which Ragavan teaches that a second operation (a swipe in the left direction) given without the first operation (a swipe in the right direction) indicates the dot corresponds to a flat surface. See rejections of claims below.
Concerning the rejections of Claims 2-6, 8-10, 12-16, and 18-20 under 35 U.S.C. § 103, Applicant argues that Ragavan, Hayashi, and Nagamatsu, viewed alone or in any proper combination, fail to teach each of the elements of independent claims 1 and 11, upon which each of these claims depends and which each further limits (Applicant’s Remarks, filed 09/23/2025, p. 8). Examiner respectfully disagrees and reiterates that the specified limitation is taught by Ragavan, as explained above, and that each of the limitations of claims 1 and 11 is taught by Ragavan modified by Hayashi. See rejections of claims below.
The previous rejections of claims 7 and 17 under 35 U.S.C. § 103 have been withdrawn due to Applicant’s cancellation of claims 7 and 17 (Applicant’s Remarks, filed 09/23/2025, p. 6).
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 1-4, 6, 8-14, 16, and 18-20 are rejected under 35 U.S.C. 103 as being unpatentable over Ragavan in view of Hayashi.
Regarding Claim 1, Ragavan discloses a touch panel (figs. 2A-2B; abstract: “A Braille user interface for a Braille communication system on a touch screen”) configured to:
receive a first input of giving a first instruction that a dot configuring a braille character corresponds to a dotted protrusion through a first operation (fig. 2A; par. 0014: “the first gesture is a swipe [in] the right direction which creates a raised dot”), and
receive a second input of giving a second instruction different from the first instruction of the first input through a second operation different from the first operation (fig. 2B; par. 0014: “the second gesture is a swipe in the left direction which creates a non-raised dot”); and
a processor (par. 0042: “Such action is read into the touch screen-enabled electronic device 45 that operates the touch screen 50 and the touch screen-enabled electronic device 45 then interprets that motion as a raised dot 35 (as shown in FIG. 1). Of course that interpretation is software based using a non-volatile software program;” Examiner notes such a device necessarily and inherently must have a processor) configured to:
identify the braille character based on the first input and the second input received by the touch panel (par. 0013: “The first gesture and the second gesture are used to input a Braille character while the computing system converts that Braille character into the other written language”), the second input confirming that the dot configuring the braille character corresponds to the dotted protrusion when the second operation is received after the first operation, or indicating that the dot configuring the braille character corresponds to a flat surface when the second operation is received without the first operation (fig. 2B; par. 0014: “the second gesture is a swipe in the left direction which creates a non-raised dot;” Examiner further notes that the second operation of swiping left performed without the first operation of swiping right indicates the dot corresponds to a flat surface).
Ragavan does not explicitly disclose how the processor searches for a corresponding process function. However, Hayashi discloses a processor (par. 0001: “an image processing device (e.g., a copier, a facsimile, an MFP, etc.) that performs image output processing based on input data, and in particular to such an image processing device that is equipped with an operation panel through which a user inputs commands and information for the image output processing;” Examiner notes such a device necessarily and inherently must have a processor) configured to:
search for one or more process functions corresponding to the identified braille character (fig. 7; par. 0014: “When inputting braille data operates the key point for instructing "convex" input to be Braille in Braille keys [391 to 396], when the input operation is completed, pressing the execution key 39b Input is complete. The MFP machine generates Braille data according to the above-described user instruction, and manages setting input such as command input and operation conditions based on the Braille data;” par. 0020: “However, when a function other than the copy function, such as a FAX function, is to be selected, it is necessary to switch to the selected function by inputting from the Braille keyboard 39;” Examiner interprets this to be equivalent to the processor searching for the relevant process function. Because the user enters braille input and the requested function is then performed, the processor necessarily must be able to read and interpret the user’s braille input and match it to the correct process). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the process function searching of Hayashi with the input apparatus of Ragavan in order to enable people with vision impairments to more easily use electronic devices with touch screens to perform the specific functions they want (Hayashi, abstract; par. 0004).
Regarding Claim 2, Ragavan further discloses when a number of times the processor confirms whether or not the dot configuring the braille character corresponds to the dotted protrusion reaches a predetermined number of times (par. 0039: “Each dot array 30 is a six-bit character set arranged in two (2) columns and three (3) rows. Each of the six-bit positions in the dot array 30 is represented by either a raised dot 35 or by a non-raised dot 40;” Examiner notes the predetermined number of times is six because there are six dots for each character and that the processor necessarily and inherently must check each of these six dots in order to correctly translate the character), the processor identifies the braille character (par. 0013: “first gesture and the second gesture are used to input a Braille character while the computing system converts that Braille character into the other written language;” Examiner further notes the character is not translated/identified until the entire character—each of the six dots—has been entered).
Regarding Claim 3, Ragavan modified by Hayashi further discloses a display configured to display a process function result of the one or more process functions searched by the processor (Hayashi, fig. 3: braille display 38; par. 0013: “As shown in the figure, the Braille character module 38 has six Braille elements 38e as elements for forming Braille characters, and variable information can be displayed depending on the concave and convex states of these Braille elements 38e. In order to realize the variable information display function, each braille element 38e needs to change its display state depending on the type of braille to be displayed;” par. 0017: “Incidentally, when the user performs an incorrect input operation or becomes unsure of how to operate during operation in the "Braille display mode," it is necessary to guide the user to perform the correct operation. That is, when such a situation occurs, it is necessary to inform the user of the current situation and to be able to take measures such as re-entering the data or guiding the user to the next step. Of course, it is possible to use the Braille display 37 to inform the user of the situation and to instruct the user to re-enter information, etc.”). The combination of the input apparatus of Ragavan with the processor capable of process searching as taught by Hayashi described above for Claim 1 would have included this display.
Regarding Claim 4, Ragavan modified by Hayashi further discloses a speaker configured to output a voice guidance relating to a determination of the one or more process functions searched by the processor (Hayashi, fig. 6; par. 0017: “For example, if the user makes a mistake in inputting operations or becomes unsure of how to operate the device midway, the current situation will be notified to the user by voice from the speaker. The voice notification allows the user to know the situation, and once the user has confirmed it, voice guidance is provided to guide the user to the next step”). The combination of the input apparatus of Ragavan with the processor capable of process searching as taught by Hayashi described above for Claim 1 would have included the speaker.
Regarding Claim 6, Ragavan modified by Hayashi further discloses a scanner configured to perform a first process function of the one or more process functions, the first process function comprising a scanning function (Hayashi, fig. 1: scanner 4; par. 0011: “The scanner distribution application 29, in accordance with distribution instructions from the main controller (not shown), receives image data of the document input through the scanner unit 4 (processed in the same manner as when the copy function is used) from the image processing unit 24, converts this data into a data format that can be used by the distribution destination, compresses the data, etc., and sends it to the distribution destination via a communication interface”), and
a printer configured to perform a second process function of the one or more process functions, the second process function comprising a printing function (Hayashi, fig. 1: printer 19; par. 0011: “The printer application 25 generates image data to be used for the print output in accordance with instructions of a print output request received from a host device (not shown) via a network or the like, and causes the printer unit 19 to print out the data based on this image data via the image processing unit 24”). The combination of the input apparatus of Ragavan with the processor capable of process searching as taught by Hayashi described above for Claim 1 would have included the printing and scanning functions.
Regarding Claim 8, Ragavan further discloses the first operation is a tap operation, a double tap operation, a long tap operation, a swipe operation, a pinch out operation, or a pinch in operation (par. 0014: “the first gesture is a swipe [in] the right direction”), and the second operation is a tap operation, a double tap operation, a long tap operation, a swipe operation, a pinch out operation, or a pinch in operation (par. 0014: “the second gesture is a swipe in the left direction”).
Regarding Claim 9, Ragavan modified by Hayashi further discloses the braille character has a one-to-one relationship with a process name of one of the one or more process functions (Hayashi, fig. 7; par. 0020: each braille input has only one possible output function, e.g., entering the braille character correlated to “scan” causes the device to scan or the character correlated to “fax” causes the device to fax. The Examiner notes that to a person having ordinary skill in the art, the one-to-one relationship between the entered braille characters and the process names of the process functions would have been an obvious choice to enhance the ease of use and predictability of the device). The combination of the input apparatus of Ragavan with the processor capable of process searching as taught by Hayashi described above for Claim 1 would have included this one-to-one relationship.
Regarding Claim 10, Ragavan modified by Hayashi further discloses the braille character is correlated with a plurality of process names, each of the plurality of process names of one of the one or more process functions (Hayashi, par. 0020: each braille input is associated with a given process/function name, such as fax, copy, scan, or print. The Examiner notes that to a person having ordinary skill in the art, the correlation between entered braille characters and the process names of the process functions would have been an obvious choice to enhance the ease of use and predictability of the device). The combination of the input method of Ragavan with the processor capable of process searching as taught by Hayashi described above for Claim 1 would have included the described correlation.
Regarding Claim 11, Ragavan discloses receiving a first input of giving a first instruction that a dot configuring a braille character corresponds to a dotted protrusion through a first operation (fig. 2A; par. 0014: “the first gesture is a swipe [in] the right direction which creates a raised dot”);
receiving a second input of giving a second instruction different from the first instruction of the first input through a second operation different from the first operation (fig. 2B; par. 0014: “the second gesture is a swipe in the left direction which creates a non-raised dot”); and
identifying the braille character based on the first input and the second input (par. 0013: “The first gesture and the second gesture are used to input a Braille character while the computing system converts that Braille character into the other written language”),
the second input confirming that the dot configuring the braille character corresponds to the dotted protrusion when the second operation is received after the first operation, or indicating that the dot configuring the braille character corresponds to a flat surface when the second operation is received without the first operation (fig. 2B; par. 0014: “the second gesture is a swipe in the left direction which creates a non-raised dot;” Examiner further notes that the second operation performed without the first operation indicates the dot corresponds to a flat surface).
Ragavan does not explicitly disclose searching for a corresponding process function. However, Hayashi discloses searching for one or more process functions corresponding to the braille character identified (fig. 7; par. 0014: “When inputting braille data operates the key point for instructing "convex" input to be Braille in Braille keys [391 to 396], when the input operation is completed, pressing the execution key 39b Input is complete. The MFP machine generates Braille data according to the above-described user instruction, and manages setting input such as command input and operation conditions based on the Braille data;” par. 0020: “However, when a function other than the copy function, such as a FAX function, is to be selected, it is necessary to switch to the selected function by inputting from the Braille keyboard 39;” Examiner interprets this to be equivalent to the processor searching for the relevant process function. Because the user enters braille input and the requested function is then performed, the processor necessarily must be able to read and interpret the user’s braille input and match it to the correct process). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the process function searching of Hayashi with the input method of Ragavan in order to enable people with vision impairments to more easily use electronic devices with touch screens to perform the specific functions they want (Hayashi, abstract; par. 0004).
Regarding Claim 12, Ragavan further discloses identifying the braille character occurs when a number of times confirming whether or not the dot configuring the braille character corresponds to the dotted protrusion reaches a predetermined number of times (par. 0039: “Each dot array 30 is a six-bit character set arranged in two (2) columns and three (3) rows. Each of the six-bit positions in the dot array 30 is represented by either a raised dot 35 or by a non-raised dot 40;” Examiner notes the predetermined number of times is six because there are six dots for each character and that the processor necessarily and inherently must check each of these six dots in order to correctly translate the character; par. 0013: “first gesture and the second gesture are used to input a Braille character while the computing system converts that Braille character into the other written language;” Examiner further notes the character is not translated/identified until the entire character—each of the six dots—has been entered).
Regarding Claim 13, Ragavan modified by Hayashi further discloses displaying a process function result of the one or more process functions searched (Hayashi, fig. 3: braille display 38; par. 0013: “As shown in the figure, the Braille character module 38 has six Braille elements 38e as elements for forming Braille characters, and variable information can be displayed depending on the concave and convex states of these Braille elements 38e. In order to realize the variable information display function, each braille element 38e needs to change its display state depending on the type of braille to be displayed;” par. 0017: “Incidentally, when the user performs an incorrect input operation or becomes unsure of how to operate during operation in the "Braille display mode," it is necessary to guide the user to perform the correct operation. That is, when such a situation occurs, it is necessary to inform the user of the current situation and to be able to take measures such as re-entering the data or guiding the user to the next step. Of course, it is possible to use the Braille display 37 to inform the user of the situation and to instruct the user to re-enter information, etc.”). The combination of the input method of Ragavan with the processor capable of process searching as taught by Hayashi described above for Claim 11 would have included this display.
Regarding Claim 14, Ragavan modified by Hayashi further discloses outputting a voice guidance relating to a determination of the one or more process functions searched (Hayashi, fig. 6; par. 0017: “For example, if the user makes a mistake in inputting operations or becomes unsure of how to operate the device midway, the current situation will be notified to the user by voice from the speaker. The voice notification allows the user to know the situation, and once the user has confirmed it, voice guidance is provided to guide the user to the next step”). The combination of the input method of Ragavan with the processor capable of process searching as taught by Hayashi described above for Claim 11 would have included the speaker.
Regarding Claim 16, Ragavan modified by Hayashi further discloses a first process function of the one or more process functions comprises a scanning function (Hayashi, fig. 1: scanner 4; par. 0011: “The scanner distribution application 29, in accordance with distribution instructions from the main controller (not shown), receives image data of the document input through the scanner unit 4 (processed in the same manner as when the copy function is used) from the image processing unit 24, converts this data into a data format that can be used by the distribution destination, compresses the data, etc., and sends it to the distribution destination via a communication interface”), and
a second process function of the one or more process functions comprises a printing function (Hayashi, fig. 1: printer 19; par. 0011: “The printer application 25 generates image data to be used for the print output in accordance with instructions of a print output request received from a host device (not shown) via a network or the like, and causes the printer unit 19 to print out the data based on this image data via the image processing unit 24”). The combination of the input method of Ragavan with the processor capable of process searching as taught by Hayashi described above for Claim 11 would have included the printing and scanning functions.
Regarding Claim 18, Ragavan further discloses the first operation is a tap operation, a double tap operation, a long tap operation, a swipe operation, a pinch out operation, or a pinch in operation (par. 0014: “the first gesture is a swipe [in] the right direction”), and the second operation is a tap operation, a double tap operation, a long tap operation, a swipe operation, a pinch out operation, or a pinch in operation (par. 0014: “the second gesture is a swipe in the left direction”).
Regarding Claim 19, Ragavan modified by Hayashi further discloses the braille character has a one-to-one relationship with a process name of one of the one or more process functions (Hayashi, fig. 7; par. 0020: each braille input has only one possible output function, e.g., entering the braille character correlated to “scan” causes the device to scan or the character correlated to “fax” causes the device to fax. The Examiner notes that to a person having ordinary skill in the art, the one-to-one relationship between the entered braille characters and the process names of the process functions would have been an obvious choice to enhance the ease of use and predictability of the device). The combination of the input method of Ragavan with the processor capable of process searching as taught by Hayashi described above for Claim 11 would have included this one-to-one relationship.
Regarding Claim 20, Ragavan modified by Hayashi further discloses the braille character is correlated with a plurality of process names, each of the plurality of process names of one of the one or more process functions (Hayashi, par. 0020: each braille input is associated with a given process/function name, such as fax, copy, scan, or print. The Examiner notes that to a person having ordinary skill in the art, the correlation between entered braille characters and the process names of the process functions would have been an obvious choice to enhance the ease of use and predictability of the device). The combination of the input method of Ragavan with the processor capable of process searching as taught by Hayashi described above for Claim 11 would have included the described correlation.
Claims 5 and 15 are rejected under 35 U.S.C. 103 as being unpatentable over Ragavan in view of Hayashi as applied to claims 1 and 11, respectively, above, and further in view of Nagamatsu.
Regarding Claims 5 and 15, modified Ragavan does not explicitly disclose that the braille characters can represent characters of any of several different languages or types of text. However, Nagamatsu discloses the braille character is a hiragana character, an alphabet, a numerical character, or a character of a language (par. 0044: “data representing general characters such as hiragana, katakana, numerals, alphabets, [or] English words”). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the available languages taught by Nagamatsu with the input apparatus and method of modified Ragavan in order to maximize the number of supported languages and therefore the number of people who are able to use the device (Nagamatsu, par. 0044).
Conclusion
THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to JULIE DOSHER whose telephone number is (571) 272-4842. The examiner can normally be reached Monday - Friday, 10 a.m. - 6 p.m. ET.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Dmitry Suhol, can be reached at (571) 272-4430. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/J.G.D./Examiner, Art Unit 3715
/DMITRY SUHOL/Supervisory Patent Examiner, Art Unit 3715