DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Priority
Receipt is acknowledged of certified copies of papers required by 37 CFR 1.55.
Claim Objections
Claims 1, 5, 6, and 13 are objected to because of the following informalities:
In Claim 1, line 4, there should be a semicolon “;” inserted after “whiteboard.”
In Claim 1, lines 8-9, there should be a semicolon “;” inserted after “video.”
In Claim 1, line 15, the phrase “information relevant the…” should instead read “information relevant to the…”
In Claim 5, line 4, the phrase “the head gestures aand hand gestures” should instead read “the head gestures or hand gestures,” to maintain consistency.
In Claim 6, lines 1-2, the phrase “a power supply assembly comprising of solar panels coupled to the roof” should instead read “a power supply assembly comprising solar panels coupled to a roof.”
In Claim 13, lines 1-2, the phrase “The system of claim 1, wherein further comprising…” should instead read “The system of claim 1, further comprising...”
Appropriate correction is required.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 1-13 are rejected under 35 U.S.C. 103 as being unpatentable over a 2020 UNIDO article “How solar-powered tablets are educating rural communities about COVID-19 in Mozambique” (hereinafter “Kisakye”), and further in view of US 8,543,256 (hereinafter “Karafiath”), US 2020/0388186 (hereinafter “Weldemariam”), and US 2019/0228215 (hereinafter “Najafirad”).
Regarding Claim 1, Kisakye discloses an amphibious vehicle (p. 6: image of amphibious vehicle in a body of water);
at least one touch screen display (p. 5: “100 inch touch screen… 4 touchscreen computers;” p. 2: “The interactive touch screen”);
a digital whiteboard (p. 2: “The interactive touch screen… can also be used as a digital white board”);
at least one camera associated with the touch screen displays (p. 2: “The interactive touch screen can be linked with cameras”); and
a processor configured to communicate with the touch screen displays (p. 2: “The interactive touch screen… allow inclusive and group video conferencing;” p. 5: “touch screen computers;” Examiner notes this processor is inherent in such a system and must necessarily communicate with these interactive touch screen displays in order for them to function as disclosed); and
display the information using the touch screen display (p. 2: “The Community Tablet delivers customized animations that are relevant and familiar to local communities;” Examiner further clarifies that Kisakye discloses displaying information which is relevant to the geographical location but does not explain how this information is determined. This will be addressed immediately below).
Kisakye discloses displaying information relevant to the geographical location but does not explicitly disclose how the location is determined. However, Karafiath discloses a location sensor on the amphibious vehicle configured to determine a geographical location of the amphibious vehicle (col. 6, lines 22-32: amphibious vehicle is equipped with a GPS to detect location) and configured to communicate the geographical location of the amphibious vehicle to the processor (Examiner notes the GPS must necessarily communicate with the processor for the information to be used). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the GPS of Karafiath with the system of Kisakye in order to effectively determine the location of the amphibious vehicle, which can be used in several ways, potentially including determining whether the vehicle is in water or on land (Karafiath, fig. 2; col. 8, lines 35-50) or remotely controlling the vehicle (Karafiath, col. 8, lines 8-23).
As stated above, Kisakye discloses displaying information relevant to the geographical location but does not disclose how the information is determined. However, Weldemariam discloses wherein the processor is configured to: communicate with a database to determine information relevant to the geographical location of the amphibious vehicle (par. 0052: “the profile of the geospatial context of all locations with respect to each culturally sensitive feature is stored in a geolocation database wherein each cultural feature and associated context is geotagged;” par. 0056: “A contextual educational module may be triggered to provide personalized and interactive educational content to clarify culturally sensitive concepts and terms; the content can be optionally fetched from various information sources, such as databases;” Examiner notes the processor necessarily communicates with such a database in order to successfully fetch information from it). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the system of Kisakye with the database communication of Weldemariam in order to educate and communicate in a culturally and regionally proper/respectful manner (Weldemariam, abstract, pars. 0002-0003).
Kisakye also implies assessing emotional responses of users to detect efficacy (p. 2: “while also collecting vital data to demonstrate how well a message has been understood”) but does not explicitly explain how this process is done. However, Najafirad discloses the at least one camera (par. 0007: “different facial images captured from a camera(s)”) is configured to:
assess emotional responses of individual users to a user-defined event in a presentation or video (par. 0028: “the system does the following—extracts multiple faces from a streaming video, finds specific emotions in each face (e.g. happiness, fear, anger, etc.), and also determines the degree of arousal and valence associated with each emotion. Thus, the affective computing system provides a comprehensive model for extracting a person's emotions, emotional behavior over time, and the degree of arousal and valence associated”), and in response to the emotional responses of the individual users, obtaining a detection result of efficacy of the presentation or video (par. 0028: “a temporal neural network system capable of estimating the excitement and attentiveness of multiple users from streaming videos;” par. 0003: “method of extracting behavioral patterns, attention span, excitement, and engagement could help interpret whether a group of people paid attention to the events and gained any insights from them. This system can be used to assess qualitative and quantitative measures in sharing and receiving information in a group. Temporal information on the attentiveness on a per-user basis as well as of the group can be used to infer points of interest thereby help design novel interactive teaching methods, personalized exercise recommendations to students, better resource sharing, efficiency in meetings, etc.”). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to use the assessment of emotional responses as disclosed by Najafirad with the system of Kisakye in order to effectively determine whether users are paying attention and enjoying the provided content (Najafirad, abstract; par. 0028), which can then be used to better design future content (Najafirad, pars. 0002-0004). 
Additionally, Kisakye already discloses using emotional response data to determine the efficacy of the provided message (Kisakye, p. 2) and merely does not explain how that relied upon emotional response data is found.
Regarding Claim 2, Kisakye further discloses the touch screen display comprises a fully interactive touch-sensitive screen, wherein the touch screen display is configured to detect user input and communicate the user input to the processor (p. 5: “100 inch touch screen… 4 touchscreen computers;” p. 2: “The interactive touch screen… allow inclusive and group video conferencing;” Examiner further notes the touch screen being “interactive” or being referred to as “[a] touch screen computer” inherently means the display must necessarily be able to detect this input and communicate it to a processor; otherwise, nothing would happen upon user input).
Regarding Claim 3, Kisakye modified by Najafirad further discloses a first selection by a user of at least one individual of interest in the audience (Najafirad, figs. 2A-2B: user may select anywhere between one individual and all individuals in the audience) and a second selection of at least one metric of emotional expression of interest (Najafirad, fig. 4: one or multiple parts of individual’s expression can be analyzed to determine individual’s emotional state out of several emotions; par. 0029: “Facial Action Coding System (FACS) uses physical, visual changes in the face called action units (AUs) to encode facial expressions. FACS encoding can combine basic facial actions to represent complex human facial expressions. Each facial expression can have one or many AUs associated with it. Unique facial AUs are a result of one or more facial muscle movements. Thus, FACS in a high level is encoding subtle facial muscle movements into discrete action units. For example, AUs 1, 4, and 15 together correlate to a ‘sad’ emotion. In other words, the emotion ‘sad’ is encoded using FACS by combining AUs 1-Inner Brow Raiser, 4-Brow Lowered, 15-Lip Corner Depressor”). The combination of the system of Kisakye with the assessment of emotional responses of Najafirad described above for Claim 1 would have included this selection of an individual and metric.
Regarding Claim 4, Kisakye modified by Najafirad further discloses identified frames to extract the emotional expression data (Najafirad, figs. 2B, 3-4: identified frames used to extract emotional expression data; par. 0029: “Facial Action Coding System (FACS) uses physical, visual changes in the face called action units (AUs) to encode facial expressions. FACS encoding can combine basic facial actions to represent complex human facial expressions”) comprising: extracting from the database, emotional expression data associated with at least one selected emotional metric for at least one selected user and/or individual of interest (Najafirad, par. 0008: “facial expression image database… the neural network is provided with actual data and ground truths to learn from;” par. 0033: “The stored data consists of a timestamp of the capture, number of faces extracted, and attributes extracted for each face. This database provides a treasure trove of information and can be used for various data analytic tasks, such as finding the number of users attended, general emotional orientation of the group of users, etc.”). The combination of the system of Kisakye with the assessment of emotional responses of Najafirad described above for Claim 1 would have included these extraction steps.
Regarding Claim 5, Kisakye modified by Najafirad further discloses a digital whiteboard portion of the mobile classroom display system equipped to display location specific information (Kisakye, p. 2: “The interactive touch screen… can also be used as a digital white board… delivers customized animations that are relevant and familiar to local communities”) supported by a camera to capture long distance mass gesture recognition through head gestures or hand gestures (Najafirad, figs. 2A-2B, 7: camera is used to capture head gestures of audience), wherein the head gestures and hand gestures comprise: captured gestures and calculated responses to rate the overall positive or negative understanding and/or reaction of the audience (par. 0028: “the system does the following—extracts multiple faces from a streaming video, finds specific emotions in each face (e.g. happiness, fear, anger, etc.), and also determines the degree of arousal and valence associated with each emotion. Thus, the affective computing system provides a comprehensive model for extracting a person's emotions, emotional behavior over time, and the degree of arousal and valence associated;” par. 0008: “accurately generate emotion probabilities for the face”). The combination of the system of Kisakye with the assessment of emotional responses of Najafirad described above for Claim 1 would have included the described gesture recognition.
Regarding Claim 6, Kisakye further discloses a power supply assembly comprising solar panels coupled to the roof of the amphibious vehicle (p. 5: image depicts solar energy source placed on roof of amphibious vehicle; p. 2: “The Community Tablet is… powered by solar panels”).
Regarding Claim 7, Kisakye further discloses a power supply comprising a battery electrically connected to a solar panel for supplying power to the processor, display and/or location sensor when the solar power is not supplying power (p. 3: “with the solar kits, consisting of a five 250w solar panels, one 3kW hybrid inverter, four 200AH batteries and accessories including circuit breakers, plugs, and terminals, the Community Tablets are being converted to solar power”).
Regarding Claim 8, Kisakye further discloses a trailer frame including a wheel assembly (p. 5: image at lower right depicts a trailer frame with wheel assembly), wherein the trailer frame is removably coupled to a motor vehicle, non-motor vehicle, or boat for transport (p. 2: “The Community Tablet is a container… transported by trailer, which can be attached to anything – from a motor vehicle to a donkey”).
Regarding Claim 9, Kisakye modified by Karafiath further discloses a small self-propelled motor that enables short distance travel from land to water (Karafiath, Fig. 2: the amphibious vehicle transitions from land to water; col. 7, line 58 - col. 8, line 7: “design can be optimized for providing for propulsion and maneuverability. For instance, two marine propellers 30 and two marine rudders 31 can be provided for wheels-up operation, and a marine jet (e.g., water-jet) propulsion unit 40 can be provided for wheels-down operation;” col. 7, lines 31-44: “an amphibious wheeled vehicle that can function as a marine vessel when in the water, and as a wheeled motor vehicle when on land”). The combination of the system of Kisakye with the amphibious vehicle comprising a GPS of Karafiath described above for Claim 1 would have included this motor.
Regarding Claim 10, Kisakye further discloses a network communication module configured to communicate wirelessly using one or more of: a satellite communication channel; and a packet-switched cellular telecommunications network (p. 2: “For connectivity, the tablet is linked to the internet network via the Global System for Mobile Communications (GSM) or by satellite”).
Regarding Claim 11, Kisakye modified by Karafiath discloses the processor is configured to detect contact points with water (Karafiath, col. 8, lines 35-50: “If the inventive vehicle is initially right-side up (wheels down) on upon situation in water, the inventive vehicle is caused to turn upside down (wheels up)… an onboard computer 76, situated in inventive vehicle 20, can afford autonomous control of inventive vehicle 20;” Examiner notes the ability of the vehicle to automatically convert to its wheels-up mode indicates there is data which indicates whether the system is travelling in water; Examiner additionally notes that because the location data is found via GPS, that data can inherently be used to also determine whether the current location or nearby locations are within bodies of water). The combination of the system of Kisakye with the GPS of Karafiath described above for Claim 1 would have included this detection ability.
Regarding Claim 12, Kisakye modified by Karafiath discloses predetermined data that includes at least one of data indicating when the mobile classroom system is immersed or travelling in water (Karafiath, col. 8, lines 35-50: “If the inventive vehicle is initially right-side up (wheels down) on upon situation in water, the inventive vehicle is caused to turn upside down (wheels up)… an onboard computer 76, situated in inventive vehicle 20, can afford autonomous control of inventive vehicle 20;” Examiner notes the ability of the vehicle to automatically convert to its wheels-up mode indicates there is data which indicates whether the system is travelling in water; Examiner additionally notes that because the location data is found via GPS, that data can inherently be used to also determine whether the current location is within a body of water). The combination of the system of Kisakye with the GPS of Karafiath described above for Claim 1 would have included this data.
Regarding Claim 13, Kisakye modified by Weldemariam further discloses a database that is queried by the processor in order to determine information that is relevant to a particular location (par. 0052: “the profile of the geospatial context of all locations with respect to each culturally sensitive feature is stored in a geolocation database wherein each cultural feature and associated context is geotagged;” par. 0056: “A contextual educational module may be triggered to provide personalized and interactive educational content to clarify culturally sensitive concepts and terms; the content can be optionally fetched from various information sources, such as databases;” Examiner notes the processor necessarily queries the database in order to retrieve said relevant information). The combination of the system of Kisakye with the database of Weldemariam described above for Claim 1 would have included this querying.
Claim 14 is rejected under 35 U.S.C. 103 as being unpatentable over Kisakye in view of Karafiath, Weldemariam, and Najafirad as applied to claim 1 above, and further in view of a 2019 BBC article “The tablet computer pulled by donkey” (hereinafter “Wakefield”) and US 2010/0092015 (hereinafter “McPherson”).
Regarding Claim 14, modified Kisakye does not explicitly disclose a loudspeaker. However, Wakefield discloses a loudspeaker (p. 8: loudspeaker to left of trailer), wherein the processor is configured to communicate with the waterproof loudspeaker (Examiner notes the processor must necessarily communicate with the loudspeaker to work as intended, i.e. play sound that corresponds to the presentation, which is also in communication with the processor; Examiner further notes the underlined portion will be addressed below), and wherein the waterproof loudspeaker system may be self-stored when the amphibious vehicle is in transit (p. 8: loudspeaker is of a small size, and is capable of fitting inside the amphibious trailer while in transit for storage; Examiner reiterates the underlined portion will be addressed below). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the loudspeaker of Wakefield with the system of Kisakye in order to provide audio along with the presented video (Wakefield, p. 8).
Kisakye modified by Wakefield does not explicitly disclose the loudspeaker is waterproof. However, McPherson discloses a waterproof loudspeaker (abstract: “The speaker further includes a waterproof loudspeaker”). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to use a waterproof speaker such as the one disclosed by McPherson as the loudspeaker of Kisakye modified by Wakefield in order to allow the speaker to be used in various weather conditions (McPherson, par. 0018).
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure:
A 2019 Guardian article “The man with a tablet for making aid to African countries better” (Austin) teaches an amphibious mobile classroom which customizes lessons based on geographical data.
US 2007/0093148 (Gibbs) teaches an amphibious vehicle which has sensors that determine if the vehicle is in a body of water and where that water is making contact with the vessel.
US 2021/0185276 (Peters) teaches a method and device which detects various facial features, compares those features to ones in a database to determine an emotion, and uses the emotional response data to gauge effectiveness of a lesson.
US 2020/0342979 (Sadowsky) teaches a method and system which captures facial data and calculates a cognitive state metric based on the data.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to JULIE DOSHER whose telephone number is (571) 272-4842. The examiner can normally be reached Monday - Friday, 10 a.m. - 6 p.m. ET.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Dmitry Suhol, can be reached at (571) 272-4430. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/J.G.D./Examiner, Art Unit 3715
/DMITRY SUHOL/Supervisory Patent Examiner, Art Unit 3715