Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
DETAILED ACTION
Response to Arguments
Regarding objections.
Applicant argues:
Objections to the Specification: In response to the Examiner's objection to the specification, the new title is presented as set forth above. Accordingly, withdrawal of the objection to the specification is respectfully requested.
Examiner replies that:
Withdrawn.
Applicant argues:
Objections to the Claims: Claims 2 and 3 are objected to because "the claims recite 'virtual reality', however while this is a 3D view and technically virtual reality could be implemented by the system, the system does not appear to intend to use virtual reality." (Office Action at page 2.) Only to expedite prosecution, Applicant has amended claim 2 to recite "the shared image is an avatar transmitted as a three-dimensional image from the mobile terminal," and has amended claim 3 to recite "the shared image is an avatar transmitted as a three-dimensional moving image from the mobile terminal." The amendments to the claims alleviate the above-noted objections. Accordingly, Applicant respectfully requests withdrawal of the claim objections.
Examiner replies that:
Withdrawn.
Regarding 35 USC § 102/103.
Applicant argues:
Nakashima does not teach or suggest a conditional determination of whether the data amount of the shared image exceeds excess image processing capacity, followed by a display mode change when this condition is satisfied.
Nakashima discloses general processing load management techniques, such as reducing CPU clock frequency in response to thermal conditions (see paragraphs [0085]-[0087]), and switching from 3D to 2D map display to reduce load (see paragraph [0157]).
However, Nakashima lacks any teaching or suggestion of quantifying excess image processing capacity or calculating the size (data amount) of a shared image, let alone comparing the two to control a display mode. Nakashima controls the display based on hardware conditions, such as when the CPU becomes too hot. The system reacts by lowering the clock speed or switching to a lighter display mode. However, it does not compare the actual size of the image data with the available processing capacity.
In contrast, the claimed vehicle has a control mechanism where the display controller makes a decision based on a comparison between the data amount of the incoming shared image and the available image processing capacity. This control structure ensures that image display continues without interruption, even under load, providing a concrete technical effect and addressing a practical challenge not contemplated in Nakashima (see paragraph [0018] of the published application).
Further, there are additional differences between the claimed vehicle and Nakashima.
Claim 1 requires that the shared image displayed by the vehicle is received from a mobile terminal that is wirelessly connected to the vehicle. This means that the image content, such as an avatar or map, is generated externally, for example on a user's smartphone, and then sent to the vehicle via wireless communication. Nakashima does not describe such an arrangement. While Nakashima discusses onboard CPUs and their functions like drive assist and multimedia processing (see paragraph [0037]), it does not mention receiving image content from an external mobile terminal. Although the term "mobile body" appears in Nakashima, it refers to the vehicle itself, not to a separate device like a mobile terminal (see paragraph [0037]). There is also no teaching of wireless coupling between the vehicle and a mobile terminal for transmitting shared image data.
Claim 1 also recites that the display controller outputs processed display data so that the shared image continues to be displayed without interruption. This reflects a specific design goal: to maintain continuous display even when the processing load is high, by adjusting the quality or format of the image as needed (see paragraph [0018] of the published application). Nakashima does not teach or suggest this kind of functionality. Although Nakashima describes switching from a 3D map to a 2D map when the CPU clock speed is reduced (see paragraph [0157]), this is done to reduce load and is not tied to keeping the display continuous. Nakashima does not describe generating or outputting adjusted display data for the purpose of avoiding display interruption, nor does it suggest that this is an intended goal.
Examiner replies that:
Applicant has amended the claims to change their scope since the previous action. The amendments necessitate new grounds of rejection; the amended claims are rejected in detail under the § 102/103 headings below.
Claim Interpretation
The following is a quotation of 35 U.S.C. 112(f):
(f) Element in Claim for a Combination. – An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.
The following is a quotation of pre-AIA 35 U.S.C. 112, sixth paragraph:
An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.
The claims in this application are given their broadest reasonable interpretation using the plain meaning of the claim language in light of the specification as it would be understood by one of ordinary skill in the art. The broadest reasonable interpretation of a claim element (also commonly referred to as a claim limitation) is limited by the description in the specification when 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is invoked.
As explained in MPEP § 2181, subsection I, claim limitations that meet the following three-prong test will be interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph:
(A) the claim limitation uses the term “means” or “step” or a term used as a substitute for “means” that is a generic placeholder (also called a nonce term or a non-structural term having no specific structural meaning) for performing the claimed function;
(B) the term “means” or “step” or the generic placeholder is modified by functional language, typically, but not always linked by the transition word “for” (e.g., “means for”) or another linking word or phrase, such as “configured to” or “so that”; and
(C) the term “means” or “step” or the generic placeholder is not modified by sufficient structure, material, or acts for performing the claimed function.
Use of the word “means” (or “step”) in a claim with functional language creates a rebuttable presumption that the claim limitation is to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites sufficient structure, material, or acts to entirely perform the recited function.
Absence of the word “means” (or “step”) in a claim creates a rebuttable presumption that the claim limitation is not to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is not interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites function without reciting sufficient structure, material or acts to entirely perform the recited function.
Claim limitations in this application that use the word “means” (or “step”) are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action. Conversely, claim limitations in this application that do not use the word “means” (or “step”) are not being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action.
This application includes one or more claim limitations that do not use the word “means,” but are nonetheless being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, because the claim limitation(s) uses a generic placeholder that is coupled with functional language without reciting sufficient structure to perform the recited function and the generic placeholder is not preceded by a structural modifier. Such claim limitation(s) is/are:
Display unit in claims 1, 8 (and dependent claims).
Monitoring unit in claims 1, 9 (and dependent claims).
Display controller in claims 1, 9 (and dependent claims).
Main controller in claim 9 (and dependent claims).
Because this/these claim limitation(s) is/are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, it/they is/are being interpreted to cover the corresponding structure described in the specification as performing the claimed function, and equivalents thereof.
If applicant does not intend to have this/these limitation(s) interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, applicant may: (1) amend the claim limitation(s) to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph (e.g., by reciting sufficient structure to perform the claimed function); or (2) present a sufficient showing that the claim limitation(s) recite(s) sufficient structure to perform the claimed function so as to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph.
Claim Rejections - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claim 9 is rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention. Claim 9 recites "the second display mode requiring less processing load than the first display mode," where the processing load is how much processing an image requires. As written, the language is not merely broad but improper, because "requiring less processing load" does not have a clear and valid meaning. The second display mode may require less processing (not less processing load) than the first display mode. Alternatively, the second display mode may require a smaller processing load. Alternatively, the second display mode may require that fewer processing loads be processed.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claim(s) 1-2, 4, 8 is/are rejected under 35 U.S.C. 103 as being unpatentable over Nakashima U.S. Patent/PG Publication 20080091974 in view of Brock U.S. Patent/PG Publication 6473085 and Zhu U.S. Patent/PG Publication 20130147842.
Regarding claim 1:
A vehicle comprising: (Nakashima [0029] Referring first to FIG. 1, a car navigation device 1 is mounted next to a steering wheel 2 in a vehicle compartment. The car navigation device 1 has a microcomputer 3, a display 40, a speaker 42 and switches 44. The display 40 includes a monitor section 40A, an audio section 40B and a navigation section 40C.)
a display unit configured to display data including a shared image received from a mobile terminal, the mobile terminal being communicatively coupled to the vehicle (Nakashima [0037] The multi-core CPU 10 is a CPU for a mobile body (vehicle) constructed by a plurality of CPU cores 50, 52, 54, 56, which are assigned to execute different processing. For instance, the CPU cores 50, 52, 54, 56 execute drive assist processing, route guide processing, multi-media control processing, and the like, respectively.)(Nakashima Fig. 2) since there is a mobile body (mobile terminal) connected (shared and coupled by communication) to a car display.
a monitoring unit configured to monitor an image processing load state of the vehicle (Nakashima [0085] At the time of developing the application, for example, processing procedure is provided to reduce the processing load to execute the processing even when the CPU operation clock frequency has decreased in addition to when the first to fourth CPU cores 50 to 56 are executed with full power, and the processing load is switched over to the processing of a light load when the previous CPU operation clock frequency has decreased so will not to impair the holding of the system.)(Nakashima [0087] According to the above car navigation device 1, when it is detected by the first to fourth temperature sensors 80, 82, 84, 86 that the temperature of any one of the first to fourth CPU cores 50 to 56 has reached a predetermined temperature, the CPU operation clock frequencies of the second to fourth CPU cores 52, 54, 56 are lowered except that of the first CPU core 50 to which the processing of the highest priority is assigned among a plurality of processing assigned to the first to fourth CPU cores 50, 52, 54 and 56.).
and a display controller configured to determine (Nakashima [0085] At the time of developing the application, for example, processing procedure is provided to reduce the processing load to execute the processing even when the CPU operation clock frequency has decreased in addition to when the first to fourth CPU cores 50 to 56 are executed with full power, and the processing load is switched over to the processing of a light load when the previous CPU operation clock frequency has decreased so will not to impair the holding of the system.)(Nakashima [0087] According to the above car navigation device 1, when it is detected by the first to fourth temperature sensors 80, 82, 84, 86 that the temperature of any one of the first to fourth CPU cores 50 to 56 has reached a predetermined temperature, the CPU operation clock frequencies of the second to fourth CPU cores 52, 54, 56 are lowered except that of the first CPU core 50 to which the processing of the highest priority is assigned among a plurality of processing assigned to the first to fourth CPU cores 50, 52, 54 and 56.).
control a display mode of the shared image when the data amount of the shared image exceeds the amount of excess image processing capacity, and, in accordance with the processing load state and an amount of data regarding the shared image, output processed display data to the display unit such that the shared image continues to be displayed without interruption (Nakashima [0157] In the above first and second embodiments, the 3D map display that exerts a high processing load on the navigation software may be limited while the clock frequency is being lowered and, instead, the two-dimensional map may be mainly displayed.).
Nakashima does not teach comparing image processing capacity with the amount of data, although Nakashima does determine whether image processing capacity is being exceeded. In a related field of endeavor, Brock teaches:
and a display controller configured to determine an amount of excess image processing capacity based on the image processing load state; compare the amount of excess image processing capacity with a data amount of the shared image (Brock C5 L50-65 As illustrated in FIG. 2, interactive graphics application 200 further includes a render mode select facility 208 for directing the dynamic selection of an appropriate rendering mode. The resource fractions for each of the rendering modes are accessible by render mode select facility 208 which may reside within interactive graphics package 206. The system load sensor 215 relates system load information to render mode select facility 208. In turn, the render mode select facility 208 compares the resource fraction of each rendering mode with the currently available processing capacity as determined by the processing load. A rendering mode may then be selected and activated by render mode select facility 208 such that the amount of computation required to render using the selected rendering mode is compatible with the determined processor load. Until changed by the render mode select facility 208, render engine 213 uses the selected render mode for subsequent image rendering.)(Brock C6 L50-65 FIG. 4 is a high-level logic diagram depicting a dynamic rendering mode selection sequence 400 in accordance with the teachings of the present invention. Rendering mode selection sequence 400 commences as shown at step 402 and proceeds to step 404, which illustrates the system processing load being sampled. This may be done through a system load sensor facility 205 incorporated in the data processing system. Once the processing load has been determined, and as shown at step 410, the processing capacity available for rendering images is determined. 
In one embodiment of the present invention, this available capacity may be computed as the difference between the maximum possible processing capacity (i.e., 1.0) and the current processing load as a fraction of the maximum.)
Therefore, it would have been obvious before the effective filing date of the claimed invention to compare image processing capacity with the amount of data as taught by Brock. The motivation for doing so would have been to maximize the visual quality and reduce sluggish responses (Brock C1 L40-55). Further, the rationale for doing so would have been that it is a simple substitution of one known element for another to obtain predictable results where Nakashima reduces quality if the CPU temperatures are getting too hot, and Brock is reducing quality if the processor is going to be beyond capacity, where the end result in both is adjusting the processing load to prevent errors. Therefore it would have been obvious to combine Brock with Nakashima to obtain the invention.
Nakashima in view of Brock does not teach wireless communication. In a related field of endeavor, Zhu teaches:
a display unit configured to display data including a shared image received from a mobile terminal, the mobile terminal being communicatively coupled to the vehicle via wireless communication (Zhu [0026] Both the server 114 and the display devices 116-122 are computer devices that may include one or more processors such as a central processing unit (CPU) 130, one or more high speed memories 131, one or more low speed memories 132, 132a, 132b, 132c, one or more user interface devices 134 (keyboard, touch screen, etc.), a network interface 136 and one or more peripheral interfaces, along with other well known components. As is known to one skilled in the art, other types of computers may be used as display devices that have different architectures than those depicted in FIG. 1. The display devices 116-122 represent any suitable computer device, such as a desktop device, a handheld and/or mobile device, such as a mobile phone, personal data assistant, laptop computer, tablet personal computer, car navigation system, or hand-held GPS unit. More broadly, the display devices 116-122 represent any personal computing device, server, or network of such devices, or any other processing device having a user interface and CPU and capable of displaying a visual rendering of map data. Furthermore, while in some examples, the network 125 is described as a wireless network, the network 125 may be any wired or wireless network, where the display devices 116-122 are communicatively coupled to the network 125.)
Therefore, it would have been obvious before the effective filing date of the claimed invention to use a wireless terminal as taught by Zhu. The rationale for doing so would have been that it is a simple substitution of one known element for another to obtain predictable results where Nakashima has a car navigation where the mobile terminal sends information to the display, and Zhu has a car display that can receive information over a network and includes mobile devices, where the end result is data being received from an electronic device and displayed by the vehicle. Therefore it would have been obvious to combine Zhu with Nakashima in view of Brock to obtain the invention.
Regarding claim 2:
The vehicle according to claim 1 has all of its limitations taught by Nakashima in view of Brock and Zhu. Nakashima further teaches wherein the shared image is an avatar transmitted as a three-dimensional image from the mobile terminal, and the display controller is configured to, when the processing load state is a high load state, allow the display unit to display the avatar, while changing the three-dimensional image of the avatar to a two-dimensional image (Nakashima [0157] In the above first and second embodiments, the 3D map display that exerts a high processing load on the navigation software may be limited while the clock frequency is being lowered and, instead, the two-dimensional map may be mainly displayed.). See Fig. 1, element 40C; the map contains a car avatar.
Regarding claim 4:
The vehicle according to claim 1 has all of its limitations taught by Nakashima in view of Brock and Zhu. Nakashima further teaches wherein the shared image is a map image, the map image is a three-dimensional image, and the display controller is configured to, when the processing load state is a high load state, allow the display unit to display the map image, while changing the three-dimensional image of the map image to a two-dimensional image (Nakashima [0157] In the above first and second embodiments, the 3D map display that exerts a high processing load on the navigation software may be limited while the clock frequency is being lowered and, instead, the two-dimensional map may be mainly displayed.).
Regarding claim 8:
The claim is a parallel version of claim 1. As such it is rejected under the same teachings.
Claim(s) 3 is/are rejected under 35 U.S.C. 103 as being unpatentable over Nakashima U.S. Patent/PG Publication 20080091974 in view of Brock U.S. Patent/PG Publication 6473085 and Zhu U.S. Patent/PG Publication 20130147842, and further in view of Kamiya U.S. Patent/PG Publication 20240083252.
Regarding claim 3:
The vehicle according to claim 1 has all of its limitations taught by Nakashima in view of Brock and Zhu. Nakashima further teaches wherein the shared image is an avatar transmitted as a three-dimensional image from the mobile terminal, and the display controller is configured to, when the processing load state is a high load state, allow the display unit to display the avatar, (Nakashima [0157] In the above first and second embodiments, the 3D map display that exerts a high processing load on the navigation software may be limited while the clock frequency is being lowered and, instead, the two-dimensional map may be mainly displayed.). See Fig. 1, element 40C; the map contains a car avatar.
Nakashima does not teach reducing transitions. In a related field of endeavor, Kamiya teaches:
while suppressing a number of transitions of the moving image (Kamiya [0105] In this case, the processing load may be reduced by decreasing the frame rate of the image content displayed in the visual field of the occupant).
Therefore, it would have been obvious before the effective filing date of the claimed invention to reduce transitions as taught by Kamiya. The motivation for doing so would have been to reduce the processing load (Kamiya [0105]). Therefore it would have been obvious to combine Kamiya with Nakashima in view of Brock and Zhu to obtain the invention.
Claim(s) 5 is/are rejected under 35 U.S.C. 103 as being unpatentable over Nakashima U.S. Patent/PG Publication 20080091974 in view of Brock U.S. Patent/PG Publication 6473085 and Zhu U.S. Patent/PG Publication 20130147842, and further in view of Gorantla U.S. Patent/PG Publication 20230146926.
Regarding claim 5:
The vehicle according to claim 1 has all of its limitations taught by Nakashima in view of Brock and Zhu. Nakashima further teaches wherein the shared image is a map image, and the display controller is configured to, when the processing load state is a high load state, allow the display unit to display the map image, (Nakashima [0157] In the above first and second embodiments, the 3D map display that exerts a high processing load on the navigation software may be limited while the clock frequency is being lowered and, instead, the two-dimensional map may be mainly displayed.).
Nakashima does not teach deleting distant objects. In a related field of endeavor, Gorantla teaches:
allow the display unit to display the map image, while deleting, from the map image, an image of an object located depthwise far away from a travel path (Gorantla [0076] Since object classification can be computationally expensive compared to identifying ground reflections or moving objects, the first filtering operation can speed up processing time by reducing the amount of data to which object-based filtering is applied. [0077] At 806, the computer system identifies features in the input map that correspond to objects belonging to one or more classes. For example, the features identified in 806 may include points associated with bounding boxes that are labeled as being vehicles. Alternatively or additionally, the computer system identifies features that are beyond a threshold distance from the first vehicle. Identifying features corresponding to objects that belong to one or more classes can be performed as part of object-based filtering, using one or more object classes as filter criteria. Identifying features that are beyond a threshold distance can be performed as part of distance-based filtering, using the threshold distance as a filter criterion. For example, a threshold distance of fifteen meters can be used to define a rectangular region of interest such that features more than fifteen meters away in a lateral or longitudinal direction are identified for removal.).
Therefore, it would have been obvious before the effective filing date of the claimed invention to delete distant objects as taught by Gorantla. The motivation for doing so would have been to reduce processing (Gorantla [0076]-[0077]). Therefore it would have been obvious to combine Gorantla with Nakashima in view of Brock and Zhu to obtain the invention.
Claim(s) 6 is/are rejected under 35 U.S.C. 103 as being unpatentable over Nakashima U.S. Patent/PG Publication 20080091974 in view of Brock U.S. Patent/PG Publication 6473085 and Zhu U.S. Patent/PG Publication 20130147842, and further in view of Yoshida U.S. Patent/PG Publication 20100100846.
Regarding claim 6:
The vehicle according to claim 1 has all of its limitations taught by Nakashima in view of Brock and Zhu. Nakashima further teaches wherein the shared image is a map image, and the display controller is configured to, when the processing load state is a high load state, allow the display unit to display the map image, (Nakashima [0157] In the above first and second embodiments, the 3D map display that exerts a high processing load on the navigation software may be limited while the clock frequency is being lowered and, instead, the two-dimensional map may be mainly displayed.).
Nakashima does not teach enlarging the map. In a related field of endeavor, Yoshida teaches:
when the processing load state is a high load state, allow the display unit to display the map image, while enlarging the map image (Yoshida [0092] (4) When the map is displayed with a larger scale (zoomed) larger than a predetermined scale, area of a region covered by the map image in the window becomes smaller, and the number of on-map icons on the map image becomes relatively smaller and intervals between the on-map icons become larger. In such a case, the operability on the map image may not be reduced and the processing load on the force data setting may not become excessive. In view of the above, the condition for force setting includes a condition that the map image is displayed with a large scale larger than a predetermined scale. If this condition is satisfied, the attractive force may be set to the or-map icons. Due to this manner, it is possible to facilitate an operation of pointing to an icon on the map image.).
Therefore, it would have been obvious before the effective filing date of the claimed invention to enlarge the map as taught by Yoshida. The motivation for doing so would have been to reduce processing (Yoshida [0092]). Therefore it would have been obvious to combine Yoshida with Nakashima in view of Brock and Zhu to obtain the invention.
Claim(s) 7 is/are rejected under 35 U.S.C. 103 as being unpatentable over Nakashima U.S. Patent/PG Publication 20080091974 in view of Brock U.S. Patent/PG Publication 6473085 and Zhu U.S. Patent/PG Publication 20130147842, and further in view of Sawato U.S. Patent/PG Publication 20150256506.
Regarding claim 7:
The vehicle according to claim 2 has all of its limitations taught by Nakashima in view of Brock and Zhu. Nakashima does not teach a conference. In a related field of endeavor, Sawato teaches:
wherein the avatar (Sawato [0082] FIG. 9A is a view showing an example of displaying second icons in a second region and first icons in a first region of a display unit of an electronic device of the own vehicle (K) (hereinafter referred to as, "display unit of own vehicle (K)") of an enhanced first embodiment;) is used in a Web conference (Sawato [0072] FIG. 3E is a view showing a display example of the display unit of the own vehicle (K) in a case of the own vehicle (K) sending a "thank you" message;).
Therefore, it would have been obvious before the effective filing date of the claimed invention to have a conference as taught by Sawato. The motivation for doing so would have been to allow users to enjoy conversation (Sawato [0070]). Therefore it would have been obvious to combine Sawato with Nakashima in view of Brock and Zhu to obtain the invention.
Claim(s) 9 is/are rejected under 35 U.S.C. 103 as being unpatentable over Nakashima U.S. Patent/PG Publication 20080091974.
Regarding claim 9:
A vehicle comprising: (Nakashima [0029] Referring first to FIG. 1, a car navigation device 1 is mounted next to a steering wheel 2 in a vehicle compartment. The car navigation device 1 has a microcomputer 3, a display 40, a speaker 42 and switches 44. The display 40 includes a monitor section 40A, an audio section 40B and a navigation section 40C.)
a display unit; a display controller configured to communicate with a mobile terminal, receive a shared image generated by software running on the mobile terminal, and display the shared image on the display unit (Nakashima [0037] The multi-core CPU 10 is a CPU for a mobile body (vehicle) constructed by a plurality of CPU cores 50, 52, 54, 56, which are assigned to execute different processing. For instance, the CPU cores 50, 52, 54, 56 execute drive assist processing, route guide processing, multi-media control processing, and the like, respectively.)(Nakashima Fig. 2) since there is a mobile body (mobile terminal) connected (shared and coupled by communication) to a car display.
a main controller configured to control overall operation of the vehicle and manage allocation of processing resources (Nakashima [0038] The display 40 is for displaying the results of the route guide processing, multi-media control processing or safety drive assist processing executed by the microcomputer 3, and a liquid crystal display or a CRT is used. The speaker 42 is for informing the driver of the results of the route guide processing, multi-media control processing or safety drive assist processing executed by the microcomputer 3 by voice. The switches 44 are for making various inputs to the microcomputer 3 by the driver and are, concretely, touch sensors arranged on the screen of the display 40.)(Nakashima [0039] The multi-core CPU 10 includes first to fourth CPU cores 50, 52, 54, 56, first to fourth temperature sensors 80, 82, 84, 86, a processing priority storage unit 32, a CPU clock-forming unit 90, a control unit 30, first to fourth cache memories 60, 62, 64, 66, a cache synchronizer unit 68, a memory controller 70, and a temperature detector unit 88.).
a monitoring unit configured to monitor a processing load associated with displaying the shared image (Nakashima [0085] At the time of developing the application, for example, processing procedure is provided to reduce the processing load to execute the processing even when the CPU operation clock frequency has decreased in addition to when the first to fourth CPU cores 50 to 56 are executed with full power, and the processing load is switched over to the processing of a light load when the previous CPU operation clock frequency has decreased so will not to impair the holding of the system.)(Nakashima [0087] According to the above car navigation device 1, when it is detected by the first to fourth temperature sensors 80, 82, 84, 86 that the temperature of any one of the first to fourth CPU cores 50 to 56 has reached a predetermined temperature, the CPU operation clock frequencies of the second to fourth CPU cores 52, 54, 56 are lowered except that of the first CPU core 50 to which the processing of the highest priority is assigned among a plurality of processing assigned to the first to fourth CPU cores 50, 52, 54 and 56.).
and wherein the display controller is further configured to control display of the shared image in coordination with the main controller, and to display the shared image in a first display mode or a second display mode, the second display mode requiring less processing load than the first display mode (Nakashima [0157] In the above first and second embodiments, the 3D map display that exerts a high processing load on the navigation software may be limited while the clock frequency is being lowered and, instead, the two-dimensional map may be mainly displayed.).
and wherein, when the processing load exceeds a processing capacity available for display operations of the display controller and the main controller, (Nakashima Fig. 2)(Nakashima [0085] At the time of developing the application, for example, processing procedure is provided to reduce the processing load to execute the processing even when the CPU operation clock frequency has decreased in addition to when the first to fourth CPU cores 50 to 56 are executed with full power, and the processing load is switched over to the processing of a light load when the previous CPU operation clock frequency has decreased so will not to impair the holding of the system.)(Nakashima [0087] According to the above car navigation device 1, when it is detected by the first to fourth temperature sensors 80, 82, 84, 86 that the temperature of any one of the first to fourth CPU cores 50 to 56 has reached a predetermined temperature, the CPU operation clock frequencies of the second to fourth CPU cores 52, 54, 56 are lowered except that of the first CPU core 50 to which the processing of the highest priority is assigned among a plurality of processing assigned to the first to fourth CPU cores 50, 52, 54 and 56.).
the display controller is configured to switch from the first display mode to the second display mode (Nakashima [0157] In the above first and second embodiments, the 3D map display that exerts a high processing load on the navigation software may be limited while the clock frequency is being lowered and, instead, the two-dimensional map may be mainly displayed.).
Conclusion
For the prior art relied upon, and the prior art considered pertinent to Applicant's disclosure but not relied upon, see the attached PTO-892, "Notice of References Cited".
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any extension fee pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to JASON PRINGLE-PARKER, whose telephone number is (571) 272-5690 and e-mail is jason.pringle-parker@uspto.gov. The examiner can normally be reached Monday-Friday, 8:30am-5:00pm EST. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, King Poon, can be reached on (571) 270-0728. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/JASON A PRINGLE-PARKER/
Primary Examiner, Art Unit 2617