DETAILED ACTION
This action is in response to the application filed 02/28/2024. Claims 1 – 7 are pending and have been examined. Claims 8 – 26 have been withdrawn as non-elected pursuant to the Response to Election/Restriction filed 11/20/2025.
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1 – 2, 4 and 6 – 7 are rejected under 35 U.S.C. 103 as being unpatentable over Huang et al. (U.S. Pub. No. 2023/0328168, hereinafter “Huang”) in view of Park et al. (U.S. Pub. No. 2016/0098138, hereinafter “Park”).
Regarding Claim 1, Huang teaches
A system of live-streaming video and audio to a smartphone (see Huang Paragraph [0100], FIGS. 1G-1J illustrate a portable device of a user transmitting and receiving audio data and/or video data in conjunction with an ongoing video call, in accordance with some embodiments. Further, FIGS. 1K-1L illustrate an example of the portable device operating in a coordinated fashion with a head-worn wearable device (in this example, a pair of smart glasses), in accordance with some embodiments, and Paragraph [0290], Turning next to FIG. 13C, responsive to receiving the user input 1304 (FIG. 13A) or 1305 (FIG. 13B), the smart glasses 150 capture, via the imaging device 169, video data. In some embodiments, the imaging device 169 is configured to capture a field of view of the imaging device of the smart glasses 150 (e.g., imaging device 169). In some embodiments, the smart glasses 150 simultaneously capture audio data using one or more microphones (e.g., a microphone included in the smart glasses 150). The captured video data and/or audio data is transmitted, in real-time, to one or more computing devices (e.g., a server, a tablet, a computer, a smartphone, etc.) using a network 360 (FIG. 3). In some embodiments, another electronic device (e.g., a tablet, a computer, a smartphone, etc.) communicatively coupled to the smart glasses 150, the wrist-wearable device 102, or both transmits the captured video data and/or audio data (e.g., a smartphone can be used as a communication intermediary and can receive video and/or audio data from the head-worn wearable device and/or the wrist-wearable device, can process or otherwise combine that data, and can then send the processed or combined data to a server that then makes the video stream available to the viewers of the video/live stream)), the system comprising:
one or more cameras, microphones, and speakers coupled to a portable article for generating video and audio (see Huang Paragraph [0131] and Figure 1E, the smart glasses 150 include speakers, a microphone, and an imaging device 169 (e.g., a camera or other type of image sensor), the speakers, the microphone, the imaging device 169, and the display 155 are integrated and/or coupled to a part of a frame 157 of the smart glasses 150, and Paragraph [0132], cause the smart glasses 150 to one or more of capture, receive, and present one or both of audio data and video data);
Huang does not expressly teach
a smartphone modified with hardware that controls the one or more cameras, microphones, and speakers; and
a non-transitory memory storing an app and instructions on the smartphone, the app operating on the smartphone and accessing the non-transitory memory to provide inputs to the hardware for controlling the one or more cameras, microphones, and speakers to receive, store, and display the video and to receive, store, and broadcast the audio.
However, Park teaches
a smartphone modified with hardware that controls the one or more cameras, microphones, and speakers (see Park Paragraph [0332], The wearable device-dedicated icon 1009 of FIG. 36 may be an application for controlling the wearable device 200 by using the mobile terminal 100. In particular, the wearable device-dedicated icon 1009 of FIG. 36 may be an application for controlling the music playback of the wearable device 200 or the driving of the camera modules 280 and 290 by using the mobile terminal 100, Paragraph [0114], The wireless communication unit 110 may include at least one of a broadcast reception module 111, a mobile communication module 112, a wireless internet module 113, a short-range communication module 114, and a location information module 115. The input unit 120 may include a camera 121 or an image input unit for image signal input, a microphone 122 or an audio input unit for audio signal input, and a user input unit 123 (for example, a touch key and a mechanical key) for receiving information from a user. Voice data or image data collected by the input unit 120 are analyzed and processed as a user's control command, and Paragraph [0170], The input unit 202 may include a camera 202a or an image input unit for image signal input, a microphone 202b or an audio input unit for audio signal input, and a user input unit 202c for receiving information from a user. The camera 202a includes a front camera module 280 and a rear camera module 290 equipped at the wearable device 200 according to an embodiment of the present invention. Then, the user input unit 202c includes the above-described various buttons. Then, the microphone 202b includes a microphone 248 (see FIG. 4) equipped at the wearable device 200); and
a non-transitory memory storing an app and instructions on the smartphone (see Park Paragraph [0332], the wearable device-dedicated icon 1009 of FIG. 36 may be an application for controlling the music playback of the wearable device 200 or the driving of the camera modules 280 and 290 by using the mobile terminal 100, and Paragraph [0118], memory 170 may store a plurality of application programs (for example, application programs or applications) running on the mobile terminal 100 and also data and commands for operations of the mobile terminal 100), the app operating on the smartphone and accessing the non-transitory memory to provide inputs to the hardware for controlling the one or more cameras, microphones, and speakers to receive, store, and display the video and to receive, store, and broadcast the audio (see Park Paragraph [0334], The control screen for wearable device 1015 of FIG. 36 may include a music control screen 1011 and a camera control screen 1013. The music control screen 1011 may be switched into a radio control screen. That is, the music control screen 1011 and the radio control screen are switched into each other. The music control screen 1011 may be referred to as a sound control screen. The sound control screen may play other sounds other than music, Paragraph [0336], The mobile terminal 100 may perform a control to capture an image by using the front camera module 280 and/or the rear camera module 290 of the wearable device 200 through the control screen for wearable device 1015. That is, the mobile terminal 100 checks which one of buttons 1021, 1023, and 1025 of FIG. 36 displayed on the control screen for wearable device 1015 in order for the driving of the front camera module 280 and/or the rear camera module 290 is selected in operation S317. 
The buttons may include a front camera button 1021, a rear camera button 1023, and an all button 1025 for driving both the front camera and the rear camera, Paragraph [0337], If a selection command on at least one of the buttons 1021, 1023, and 1025 displayed on the wearable control screen 1015 is input, the mobile terminal 100 controls the wearable device 200 so that a camera 1021 and/or 1023 corresponding to the selection command is driven to display a preview screen on an image input from the camera 1021 and/or 1023 in operation S318. For example, when a selection command on the all button 1025 is input, both the front camera module 280 and the rear camera module 290 are driven so that a front preview screen 1024 of FIG. 37 for a front image input to the front camera module 280 and a rear preview screen 1026 of FIG. 37 for a rear image input to the rear camera module 290 may be displayed, Paragraph [0338], When a capture command is input from a user, the mobile terminal 100 performs a control on the wearable device 200 to capture an image input to a corresponding camera in operation S319. Moreover, when pairing is completed, a control bar may be displayed on a screen. The control bar may be a touch state display area. The control bar is displayed on a display window that forms another layer through a touch drag operation from an upper or lower bezel. A control window for controlling the wearable device 200 may be displayed on the display window. 
For example, a window for controlling each of audio and camera of the wearable device 200 may be displayed on the control window, Paragraph [0320], the camera modules 280 and 290 mounted at the wearable device 200 may include a front camera module 280 for receiving an image for a front object, for example, picture image, or video, Paragraph [0114], The wireless communication unit 110 may include at least one of a broadcast reception module 111, a mobile communication module 112, a wireless internet module 113, a short-range communication module 114, and a location information module 115. The input unit 120 may include a camera 121 or an image input unit for image signal input, a microphone 122 or an audio input unit for audio signal input, and a user input unit 123 (for example, a touch key and a mechanical key) for receiving information from a user. Voice data or image data collected by the input unit 120 are analyzed and processed as a user's control command, and Paragraph [0170], The input unit 202 may include a camera 202a or an image input unit for image signal input, a microphone 202b or an audio input unit for audio signal input, and a user input unit 202c for receiving information from a user. The camera 202a includes a front camera module 280 and a rear camera module 290 equipped at the wearable device 200 according to an embodiment of the present invention. Then, the user input unit 202c includes the above-described various buttons. Then, the microphone 202b includes a microphone 248 (see FIG. 4) equipped at the wearable device 200).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of a system for live-streaming video and audio to a smartphone, in which one or more cameras, microphones, and speakers are coupled to a portable article for generating video and audio (as taught in Huang), with a smartphone app that controls the camera(s), microphone(s), and speaker(s) of a portable article (as taught in Park), the motivation being to provide wireless communication for controlling a portable article/device, thereby maximizing user convenience (see Park Paragraph [0346]).
Regarding Claim 2, Huang in view of Park teach
The system according to claim 1, wherein the portable article comprises at least one of a pair of eyeglasses, a bodycam, and a headset (see Huang Figures 1G and 1H, smart glasses/head-worn wearable device).
Regarding Claim 4, Huang in view of Park teach
The system according to claim 1, further comprising a wire harness for coupling the smartphone to the portable article and the one or more cameras, microphones, and speakers thereof (see Park Paragraph [0117], The interface unit 160 may serve as a path to various kinds of external devices connected to the mobile terminal 100. The interface unit 160 may include at least one of a wired/wireless headset port, an external charger port, a wired/wireless data port, a memory card port, a port connecting a device equipped with an identification module such as the USIM card 130, an audio Input/Output (I/O) port, a video I/O port, and an earphone port. In correspondence to that an external device is connected to the interface unit 160, the mobile terminal 100 may perform an appropriate control relating to the connected external device, and Paragraph [0157], The interface unit 160 may serve as a path to all external devices connected to the mobile terminal 100. The interface unit 160 may receive data from an external device, receive power and deliver it to each component in the mobile terminal 100, or transmit data in the mobile terminal 100 to an external device. For example, the interface unit 160 may include a wired/wireless headset port, an external charger port, a wired/wireless data port, a memory card port, a port connecting a device equipped with an identification module, an audio I/O port, a video I/O port, and an earphone port).
Regarding Claim 6, Huang in view of Park teach
The system according to claim 1, wherein the smartphone receives and displays the video on a display of the smartphone (see Huang Paragraph [0100], FIGS. 1G-1J illustrate a portable device of a user transmitting and receiving audio data and/or video data in conjunction with an ongoing video call, in accordance with some embodiments. Further, FIGS. 1K-1L illustrate an example of the portable device operating in a coordinated fashion with a head-worn wearable device (in this example, a pair of smart glasses), in accordance with some embodiments, and Paragraph [0290], Turning next to FIG. 13C, responsive to receiving the user input 1304 (FIG. 13A) or 1305 (FIG. 13B), the smart glasses 150 capture, via the imaging device 169, video data. In some embodiments, the imaging device 169 is configured to capture a field of view of the imaging device of the smart glasses 150 (e.g., imaging device 169). In some embodiments, the smart glasses 150 simultaneously capture audio data using one or more microphones (e.g., a microphone included in the smart glasses 150). The captured video data and/or audio data is transmitted, in real-time, to one or more computing devices (e.g., a server, a tablet, a computer, a smartphone, etc.) using a network 360 (FIG. 3). In some embodiments, another electronic device (e.g., a tablet, a computer, a smartphone, etc.) communicatively coupled to the smart glasses 150, the wrist-wearable device 102, or both transmits the captured video data and/or audio data (e.g., a smartphone can be used as a communication intermediary and can receive video and/or audio data from the head-worn wearable device and/or the wrist-wearable device, can process or otherwise combine that data, and can then send the processed or combined data to a server that then makes the video stream available to the viewers of the video/live stream), and Figures 1G, 1H, and 1I displaying the video call on a smartphone).
Regarding Claim 7, Huang in view of Park teach
The system according to claim 1, wherein the smartphone broadcasts the audio over a speaker of the smartphone (see Huang Paragraph [0290], the smart glasses 150 capture, via the imaging device 169, video data. In some embodiments, the imaging device 169 is configured to capture a field of view of the imaging device of the smart glasses 150 (e.g., imaging device 169). In some embodiments, the smart glasses 150 simultaneously capture audio data using one or more microphones (e.g., a microphone included in the smart glasses 150). The captured video data and/or audio data is transmitted, in real-time, to one or more computing devices (e.g., a server, a tablet, a computer, a smartphone, etc.) using a network 360 (FIG. 3). In some embodiments, another electronic device (e.g., a tablet, a computer, a smartphone, etc.) communicatively coupled to the smart glasses 150, the wrist-wearable device 102, or both transmits the captured video data and/or audio data (e.g., a smartphone can be used as a communication intermediary and can receive video and/or audio data from the head-worn wearable device and/or the wrist-wearable device, can process or otherwise combine that data, and can then send the processed or combined data to a server that then makes the video stream available to the viewers of the video/live stream)).
Claim 3 is rejected under 35 U.S.C. 103 as being unpatentable over Huang et al. (U.S. Pub. No. 2023/0328168, hereinafter “Huang”) in view of Park et al. (U.S. Pub. No. 2016/0098138, hereinafter “Park”) and Jani (U.S. Pub. No. 2024/0099446).
Regarding Claim 3, Huang in view of Park teach all the limitations of Claim 1, but do not expressly teach
The system according to claim 1, wherein the portable article is mountable to a tripod.
However, Jani teaches
The system according to claim 1, wherein the portable article is mountable to a tripod (see Jani Paragraph [0052], It will be understood that, in some embodiments, the attachment mechanism 300 may be any mechanism (e.g., suction mounts, adhesive mounts, straps, harnesses, tripods, stands, etc.) to which a user may desire to use for attaching the rotatable mounting apparatus 200 and wearable device 100 to a mounting structure).
It would have been obvious to one of ordinary skill in the art before the effective filing date of
the claimed invention to combine the teachings of a system for live-streaming video and audio to a smartphone, in which one or more cameras, microphones, and speakers are coupled to a portable article for generating video and audio, and a smartphone app that controls the camera(s), microphone(s), and speaker(s) of the portable article (as taught in Huang in view of Park), with the capability of mounting a portable article to a tripod (as taught in Jani), the motivation being to accommodate different users who may find wearable devices difficult to adjust and use efficiently, by providing a form of mounting for the wearable device that suits different preferences of mounting, location, and usability needs (see Jani Paragraphs [0002], [0003], and [0052]).
Claim 5 is rejected under 35 U.S.C. 103 as being unpatentable over Huang et al. (U.S. Pub. No. 2023/0328168, hereinafter “Huang”) in view of Park et al. (U.S. Pub. No. 2016/0098138, hereinafter “Park”) and Bindouski et al. (U.S. Pub. No. 2024/0195903, hereinafter “Bindouski”).
Regarding Claim 5, Huang in view of Park teach all the limitations of Claim 1, but do not expressly teach
The system according to claim 1, wherein the app operates on the smartphone to remotely control a mouse and keyboard of a personal computer.
However, Bindouski teaches
The system according to claim 1, wherein the app operates on the smartphone to remotely control a mouse and keyboard of a personal computer (see Bindouski Paragraph [0272] FIG. 12 illustrates an example implementation for an aspect of a method for remotely controlling a smartphone 1222 without tactile communication with a user of the smartphone 1222 through the user interface of a remote control application 1292 installed on a computer 1250 of any type 1278, when the smartphone 1222 and the given computer 1250 are operated simultaneously by the same user and are located within the physical reach of the same user. Instead of a 1222 smartphone, you can use any type of computer. For example, 1201, desktop computer, laptop, tablet, etc., and Paragraph [0279] The user of the computer 1250, through the user interface of the remote control application 1292 in the form of a projection of the screen 1290 of the smartphone 1222, expects to launch the mobile application 1282 of the smartphone 1222 using the computer mouse 1298 or keyboard 1296 of the remote control computer 1250, placing the mouse pointer 1286 on the corresponding element of the graphical interface-icon applications 1282).
It would have been obvious to one of ordinary skill in the art before the effective filing date of
the claimed invention to combine the teachings of a system for live-streaming video and audio to a smartphone, in which one or more cameras, microphones, and speakers are coupled to a portable article for generating video and audio, and a smartphone app that controls the camera(s), microphone(s), and speaker(s) of the portable article (as taught in Huang in view of Park), with an application that allows a user to remotely control the mouse and keyboard of a computer using another computing device (as taught in Bindouski), the motivation being to provide safe management of peripheral devices without direct tactile communication (see Bindouski Abstract and Paragraph [0002]).
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Refer to PTO-892, Notice of References Cited for a listing of analogous art.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to CARISSA A JONES whose telephone number is (703) 756-1677. The examiner can normally be reached via telework, M-F, 6:30 AM - 4:00 PM CT.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Duc Nguyen, can be reached at 571-272-7503. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/CARISSA A JONES/Examiner, Art Unit 2691
/DUC NGUYEN/Supervisory Patent Examiner, Art Unit 2691