DETAILED ACTION
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
This Office Action is in response to the Request for Continued Examination and Applicant's Amendment and Arguments filed on 26 November 2025.
Claims 1-2, 6-12, 14-18 and 20 are pending for examination. Claims 3-5, 13 and 19 have been cancelled.
Continued Examination Under 37 CFR 1.114
A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 26 November 2025 has been entered.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-2, 6-7, 11-12, 14-15 and 18 are rejected under 35 U.S.C. 103 as being unpatentable over Ashwood et al. (US Pub. 2019/0064891 A1) in view of Tzvieli et al. (US Pub. 2021/0318558 A1) and further in view of Tzvieli et al. (US Pub. 2022/0155860 A1, hereinafter Tzvieli ‘860’), Goldberg et al. (US Pub. 2017/0092007 A1) and Li et al. (US Pub. 2019/0000332 A1).
Ashwood, Tzvieli, Goldberg and Li were cited in the previous Office Action.
As per claim 1, Ashwood teaches the invention substantially as claimed, including eyewear, comprising (Ashwood, Fig. 1, 31 eyewear):
a frame having a first side and a second side (Ashwood, Fig. 1, 32 frame including left and right sides; [0022] lines 2-5, The glasses 31 can include a frame 32 made from any suitable material such as plastic or metal, including any suitable shape memory alloy; [0023] lines 2-7, Frame 32 additionally includes a left arm or temple piece 46 and a second arm or temple piece 47 coupled to the respective left and right end portions 41, 42 of the front piece 33 by any suitable means such as a hinge (not shown), so as to be coupled to the front piece 33, or rigidly or fixable secured to the front piece so as to be integral with the front piece 33);
a plurality of electronic components (Ashwood, Fig. 1, 69; [0026] lines 1-6, Glasses 31 include cameras 69. Although two cameras are depicted, other embodiments contemplate the use of a single or additional (i.e., more than two) cameras. In various embodiments, glasses 31 may include any number of input sensors or peripheral devices in addition to cameras 69);
a first system on a chip (SoC) adjacent the first side of the frame, the first SoC coupled to a first set of the plurality of electronic components (Ashwood, Fig. 1, 61 computer (as first system on a chip) with left frame, 69 camera and sensors in left frame; Fig. 2, 219 peripheral device elements (e.g., sensors) (as coupled to); [0024] lines 1-19, Glasses 31 can include a computing device, such as computer 61, which can be of any suitable type so as to be carried by the frame 32 and…In one embodiment, the computer 61 can be disposed in both of the temple pieces 46, 47. The computer 61 can include one or more processors with memory, wireless communication circuitry, and a power source. As described above, the computer 61 comprises low-power circuitry, high-speed circuitry, and a display processor. Various other embodiments may include these elements in different configurations or integrated together in different ways. Additional details of aspects of computer 61 may be implemented as illustrated by camera device 210 discussed below; also see [0091] lines 1-3, Certain embodiments are described herein as including logic or a number of components, modules, elements, or mechanism; [0092] lines 7-8, Field-Programmable Gate Array (FPGA) or an Application Specific Integrated Circuit (ASIC) (as SoC)); and
a second SoC adjacent the second side (Ashwood, Fig. 1, 61 computer, 69 camera and sensors in right frame; Fig. 2, 219 peripheral device elements (e.g., sensors) (as coupled to); [0024] lines 1-19, Glasses 31 can include a computing device, such as computer 61, which can be of any suitable type so as to be carried by the frame 32 and…In one embodiment, the computer 61 can be disposed in both of the temple pieces 46, 47 (as second SoC in right side frame). The computer 61 can include one or more processors with memory, wireless communication circuitry, and a power source. As described above, the computer 61 comprises low-power circuitry, high-speed circuitry, and a display processor. Various other embodiments may include these elements in different configurations or integrated together in different ways; [0092] lines 7-8, Field-Programmable Gate Array (FPGA) or an Application Specific Integrated Circuit (ASIC) (as SoC)); a second set of the plurality of electronic components (Ashwood, Fig. 1, 69 camera in right side frame, 44 display; [0026] lines 1-6, Glasses 31 include cameras 69. Although two cameras are depicted, other embodiments contemplate the use of a single or additional (i.e., more than two) cameras. In various embodiments, glasses 31 may include any number of input sensors or peripheral devices in addition to cameras 69);
wherein the first and second set of the plurality of electronic components include the same type of electronic components (Ashwood, Fig. 1, 69 camera in both left and right side frames, 43, 44 display; [0026] lines 1-6, Glasses 31 include cameras 69. Although two cameras are depicted, other embodiments contemplate the use of a single or additional (i.e., more than two) cameras. In various embodiments, glasses 31 may include any number of input sensors or peripheral devices in addition to cameras 69; [0031] lines 12-14, Each of the optical elements 43, 44 can be a lens, a display, a display assembly or a combination of the foregoing (as same types (i.e., camera, sensors and displays on both sides of frames)));
wherein the first set of the plurality of electronic components comprises a first camera and a first display each directly coupled to the frame first side (Ashwood, Fig. 1, 69 camera in left side frame, 43 display; [0026] lines 1-6, Glasses 31 include cameras 69. Although two cameras are depicted, other embodiments contemplate the use of a single or additional (i.e., more than two) cameras. In various embodiments, glasses 31 may include any number of input sensors or peripheral devices in addition to cameras 69; [0031] lines 12-14, Each of the optical elements 43, 44 can be a lens, a display, a display assembly or a combination of the foregoing); and
the second set of the plurality of electronic components comprises a second camera, and a second display each directly coupled to the frame second side (Ashwood, Fig. 1, 69 camera in right side frame, 44 display; [0026] lines 1-6, Glasses 31 include cameras 69. Although two cameras are depicted, other embodiments contemplate the use of a single or additional (i.e., more than two) cameras. In various embodiments, glasses 31 may include any number of input sensors or peripheral devices in addition to cameras 69; [0031] lines 12-14, Each of the optical elements 43, 44 can be a lens, a display, a display assembly or a combination of the foregoing).
Ashwood fails to explicitly teach the second SoC coupled to the first SoC and to a second set of the plurality of electronic components, wherein the first SoC and the second SoC are synchronized, wherein each of the SoCs are configured to operate the set of the plurality of electronic components of the other.
However, Tzvieli teaches the second SoC coupled to the first SoC and to a second set of the plurality of electronic components, wherein each of the SoCs are configured to operate the set of the plurality of electronic components of the other (Tzvieli, Fig. 1A, 35 in both left side and right side (as SoCs); Fig. 3A, PSOG (as SoC); Fig. 3B, 232a, 232b, 232c, 231a, 231b, 233a (left side); 232d, 232e, 232f, 231c, 231d, 233b (right side) (as second set of plurality of electronic components); [0064] lines 11-12, first electronic components (35, 36), (which is also in other side; see Fig. 1A); [0067] lines 1-2, an electronic component may be a form of a computer; [0115] lines 4-7, the computer (400, 410) may be implemented in various ways, such as, but not limited to, a microcontroller, a computer on a chip, a system-on-chip (SoC); [0131] lines 1-4, The terms “photosensor-oculography” and “photosensor-oculography device” (PSOG) as used herein refer to measuring eye position and/or eye movements (or equivalents thereof), of either one eye or both eyes; [0150] lines 1-4, PSOG, as the term is used herein, may involve utilization of one or more light sources and/or one or more detectors, such as discrete photosensors, that detect reflections of the light emitted from the one or more light sources; [0154] FIG. 3B illustrates an embodiment of an eye tracking system that tracks both eyes, which utilizes multiple light sources and detectors to track each eye. The illustrated system includes smartglasses 230 that have PSOG and VOG to track both eyes. Tracking of the left eye is done utilizing a PSOG that includes multiple light sources (emitters 231a and 231b in the figure) as well as multiple detectors (photosensors 232a, 232b, and 232c). Additionally, video camera 233a may be utilized to capture images of the left eye, which can be used to determine positions and/or movements of the left eye.
In a similar fashion, tracking the right eye is done in this embodiment utilizing another PSOG that includes additional light sources (emitters 231c and 231d in the figure) as well as additional multiple detectors (photosensors 232d, 232e, and 232f) and an additional video camera 233b that may be utilized to capture images of the right eye [Examiner noted: the PSOG (as SoCs) are configured to operate the set of plurality of electronic components of the other (i.e., utilizing emitters and detectors), see [0131] (PSOG) as used herein refer to measuring eye position and/or eye movements (or equivalents thereof), of either one eye or both eyes (as they can operate the set of the plurality of electronic components of the other)]).
It would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to have combined the teaching of Ashwood with Tzvieli because Tzvieli’s teaching of utilizing different computers to jointly perform different functions within the wearable device would have provided Ashwood’s system with the advantage and capability of utilizing both computers to perform the functions, thereby improving the processing speed and system performance.
Both Ashwood and Tzvieli fail to specifically teach wherein the first SoC and the second SoC are synchronized.
However, Tzvieli ‘860’ teaches wherein the first SoC and the second SoC are synchronized (Tzvieli ‘860’, [0015] the eye tracking system may include another PSOG that emits light and takes additional measurements of reflections of the light from the other eye of the user, and the computer calculates a gaze direction based on the measurements of the reflections measured by the PSOG and the additional measurements taken by the other PSOG; [0146] lines 1-18, Due to the nature of how the signals are acquired, the measurements 263 of the reflections and the events 265 will often be received in different manners. Namely, the events 265 are typically detected asynchronously, while the measurements 263 may be obtained in a synchronous manner (e.g., during certain periods, the PSOG 262 may be operated at a fixed frequency at which it emits lights and measures reflections from the eye). Additionally, due to the nature of the operation of event cameras, which enable a quick reading of single pixels, in some embodiments, the rate at which the events 265 are received can be higher than the rate at which the measurements 263 are acquired. Utilizing both the measurements 263 and the events 265 to calculate the eye positions 269 can leverage the different characteristics of these signals to improve performance of an eye tracker system that utilizes the PSOG 262 and the event camera 264, in terms of accuracy, frequency, and/or reduced power usage (as first SoC (PSOG) and the second SoC/PSOG are synchronized (i.e., measurements obtained in a synchronous manner) for accuracy purposes)).
It would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to have combined the teachings of Ashwood and Tzvieli with Tzvieli ‘860’ because Tzvieli ‘860’s teaching that the PSOGs at the left and right sides of the smart glasses are utilized to obtain the measurements in a synchronous manner would have provided Ashwood and Tzvieli’s system with the advantage and capability of improving the accuracy of measuring the eye positions, thereby reducing the power usage and improving the system performance (see Tzvieli ‘860’, [0146] “improve performance of an eye tracker system that utilizes the PSOG 262 and the event camera 264, in terms of accuracy, frequency, and/or reduced power usage”).
Ashwood, Tzvieli and Tzvieli ‘860’ fail to explicitly teach wherein each of the SoCs are configured to execute three-dimensional (3D) graphics, wherein the first set of the plurality of electronic components comprises a first color camera, a first computer vision (CV) camera operable by a first color-based CV algorithm, and a first display each directly coupled to the frame first side, the second set of the plurality of electronic components comprises a second color camera, a second CV camera operable by a second color-based CV algorithm, and a second display each directly coupled to the frame second side, wherein the first and second color-based CV algorithms have direct access to color images generated by the color cameras, and wherein each of the first SoC and the second SoC are configured to run each of the first and second color-based CV algorithms and to provide power generation balance in each of the SoCs when running the three-dimensional (3D) graphics.
However, Goldberg teaches wherein each of the SoCs are configured to execute three-dimensional (3D) graphics (Goldberg, Fig. 2, 203, 204 (as each of SoCs); [0016] lines 5-7, 3D images in visual headgear provide for full focus of the entire field of view; [0120] lines 1-7, data processing algorithms are applied to correlate multiple elements of data including CDEM, eye tracking, image capture and enhanced super-pixel projection to the eyes. In one embodiment, algorithms and software utilize the baseline CDEM and provide instructions to allow for the eye to perceive images with normal or better visual acuity; [0121] lines 4-11, Vision correction and enhancement software is used to deliver enhanced visual data (termed “super-pixels”) to the eye that overcomes tested abnormalities and provides vision correction and enhancement. The software uses visual data collection and processing techniques which are correlated with information in the user's field of view to provide the user with optimal desired visual acuity; also see [0148] lines 18-20, compute an accurate three dimensional ocular model which for each display field);
wherein the first set of the plurality of electronic components comprises a first color camera, a first computer vision (CV) camera operable by a first color-based CV algorithm, and a first display each directly coupled to the frame first side (Goldberg, Fig. 2, 203, 202, 201, 205 (left side); [0127] lines 2-5, a display semiconductor or similar device 203(C) with sufficient resolution to project required super-pixels onto a planar display of the smart eyeglasses or the eye itself (as first SoC); [0125] lines 1-17, in one embodiment smart eyeglasses 200 comprise one or more digital cameras 201(A) or sensors to capture a field of view. In some embodiments, the system may employ one or more outward facing digital cameras or sensors. The field of view of these cameras may typically be 180 degrees, but may also be up to 360 degrees depending on application. It may be noted that digital cameras may be based on any suitable kind of imaging sensors, such as semiconductor charge-coupled devices (CCD) or active pixel sensors in complementary metal-oxide-semiconductor (CMOS) or N-type metal-oxide-semiconductor (NMOS, Live MOS) technologies. In one embodiment, digital cameras with night vision capabilities (as first computer vision (CV) camera) are employed; [0120] lines 1-5, data processing algorithms are applied to correlate multiple elements of data including CDEM, eye tracking, image capture and enhanced super-pixel projection to the eyes. In one embodiment, algorithms and software utilize the baseline CDEM and provide instructions to allow for the eye to perceive images with normal or better visual acuity. [0126] lines 1-14, The smart eyeglasses 200 further comprise cameras/sensors 202(B) for tracking eye movements. 
In some embodiments, the system may employ one or more inward facing digital cameras or sensors, which are used to track the movement of the eyes and to determine the foveal and peripheral fields of focus…The inward facing digital cameras or sensors may be based on any suitable kind of imaging sensors, such as semiconductor charge-coupled devices (CCD) or active pixel sensors in complementary metal-oxide-semiconductor (CMOS) or N-type metal-oxide-semiconductor (NMOS, Live MOS) technologies. (as include first color camera); [0137] lines 2-5, uses smart eyeglasses to deliver enhanced vision to a user. In one embodiment, these functions are carried out under the control of a microprocessor embedded in the smart eyeglasses, which executes instructions in accordance with appropriate software algorithms; also see [0138] lines 1-4, capture images of the user's field of view. Processing software and hardware then combine the images into a video (as include first color-based CV algorithm for providing enhanced vision to user); see [0004] sharp central vision and detect colors and fine details) and
the second set of the plurality of electronic components comprises a second color camera, a second CV camera operable by a second color-based CV algorithm, and a second display each directly coupled to the frame second side, wherein the first and second color-based CV algorithms have direct access to color images generated by the color cameras (Goldberg, Fig. 2, 203, 202, 201, 205 (right side); [0127] lines 2-5, a display semiconductor or similar device 203(C) with sufficient resolution to project required super-pixels onto a planar display of the smart eyeglasses or the eye itself; [0125] lines 1-17, in one embodiment smart eyeglasses 200 comprise one or more digital cameras 201(A) or sensors to capture a field of view. In some embodiments, the system may employ one or more outward facing digital cameras or sensors. The field of view of these cameras may typically be 180 degrees, but may also be up to 360 degrees depending on application. It may be noted that digital cameras may be based on any suitable kind of imaging sensors, such as semiconductor charge-coupled devices (CCD) or active pixel sensors in complementary metal-oxide-semiconductor (CMOS) or N-type metal-oxide-semiconductor (NMOS, Live MOS) technologies. In one embodiment, digital cameras with night vision capabilities (as second computer vision (CV) camera) are employed; [0120] lines 1-5, data processing algorithms are applied to correlate multiple elements of data including CDEM, eye tracking, image capture and enhanced super-pixel projection to the eyes. In one embodiment, algorithms and software utilize the baseline CDEM and provide instructions to allow for the eye to perceive images with normal or better visual acuity. [0126] lines 1-14, The smart eyeglasses 200 further comprise cameras/sensors 202(B) for tracking eye movements. 
In some embodiments, the system may employ one or more inward facing digital cameras or sensors, which are used to track the movement of the eyes and to determine the foveal and peripheral fields of focus…The inward facing digital cameras or sensors may be based on any suitable kind of imaging sensors, such as semiconductor charge-coupled devices (CCD) or active pixel sensors in complementary metal-oxide-semiconductor (CMOS) or N-type metal-oxide-semiconductor (NMOS, Live MOS) technologies. (as include second color camera); [0137] lines 2-5, uses smart eyeglasses to deliver enhanced vision to a user. In one embodiment, these functions are carried out under the control of a microprocessor embedded in the smart eyeglasses, which executes instructions in accordance with appropriate software algorithms; also see [0138] lines 1-4, capture images of the user's field of view. Processing software and hardware then combine the images into a video (as include second color-based CV algorithm for providing enhanced vision to user); see [0004] sharp central vision and detect colors and fine details) (as the first and second color-based CV algorithms have direct access to color images generated by the color cameras in order to provide an enhanced vision to user (i.e., detect colors and fine details)); and
wherein each of the first SoC and the second SoC are configured to run each of the first and second color-based CV algorithms when running the three-dimensional (3D) graphics (Goldberg, Fig. 2, 203 and 204 (as include first and second SoCs); Fig. 2a, 211, (as left side, as applied to right side as well); Fig. 4, 409; [0016] lines 5-7, 3D images in visual headgear provide for full focus of the entire field of view; [0133] lines 16-30, FIG. 2a illustrates one embodiment of a frame 210 of the smart eyeglasses. Referring to FIG. 2a, in this embodiment, an array of microprocessors 211 is placed along one of the sides 212 of the frame. One of ordinary skill in the art would appreciate that the array of microprocessors may be placed at any other suitable location in the frame 210 of the smart eyeglasses. The microprocessors in the array 211 are used in one embodiment, for automatically performing eye testing and mapping. In one embodiment, the microprocessor to process information received from the digital sensors and to deliver the enhanced visual picture elements (super-pixels) to a planar display on the eyeglass lens or the eye (as shown as 204(D) of FIG. 2), is also placed in the same array 211, along with other microprocessors; [0128] lines 1-4, smart eyeglasses 200 comprise at least one microprocessor 204(D) to process the information received from the digital sensors and to deliver the enhanced visual picture elements (super-pixels) to a planar display; [0137] lines 2-5, uses smart eyeglasses to deliver enhanced vision to a user. In one embodiment, these functions are carried out under the control of a microprocessor embedded in the smart eyeglasses, which executes instructions in accordance with appropriate software algorithms; also see [0138] lines 1-4, capture images of the user's field of view. 
Processing software and hardware then combine the images into a video (as include first color-based CV algorithm for providing enhanced vision to user); see [0004] sharp central vision and detect colors and fine details).
It would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to have combined the teachings of Ashwood, Tzvieli and Tzvieli ‘860’ with Goldberg because Goldberg’s teaching of smart eyeglasses that create an enhanced image for the user based on the images captured from different cameras would have provided Ashwood, Tzvieli and Tzvieli ‘860’s system with the advantage and capability of providing an enhancement of visual elements that are perceived by the eye and processed by the brain, thereby avoiding the problems associated with vergence-accommodation conflicts that can cause nausea and unease to many users and improving the user experience (see Goldberg [0019] “enhancement of visual elements that are perceived by the eye and processed by the brain. Devices based on such methods would avoid the problems associated with vergence-accommodation conflicts that can cause nausea and unease to many users. There is also a need for methods and systems that have low bandwidth requirements, besides providing a natural enhanced super-vision experience”).
Ashwood, Tzvieli, Tzvieli ‘860’ and Goldberg fail to specifically teach wherein each of the first SoC and the second SoC are configured to provide power generation balance in each of the SoCs when running.
However, Li teaches wherein each of the first SoC and the second SoC are configured to provide power generation balance in each of the SoCs when running (Li, Fig. 2; [0021] lines 1-5, Referring to FIG. 2, a user-wearable device 200 includes an energy harvesting module 210 (also referred to herein as an energy harvester) that continuously generates power by collecting energy from an ambient energy source; [0042] lines 1-14, the processor 230 of the user-wearable device 200 is configured to implement power optimization methods. In one embodiment, the processor 230 incorporates an adaptive power control module, which can be implemented in software or firmware or both, to implement a power management scheme to realize a power balance operation between the power generated by the energy harvesting module and the power consumed by the sensor module and the processor module. More specifically, in one embodiment, the adaptive power control module is configured to adaptively adjust the sensing duty cycle and the signal processing run schedule based on user needs while balancing the energy generation versus consumption; [0059] lines 1-3, the user-wearable device is implemented as an arrhythmia detection and/or screening device that provides power generation-consumption balancing).
It would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to have combined the teachings of Ashwood, Tzvieli, Tzvieli ‘860’ and Goldberg with Li because Li’s teaching of providing power generation-consumption balancing in a running user-wearable device would have provided Ashwood, Tzvieli, Tzvieli ‘860’ and Goldberg’s system with the advantage and capability of improving power/energy utilization efficiency, thereby improving the system performance and efficiency.
As per claim 2, Ashwood, Tzvieli, Tzvieli ‘860’, Goldberg and Li teach the invention according to claim 1 above. Goldberg further teaches wherein the first set of the plurality of electronic components mirror the second set of the plurality of electronic components (Goldberg, Fig. 2, 201, 202, 203 and 205, which are also on the right side of the frame).
As per claim 6, Ashwood, Tzvieli, Tzvieli ‘860’, Goldberg and Li teach the invention according to claim 1 above. Ashwood further teaches wherein: the first and second SoCs are each configured to render three-dimensional (3D) graphics and perform rendering functions (Ashwood, [0024] lines 1-19, Glasses 31 can include a computing device, such as computer 61, which can be of any suitable type so as to be carried by the frame 32 and…In one embodiment, the computer 61 can be disposed in both of the temple pieces 46, 47. The computer 61 can include one or more processors with memory, wireless communication circuitry, and a power source. As described above, the computer 61 comprises low-power circuitry, high-speed circuitry, and a display processor. Various other embodiments may include these elements in different configurations or integrated together in different ways; [0098] lines 3-16, FIG. 9 is merely a non-limiting example of a software architecture, and it will be appreciated that many other architectures can be implemented to facilitate the functionality described herein. In various embodiments, the software 902 is implemented by hardware such as computer 61, camera device 210, computer 376, and/or computer 401 of FIGS. 1, 2, 3, and 4 respectively; [0100] lines 6-11, the libraries 906 can include API libraries 932 such as media libraries (e.g., libraries to support presentation and manipulation of various media formats such as Moving Picture Experts Group-4 (MPEG4), Advanced Video Coding (H.264 or AVC), Moving Picture Experts Group Layer-3 (MP3), Advanced Audio Coding (AAC), Adaptive Multi-Rate (AMR) audio codec, Joint Photographic Experts Group (JPEG or JPG), or Portable Network Graphics (PNG)), graphics libraries (e.g., an OpenGL framework used to render in two dimensions (2D) and three dimensions (3D) in a graphic content on a display)).
As per claim 7, Ashwood, Tzvieli, Tzvieli ‘860’, Goldberg and Li teach the invention according to claim 1 above. Ashwood further teaches wherein the first and second SoCs each include an operating system (OS) (Ashwood, [0024] lines 1-19, Glasses 31 can include a computing device, such as computer 61, which can be of any suitable type so as to be carried by the frame 32 and…In one embodiment, the computer 61 can be disposed in both of the temple pieces 46, 47. The computer 61 can include one or more processors with memory, wireless communication circuitry, and a power source. As described above, the computer 61 comprises low-power circuitry, high-speed circuitry, and a display processor. Various other embodiments may include these elements in different configurations or integrated together in different ways; [0035] lines 9-12, the high-speed processor 232 executes an operating system such as a LINUX operating system or other such operating system such as operating system 904 of FIG. 9).
As per claim 11, it is a method claim of claim 1 above. Therefore, it is rejected for the same reason as claim 1 above. In addition, Ashwood further teaches performing a first set of operations with a first system on a chip (SoC) and performing a second set of operations with a second SoC (Ashwood, [0024] lines 1-19, Glasses 31 can include a computing device, such as computer 61, which can be of any suitable type so as to be carried by the frame 32 and…In one embodiment, the computer 61 can be disposed in both of the temple pieces 46, 47; [0013] lines 2-7, managing temperature in a wearable device. Temperature of wearable devices may be managed by the disclosed methods and systems by limiting transmission bandwidth in some aspects based on a current temperature of an electronic component and in some aspects also based on a rate of temperature change of the electronic component; [0043] lines 1-5, The computer 376 may generate heat as part of glasses operations for capturing images, transmitting data, or performing other computing processes. The heat may impact the device as well as a user wearing the device; [0054] lines 1-20, a temperature of a component is determined. The component may be the low power circuitry 220 or the high speed circuitry 230 discussed above with respect to FIG. 2 in some aspects…Process 500 may then rely on the separate temperature sensor in these aspects. Block 501 may also include determination of a rate of temperature change…This determined rate may be used by one or more of the decision blocks discussed below and shown in FIG. 5 to determine whether to increase or decrease a transmission rate limit of the component [Examiner noted: each component/computer has an associated temperature sensor for detecting the temperature when each computer is performing its set of operations, and the transmission rate limit is increased or decreased based on the temperature detection]).
As per claims 12, 14 and 15, they are method claims of claims 2, 6 and 7 respectively above. Therefore, they are rejected for the same reasons as claims 2, 6 and 7 respectively above.
As per claim 18, it is a non-transitory computer readable medium claim of claim 1 above. Therefore, it is rejected for the same reason as claim 1 above. In addition, Ashwood further teaches performing a first set of operations with a first system on a chip (SoC) and performing a second set of operations with a second SoC and operate the first set of the plurality of electronic components (Ashwood, [0024] lines 1-19, Glasses 31 can include a computing device, such as computer 61, which can be of any suitable type so as to be carried by the frame 32 and…In one embodiment, the computer 61 can be disposed in both of the temple pieces 46, 47; [0013] lines 2-7, managing temperature in a wearable device. Temperature of wearable devices may be managed by the disclosed methods and systems by limiting transmission bandwidth in some aspects based on a current temperature of an electronic component and in some aspects also based on a rate of temperature change of the electronic component; [0043] lines 1-5, The computer 376 may generate heat as part of glasses operations for capturing images, transmitting data, or performing other computing processes. The heat may impact the device as well as a user wearing the device; [0054] lines 1-20, a temperature of a component is determined. The component may be the low power circuitry 220 or the high speed circuitry 230 discussed above with respect to FIG. 2 in some aspects…Process 500 may then rely on the separate temperature sensor in these aspects. Block 501 may also include determination of a rate of temperature change…This determined rate may be used by one or more of the decision blocks discussed below and shown in FIG. 5 to determine whether to increase or decrease a transmission rate limit of the component [Examiner noted: each component/computer that has associated temperature sensor for detecting the temperature when each computer is performing set operations (i.e., as operating the first set of electronic components, i.e., cameras, sensors), and increasing or decreasing transmission rate limit based on the temperature detection]).
Claims 8, 16 and 20 are rejected under 35 U.S.C. 103 as being unpatentable over Ashwood, Tzvieli, Tzvieli ‘860’, Goldberg and Li, as applied to claims 1, 11 and 18 respectively above, and further in view of Cherukuri (US Pub. 2020/0168000 A1).
Cherukuri was cited in the previous Office Action.
As per claim 8, Ashwood, Tzvieli, Tzvieli ‘860’, Goldberg and Li teach the invention according to claim 1 above. Tzvieli further teaches wherein the first and second SoCs are each configured to perform CV and VOG (Tzvieli, Fig. 1A, 35; [0067] lines 1-2, an electronic component may be a form of a computer described herein; [0115] lines 4-7, the computer (400, 410) may be implemented in various ways, such as, but not limited to, a microcontroller, a computer on a chip, a system-on-chip (SoC); [0115] lines 10-19, references to a computer or a processor include any collection of one or more computers and/or processors (which may be at different locations) that individually or jointly execute one or more sets of computer instructions. This means that the singular term “computer” is intended to imply one or more computers, which jointly perform the functions attributed to “the computer”. In particular, some functions attributed to the computer may be performed by a computer on a wearable device (e.g., smartglasses); [0478] lines 1-3, the computer 534 is further configured to detect the type of facial expression utilizing a real-time facial expression finite-state machine that is implemented; Fig. 13A to Fig. 13C; [0036] lines 1-3, FIG. 13A, FIG. 13B, and FIG. 13C illustrate an embodiment of smartglasses with a system configured to detect facial expressions (as perform CV); [0154] lines 1-8, FIG. 3B illustrates an embodiment of an eye tracking system that tracks both eyes, which utilizes multiple light sources and detectors to track each eye. The illustrated system includes smartglasses 230 that have PSOG and VOG to track both eyes. Tracking of the left eye is done utilizing a PSOG that includes multiple light sources (emitters 231a and 231b in the figure) as well as multiple detectors (photosensors 232a, 232b, and 232c)).
Ashwood, Tzvieli, Tzvieli ‘860’, Goldberg and Li fail to specifically teach that the VOG is visual odometry (VIO).
However, Cherukuri teaches that the VOG is visual odometry (VIO) (Cherukuri, Fig. 3; [0012] lines 6-9, The eyewear device further comprises a thermal camera, an integrated slam or SLAM (Simultaneous Localization and Mapping) system, a visual odometry tracking; [0031] lines 1-5, the device 100 further comprises an integrated slam or SLAM (Simultaneous Localization and Mapping) system, visual odometry tracking, environment meshing, dominant plane detection and dynamic occlusion).
It would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to have combined the teachings of Ashwood, Tzvieli, Tzvieli ‘860’, Goldberg and Li with Cherukuri because Cherukuri’s teaching of visual odometry tracking would have provided Ashwood, Tzvieli, Tzvieli ‘860’, Goldberg and Li’s system with the advantage and capability to perform visual odometry tracking in order to detect position and movement, thereby improving the user experience.
As per claim 16, it is a method claim of claim 8 above. Therefore, it is rejected for the same reason as claim 8 above.
As per claim 20, Ashwood, Tzvieli, Tzvieli ‘860’, Goldberg and Li teach the invention according to claim 18 above. Ashwood further teaches operate the first and second SoCs to each render three-dimensional (3D) graphics and perform rendering functions (Ashwood, [0024] lines 1-19, Glasses 31 can include a computing device, such as computer 61, which can be of any suitable type so as to be carried by the frame 32 and…In one embodiment, the computer 61 can be disposed in both of the temple pieces 46, 47. The computer 61 can include one or more processors with memory, wireless communication circuitry, and a power source. As described above, the computer 61 comprises low-power circuitry, high-speed circuitry, and a display processor. Various other embodiments may include these elements in different configurations or integrated together in different ways; [0098] lines 3-16, FIG. 9 is merely a non-limiting example of a software architecture, and it will be appreciated that many other architectures can be implemented to facilitate the functionality described herein. In various embodiments, the software 902 is implemented by hardware such as computer 61, camera device 210, computer 376, and/or computer 401 of FIGS. 1, 2, 3, and 4 respectively; [0100] lines 6-11, the libraries 906 can include API libraries 932 such as media libraries (e.g., libraries to support presentation and manipulation of various media formats such as Moving Picture Experts Group-4 (MPEG4), Advanced Video Coding (H.264 or AVC), Moving Picture Experts Group Layer-3 (MP3), Advanced Audio Coding (AAC), Adaptive Multi-Rate (AMR) audio codec, Joint Photographic Experts Group (JPEG or JPG), or Portable Network Graphics (PNG)), graphics libraries (e.g., an OpenGL framework used to render in two dimensions (2D) and three dimensions (3D) in a graphic content on a display));
to operate the first and second SoCs to each operate an operating system (OS) (Ashwood, [0024] lines 1-19, Glasses 31 can include a computing device, such as computer 61, which can be of any suitable type so as to be carried by the frame 32 and…In one embodiment, the computer 61 can be disposed in both of the temple pieces 46, 47. The computer 61 can include one or more processors with memory, wireless communication circuitry, and a power source. As described above, the computer 61 comprises low-power circuitry, high-speed circuitry, and a display processor. Various other embodiments may include these elements in different configurations or integrated together in different ways; [0035] lines 9-12, the high-speed processor 232 executes an operating system such as a LINUX operating system or other such operating system such as operating system 904 of FIG. 9).
In addition, Tzvieli teaches to operate the first and second SoCs to each perform CV and VOG (Tzvieli, Fig. 1A, 35; [0067] lines 1-2, an electronic component may be a form of a computer described herein; [0115] lines 4-7, the computer (400, 410) may be implemented in various ways, such as, but not limited to, a microcontroller, a computer on a chip, a system-on-chip (SoC); [0115] lines 10-19, references to a computer or a processor include any collection of one or more computers and/or processors (which may be at different locations) that individually or jointly execute one or more sets of computer instructions. This means that the singular term “computer” is intended to imply one or more computers, which jointly perform the functions attributed to “the computer”. In particular, some functions attributed to the computer may be performed by a computer on a wearable device (e.g., smartglasses); [0478] lines 1-3, the computer 534 is further configured to detect the type of facial expression utilizing a real-time facial expression finite-state machine that is implemented; Fig. 13A to Fig. 13C; [0036] lines 1-3, FIG. 13A, FIG. 13B, and FIG. 13C illustrate an embodiment of smartglasses with a system configured to detect facial expressions (as perform CV); [0154] lines 1-8, FIG. 3B illustrates an embodiment of an eye tracking system that tracks both eyes, which utilizes multiple light sources and detectors to track each eye. The illustrated system includes smartglasses 230 that have PSOG and VOG to track both eyes. Tracking of the left eye is done utilizing a PSOG that includes multiple light sources (emitters 231a and 231b in the figure) as well as multiple detectors (photosensors 232a, 232b, and 232c)).
Ashwood, Tzvieli, Tzvieli ‘860’, Goldberg and Li fail to specifically teach that the VOG is visual odometry (VIO).
However, Cherukuri teaches that the VOG is visual odometry (VIO) (Cherukuri, Fig. 3; [0012] lines 6-9, The eyewear device further comprises a thermal camera, an integrated slam or SLAM (Simultaneous Localization and Mapping) system, a visual odometry tracking; [0031] lines 1-5, the device 100 further comprises an integrated slam or SLAM (Simultaneous Localization and Mapping) system, visual odometry tracking, environment meshing, dominant plane detection and dynamic occlusion).
It would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to have combined the teachings of Ashwood, Tzvieli, Tzvieli ‘860’, Goldberg and Li with Cherukuri because Cherukuri’s teaching of visual odometry tracking would have provided Ashwood, Tzvieli, Tzvieli ‘860’, Goldberg and Li’s system with the advantage and capability to perform visual odometry tracking in order to detect position and movement, thereby improving the user experience.
Claims 9 and 17 are rejected under 35 U.S.C. 103 as being unpatentable over Ashwood, Tzvieli, Tzvieli ‘860’, Goldberg and Li, as applied to claims 1 and 11 respectively above, and further in view of Kim et al. (US Pub. 2021/0181533 A1).
Kim was cited in the previous Office Action.
As per claim 9, Ashwood, Tzvieli, Tzvieli ‘860’, Goldberg and Li teach the invention according to claim 1 above. Tzvieli further teaches wherein the first and second SoCs are each configured to perform machine learning and encoding, and manage communications (Tzvieli, [0067] lines 1-2, an electronic component may be a form of a computer described herein; [0115] lines 4-7, the computer (400, 410) may be implemented in various ways, such as, but not limited to, a microcontroller, a computer on a chip, a system-on-chip (SoC); [0115] lines 10-19, references to a computer or a processor include any collection of one or more computers and/or processors (which may be at different locations) that individually or jointly execute one or more sets of computer instructions. This means that the singular term “computer” is intended to imply one or more computers, which jointly perform the functions attributed to “the computer”; [0100] lines 1-15, Various embodiments described herein involve calculations based on machine learning approaches. Herein, the terms “machine learning approach” and/or “machine learning based approaches” refer to learning from examples using one or more approaches. Examples of machine learning approaches include: decision tree learning, association rule learning, regression models, nearest neighbors classifiers, artificial neural networks, deep learning, inductive logic programming, support vector machines, clustering, Bayesian networks, reinforcement learning, representation learning, similarity and metric learning, sparse dictionary learning, genetic algorithms, rule-based machine learning, and/or learning classifier systems. Herein, a “machine learning-based model” is a model trained using one or more machine learning approaches; [0116] lines 1-6, The computer 400 includes one or more of the following components: processor 401, memory 402, computer readable medium 403, user interface 404, communication interface 405, and bus 406. 
The computer 410 includes one or more of the following components: processor 411, memory 412, and communication interface 413 (as manage communications); [0478] lines 1-7, the computer 534 is further configured to detect the type of facial expression utilizing a real-time facial expression finite-state machine that is implemented utilizing at least one of the following: a neural network, a Bayesian network, a rule-based classifier, a support vector machine, a hidden Markov model, a deep learning model, and a deep sparse autoencoder). In addition, Ashwood teaches run application logic (Ashwood, [0034] lines 5-9, Low-power processor 222 includes logic for managing the other elements of the camera device 210. As described above, for example, low power processor 222 may accept user input signals from an interface 216 (as application logic)).
Ashwood, Tzvieli, Tzvieli ‘860’, Goldberg and Li fail to specifically teach video encoding.
However, Kim teaches video encoding (Kim, Fig. 2; [0063] lines 1-9, The high-speed GPU interconnect 2208 may refer to a wire-based multi-lane communications link that is used by systems to scale and include one or more PPUs 2200 combined with one or more CPUs, supports cache coherence between the PPUs 2200 and CPUs, and CPU mastering. In an embodiment, data and/or commands are transmitted by the high-speed GPU interconnect 2208 through the hub 2216 to/from other units of the PPU 2200 such as one or more copy engines, video encoders).
It would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to have combined the teachings of Ashwood, Tzvieli, Tzvieli ‘860’, Goldberg and Li with Kim because Kim’s teaching of providing video encoding in smart eyewear would have provided Ashwood, Tzvieli, Tzvieli ‘860’, Goldberg and Li’s system with the advantage and capability to encode video, thereby improving system performance and efficiency (see Kim, [0031], “integrating the viewer's prescription into the AR display, overall weight and size of the system can be improved significantly”).
As per claim 17, it is a method claim of claim 9 above. Therefore, it is rejected for the same reason as claim 9 above.
Claim 10 is rejected under 35 U.S.C. 103 as being unpatentable over Ashwood, Tzvieli, Tzvieli ‘860’, Goldberg and Li, as applied to claim 1 above, and further in view of Paavola et al. (US Pub. 2021/0112654 A1), Dai (US Pub. 2002/0085348 A1) and Shah et al. (US Pub. 2007/0176847 A1).
Paavola, Dai and Shah were cited in the previous Office Action.
As per claim 10, Ashwood, Tzvieli, Tzvieli ‘860’, Goldberg and Li teach the invention according to claim 1 above. Ashwood teaches wherein the eyewear further comprises first display components adjacent the first side, second display components adjacent the second side (Ashwood, Fig. 1, 43, 44 displays; [0031] lines 12-14, Each of the optical elements 43, 44 can be a lens, a display, a display assembly or a combination of the foregoing; also see Fig. 7, 211, 362).
Ashwood, Tzvieli, Tzvieli ‘860’, Goldberg and Li fail to specifically teach a passive thermal cooling capacity of 2 to 3 Watts adjacent each of the first and second sides and wherein each of the first and second SoCs operates at 1.5 Watts or less and each of the first and second display components operate at 1 to 2 Watts.
However, Paavola teaches a passive thermal cooling capacity of 2 to 3 Watts adjacent each of the first and second sides and wherein each of the first and second SoCs operates at 10 Watts or less (Paavola, [0002] lines 1-2, Electronic devices employ thermal systems to manage thermal conditions to maintain optimal efficiency; [0014] lines 5-10, Passive cooling systems are often employed with processors that do not exceed approximately 10 watts of power (as including capacity of 2 to 3 Watts)… Processors that exceed 10 watts of power often require active cooling systems to effectively cool these processors below desired operating temperatures (as including operates at 10 Watts or less); [0024] lines 1-4, The hardware component assembly 200 of the illustrated example include a circuit board 202 (e.g., a printed circuit board (PCB)) to which a processor 204 (e.g., a system on chip (SOC)) is coupled).
It would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to have combined the teachings of Ashwood, Tzvieli, Tzvieli ‘860’, Goldberg and Li with Paavola because Paavola’s teaching of providing passive cooling systems for chips operating at 10 Watts or less would have provided Ashwood, Tzvieli, Tzvieli ‘860’, Goldberg and Li’s system with the advantage and capability to save energy, optimize thermal conditions, and maintain optimal efficiency (see Paavola, [0002], “thermal conditions to maintain optimal efficiency”).
Ashwood, Tzvieli, Tzvieli ‘860’, Goldberg, Li and Paavola fail to specifically teach wherein each of the first and second SoCs operates at 1.5 Watts or less.
However, Dai teaches wherein each of the first and second SoCs operates at 1.5 Watts or less (Dai, [0024] lines 12-13, the SOC requires less than 1.0 watt of power to operate).
It would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to have combined the teachings of Ashwood, Tzvieli, Tzvieli ‘860’, Goldberg, Li and Paavola with Dai because Dai’s teaching of an SOC that requires less than 1.0 Watt of power to operate would have provided Ashwood, Tzvieli, Tzvieli ‘860’, Goldberg, Li and Paavola’s system with the advantage and capability to efficiently utilize the limited amount of available energy and minimize heat generation, thereby improving system performance and efficiency.
Ashwood, Tzvieli, Tzvieli ‘860’, Goldberg, Li, Paavola and Dai fail to specifically teach each of the first and second display components operate at 1 to 2 Watts.
However, Shah teaches each of the first and second display components operate at 1 to 2 Watts (Shah, [0020] lines 15-21, typical displays consume approximately 3 Watts out of a total platform average power of approximately 11 Watts. Considering that a reflective display consumes only approximately 1 Watts).
It would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to have combined the teachings of Ashwood, Tzvieli, Tzvieli ‘860’, Goldberg, Li, Paavola and Dai with Shah because Shah’s teaching of a reflective display that consumes only approximately 1 Watt would have provided Ashwood, Tzvieli, Tzvieli ‘860’, Goldberg, Li, Paavola and Dai’s system with the advantage and capability to display information to the user on a reflective display consuming only about 1 Watt, thereby reducing power consumption and improving system efficiency.
Response to Arguments
Applicant’s arguments with respect to claims 1-2, 6-12, 14-18 and 20 have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to ZUJIA XU whose telephone number is (571)272-0954. The examiner can normally be reached M-F 9:30-5:30 EST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Aimee J Li can be reached at (571) 272-4169. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/ZUJIA XU/Examiner, Art Unit 2195