DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Terminal Disclaimer
The terminal disclaimer filed on 11/05/2025 disclaiming the terminal portion of any patent granted on this application which would extend beyond the expiration dates of Patent Nos. 12147609; 11809634; 11321577; 10832080; 10229339; and 9625995 has been reviewed and is accepted. The terminal disclaimer has been recorded.
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claim(s) 1, 3, 5, 14-16, 18, and 20-21 is/are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Nayar et al. (US 2004/0070565).
As to Claim 1, Nayar et al. discloses a camera, including: a sensor to perform capturing images including any objects within a field of view (fig.1-2, s104; para.0067 - camera or imagers detect light from environmental light sources, which can be obtained by detecting and/or calculating the brightness and direction of light striking the pixel regions of the display region; para.0069, 0125, 0133), and
a communications connection providing communications with a computer (fig.5,34; lighting information 516 from detector 502 is provided to processor 512; para.0114,0127, 0131, 0149-0150);
wherein the computer comprises at least one processor and a memory that stores instructions (para.0114,0131,0149-0152), which instructions, when executed implement processing including:
recognizing that one or more light sources in the images is a display captured by the camera in the images (para.0067-Information regarding the environmental light received in the display region can also be used to simulate the appearance of an object as if illuminated by the environmental light. It is to be noted that the term "object" as used herein is not intended to be limiting, and is meant to include any item that can be displayed, including smaller, movable items (e.g., small paintings and sculptures) as well as larger features of any scene, such as mountains, lakes, and even astronomical bodies…the detection of the light from the environmental light sources can be performed using a wide variety of techniques. Typically, it is preferable to detect and/or calculate the brightness and direction of light striking various portions (e.g., pixel regions) of the display region; para.0068- The information from the detectors is used to generate information regarding the characteristics of the light rays incident upon the display region…Using the information regarding the incident light rays, and the information regarding the geometrical and reflectance characteristics of the object, an image of the object is generated (step 114) and displayed in the display region; fig. 19, para.0070,0092,0109-0110, 0125,0131,0133) and
providing an indication that the one or more light sources is recognized as a display (para.0070-0071; the measurements from one or more environmental light field detectors are used to render an image of input content as if the content (e.g., a set of scene objects) were illuminated under the lighting conditions present in the room in which the image is being displayed…graphics model of the input content can have both virtual and "environmental" components…The environmental component of the content includes models of objects in the room of the display device. Such objects can include, for example, the display device, the frame in which the display device resides, and other objects and architectural details in the room; para.0079, 0081,0110), and
wherein the recognizing is based, at least in part, on a change in brightness of one or more light sources detected in the images (para.0067 - the detection of the light from the environmental light sources can be performed using a wide variety of techniques. Typically, it is preferable to detect and/or calculate the brightness and direction of light striking various portions (e.g., pixel regions) of the display region; para.0068 - the displayed image can be updated in real time as the environmental lighting conditions change; para.0070 - The content and the illumination field are not necessarily static, but can change with time. In cases of changing input content and/or lighting, the displayed image is preferably updated repeatedly at a rate sufficiently rapid to generate a movie or video sequence in the display region; para.0072 - The environmental illumination field is the field actually measured by illumination field detectors; para.0012, 0079-0080, 0092, 0109, 0110, 0126-0127, 0131-0133).
As to Claim 3, Nayar et al. discloses wherein the camera is at least one of a visible-light camera, an infrared (IR) camera, an ultraviolet camera, or a camera operating in another electromagnetic frequency regime (para.0113 - detector 502 may be a still camera or a video camera).
As to Claim 5, Nayar et al. discloses wherein the camera has a variable rate of capture frames (para.0080).
As to Claim 14, Nayar et al. discloses wherein the recognizing of the one or more light sources as the display includes determining a shape of the display (para.0071, 0073).
As to Claim 15, Nayar et al. discloses wherein the instructions implement determining a length, a width, or an aspect ratio of the display (para.0066,0071,0078,0081,0084,0089).
As to Claim 16, Nayar et al. discloses wherein the recognizing of the one or more light sources as the display includes determining that a shape of the display is rectangular (para.0071, 0073, 0078).
As to Claims 18 and 20, these claims have limitations similar to those of Claims 1 and 3, respectively, and are met by the reference as set forth above. Claim 18 further recites “a plurality of cameras arranged such that at least a portion of their fields of view overlap in a viewed region to perform capturing images including any objects within their fields of view” (Nayar, para.0061 - The same light rays--or different light rays coming from the environmental light source(s)--are detected using one or more detectors which can include, for example, one or more imagers; para.0067, 0128).
As to Claim 21, this claim has limitations similar to those of Claim 1 and is met by the reference as set forth above.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claim(s) 2 and 19 is/are rejected under 35 U.S.C. 103 as being unpatentable over Nayar et al. (US 2004/0070565) in view of Hsu (US 2010/0201808).
As to Claim 2, Nayar et al. does not expressly disclose, but Hsu discloses wherein the sensor includes at least one of a charge-coupled device (CCD) sensor or a complementary metal-oxide-semiconductor (CMOS) sensor (para.0034-0035).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the device of Nayar et al. with the teachings of Hsu, the motivation being to provide an image sensor capable of converting visual image to an electric signal, and use the information to manipulate an object outputted on the display device.
As to Claim 19, this claim has limitations similar to those of Claim 2 and is met by the references as set forth above.
Claim(s) 4 is/are rejected under 35 U.S.C. 103 as being unpatentable over Nayar et al. (US 2004/0070565) in view of McEldowney (US 2012/0026085).
As to Claim 4, Nayar et al. does not expressly disclose, but McEldowney discloses wherein a rate is determined for the change in brightness by measuring frequencies of the one or more light sources over time (para.0022, 0026-0027).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the device of Nayar et al. with the teachings of McEldowney, the motivation being to modulate structured light intensity to thereby increase image contrast.
Claim(s) 6 is/are rejected under 35 U.S.C. 103 as being unpatentable over Nayar et al. (US 2004/0070565).
As to Claim 6, Nayar et al. does not expressly disclose wherein the images are captured at a rate of at least one of 50, 60, 75, 100, or 120 frames per second.
However, Nayar et al. discloses where the image is updated at a rate equal to or greater than 24 frames/second so that the rendering appears continuous to the viewer (para.0080, 0089).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the device of Nayar et al. by capturing images at a rate of at least one of 50, 60, 75, 100, or 120 frames per second, since doing so would not have modified the operation of the device and would have yielded predictable results; in particular, the rendering would appear continuous to the viewer.
Claim(s) 8-9, 11-12 is/are rejected under 35 U.S.C. 103 as being unpatentable over Nayar et al. (US 2004/0070565) in view of Clarkson et al. (US 2011/0080490).
As to Claim 8, Nayar et al. does not expressly disclose, but Clarkson et al. discloses: wherein the instructions implement determining a distance between an object in the field of view and the display (para.0105- distance of object 1616 in field of view of camera 1614 from the display 1620 is detected; para.0115, 0117-0118).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the device of Nayar et al. with the teachings of Clarkson et al., the motivation being to determine the position and user input of one or more objects (e.g., a finger) relative to the display and to control an application based on the user input.
As to Claim 9, Nayar et al. does not expressly disclose but Clarkson et al. discloses: wherein the instructions implement: determining a distance between a user in the field of view and the display (para.0104-0105- distance of object 1616 in field of view of camera 1614 from the display 1620 is detected; para.0042, 0115, 0117-0118); and setting a context for interpreting a gesture as an input command based, at least in part, on the determined distance (fig.9-11,16,19; para.0034, 0097,0115, 0117-0118).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the device of Nayar et al. with the teachings of Clarkson et al., the motivation being to determine user input based on a detected gesture relative to the display and to control an application based on the user input.
As to Claim 11, Nayar et al. does not expressly disclose, but Clarkson et al. discloses wherein the instructions implement prompting pointing, with a stylus, to a target presented on the display and detecting a position of the stylus (para.0030, 0042, 0067-0068, 0104-0105, 0115, 0117-0118).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the device of Nayar et al. with the teachings of Clarkson et al., the motivation being to detect user input and control an application based on the user input.
As to Claim 12, Nayar et al. does not expressly disclose, but Clarkson et al. discloses: wherein the instructions implement (i) prompting pointing, with a stylus, to a target presented on the display and detecting a position of the stylus (para.0067-0068), and (ii) prompting pointing multiple times at multiple targets and determining a plane occupied by the display based, at least in part, on the detected position of the stylus (para.0066-0068, 0070).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the device of Nayar et al. with the teachings of Clarkson et al., the motivation being to detect user input and control an application based on the user input.
Claim(s) 10 is/are rejected under 35 U.S.C. 103 as being unpatentable over Nayar et al. (US 2004/0070565) in view of Ferren et al. (US 2009/0316952).
As to Claim 10, Nayar et al. does not expressly disclose, but Ferren et al. discloses: wherein the instructions implement prompting a user to make a plurality of contacts at multiple locations on the display and determining a plane occupied by the display based, at least in part, on the plurality of contacts (para.0020-0021,0030-0032, 0039, 0041).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the device of Nayar et al. with the teachings of Ferren et al., the motivation being to allow the user to interact with the display and provide inputs.
Claim(s) 17 is/are rejected under 35 U.S.C. 103 as being unpatentable over Nayar et al. (US 2004/0070565) in view of Yokozawa et al. (US 2006/0125968).
As to Claim 17, Nayar et al. does not expressly disclose, but Yokozawa et al. discloses wherein the recognizing of the one or more light sources as the display includes comparing a length, a width, or an aspect ratio of the display to a determined length, a determined width, or a determined aspect ratio of the display (fig.8, para.0068-0070, 0072).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the device of Nayar et al. with the teachings of Yokozawa et al., the motivation being to discriminate a control object based on the outer shape of the control object.
Allowable Subject Matter
Claims 7 and 13 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.
The following is a statement of reasons for the indication of allowable subject matter:
Claim 7 is allowable over the prior art of record since the cited references do not teach or suggest “wherein the instructions implement determining a rate of a change in brightness based, at least in part on, a beat frequency of periodic brightness variations and a capture rate” in combination with the other limitations in the claim.
Claim 13 is allowable over the prior art of record since the cited references do not teach or suggest “wherein the recognizing of the one or more light sources as the display includes: computing, based at least in part, on a frequency of change in an intensity, a curve for an object; and identifying the object based, at least in part, on the curve as computed” in combination with the other limitations in the claim.
Response to Arguments
Applicant’s arguments with respect to claim(s) 1, 18, and 21 have been considered but are moot because of the new ground of rejection necessitated by amendment.
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure: see PTO-892 form.
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to DISMERY E. MERCEDES whose telephone number is (571)272-7558. The examiner can normally be reached Monday-Friday, 9am-5pm, EST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Ke Xiao can be reached at 571-272-7776. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/DISMERY MERCEDES/ Primary Examiner, Art Unit 2627