Part III DETAILED ACTION
1. The present application is being examined under the pre-AIA first to invent provisions. This application has been examined. Claims 1-20 are pending in this application.
Claim Objections
2. Claim 3 is objected to because of the following informalities: On line 3, the phrase “an hardware” should be changed to “a hardware”. Appropriate correction is required.
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
3. Claims 1-20 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Leng et al. (US 2024/0007583), hereafter referred to as Leng.
With regard to claim 1, Leng teaches a method comprising: receiving, by a first device (see fig. 8, item labelled first device), one or more requests from one or more communication devices to utilize a camera of a second device as a virtual camera of the one or more communication devices (reads on figs. 9-11, access to an external display can be implemented from another device); allowing access to the camera of the second device to at least one communication device of the one or more communication devices (reads on figs. 9-11, allowing access to the external display can be implemented from another device); enabling the at least one communication device to utilize the camera of the second device to capture one or more images or videos (e.g., the portable device or cellular device as depicted in fig. 11 can receive the image captured by the first device as depicted in figs. 9-11); and enabling provision of the captured one or more images or videos to the at least one communication device to enable display of the captured one or more images or videos (reads on figs. 8-11, the communication device receives the captured image from the first device and displays it as depicted in fig. 11).
With regard to claim 2, Leng further teaches performing image processing on one or more frames associated with the captured one or more images or videos, by an image processing module of an abstraction layer interface, to enable the at least one communication device to modify or change one or more image or video features of the captured one or more images or videos prior to the at least one communication device receiving the captured one or more images or videos from the camera of the second device (reads on fig. 11, a frame or portion of the image captured by the first device is displayed by the communication device, and editing or synthesis processing is performed as suggested by claim 23).
With regard to claim 3, Leng further teaches providing to the one or more communication devices access to an abstraction layer interface, comprising a hardware abstraction layer which enables the at least one communication device to utilize the camera of the second device, and the image processing module which enables the one or more communication devices to modify or change the one or more image or video features associated with the captured one or more images or videos (reads on fig. 11, a layer or portion of the image captured by the first device is displayed by the communication device, and editing or synthesis processing is performed as suggested by claim 23).
With regard to claim 4, Leng further teaches wherein the features comprise one or more of high dynamic range capture, noise reduction, distortion reduction, or one or more frame exposure times (see para. [0152], the camera uses high dynamic range).
With regard to claim 5, Leng further teaches wherein the abstraction layer interface is configured to enable the provision of the captured one or more images or videos independent of a communication protocol utilized by the at least one communication device sending a request of the one or more requests (see abstract).
With regard to claim 6, Leng further teaches enabling, based on the hardware abstraction layer, the at least one communication device to utilize one or more applications, on the second device, wherein the one or more applications cause the camera to capture other images or other videos to provide to the at least one communication device (see abstract and para. [0068]).
With regard to claim 7, Leng further teaches wherein the at least one communication device lacks an internal camera (inherently reads on fig. 11, the cellular or communication device does not have to incorporate a camera since the images are captured by the first device, see fig. 9).
With regard to claim 8, Leng further teaches wherein the at least one communication device is a wearable device, smart glasses, or another communication device (reads on fig. 11, the device is a communication device as depicted in fig. 11).
With regard to claim 9, Leng further teaches enabling access, based on the hardware abstraction layer, of the camera on the second device to a plurality of the one or more communication devices, to enable the plurality of the one or more communication devices to obtain other images or other videos captured by the camera of the second device (reads on figs. 8-11).
With regard to claim 10, Leng further teaches enabling provision of different communication protocols to enable the plurality of communication devices to dynamically switch between the different communication protocols (inherently reads on para. [0148], different protocols can be used).
With regard to claims 11-20, the limitations of claims 11-20 are covered by the limitations of claims
1-10 above.
Conclusion
4. The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
Gladkov et al. (11,132,827) teaches an artificial reality system architecture for concurrent application execution and collaborative 3D scene rendering.
5. Any inquiry concerning this communication or earlier communications from the examiner should be directed to Gabriel I. Garcia whose telephone number is (571) 272-7434. The examiner can normally be reached Monday-Thursday from 7:30 AM-6:00 PM. The fax phone number for this group is (571) 273-8600.
If attempts to reach the examiner by telephone are unsuccessful, the examiner's
supervisor, Benny Tieu can be reached on (571) 272-7490. The fax phone
number for the organization where this application or proceeding is assigned is (571) 273-8300.
Information regarding the status of an application may be obtained from the Patent
Application Information Retrieval (PAIR) system. Status information for published
applications may be obtained from either Private PAIR or Public PAIR. Status
information for unpublished applications is available through Private PAIR only. For
more information about the PAIR system, see http://pair-direct.uspto.gov. Should you
have questions on access to the Private PAIR system, contact the Electronic Business
Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO
Customer Service Representative or access to the automated information system, call
800-786-9199 (IN USA OR CANADA) or 571-272-1000.
Any inquiry of a general nature or relating to the status of this application should be
directed to the Group receptionist whose telephone number is (571) 272-2600.
/Gabriel I Garcia/
Primary Examiner, Art Unit 2682
January 06, 2026