DETAILED ACTION
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Information Disclosure Statement
The information disclosure statement (IDS) submitted on 03/19/2025 is in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement has been considered by the examiner.
Drawings
The drawings are objected to because descriptive labels other than numerical labels are needed for Figure 16. See 37 CFR 1.84(o). A proposed drawing correction or corrected drawings are required in reply to the Office action to avoid abandonment of the application. The objection to the drawings will not be held in abeyance.
Claim Objections
Claims 18 and 29 are objected to because of the following informalities:
Claims 18 and 29 should each end with a period (.). Each claim must begin with a capital letter and end with a period, and periods may not be used elsewhere in the claims except for abbreviations. See MPEP 608.01(m).
Appropriate correction is required.
Double Patenting
The nonstatutory double patenting rejection is based on a judicially created doctrine grounded in public policy (a policy reflected in the statute) so as to prevent the unjustified or improper timewise extension of the “right to exclude” granted by a patent and to prevent possible harassment by multiple assignees. A nonstatutory double patenting rejection is appropriate where the claims at issue are not identical, but at least one examined application claim is not patentably distinct from the reference claim(s) because the examined application claim is either anticipated by, or would have been obvious over, the reference claim(s). See, e.g., In re Berg, 140 F.3d 1428, 46 USPQ2d 1226 (Fed. Cir. 1998); In re Goodman, 11 F.3d 1046, 29 USPQ2d 2010 (Fed. Cir. 1993); In re Longi, 759 F.2d 887, 225 USPQ 645 (Fed. Cir. 1985); In re Van Ornum, 686 F.2d 937, 214 USPQ 761 (CCPA 1982); In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970); and In re Thorington, 418 F.2d 528, 163 USPQ 644 (CCPA 1969).
A timely filed terminal disclaimer in compliance with 37 CFR 1.321(c) or 1.321(d) may be used to overcome an actual or provisional rejection based on a nonstatutory double patenting ground provided the reference application or patent either is shown to be commonly owned with this application, or claims an invention made as a result of activities undertaken within the scope of a joint research agreement. See MPEP § 717.02 for applications subject to examination under the first inventor to file provisions of the AIA as explained in MPEP § 2159. See MPEP §§ 706.02(l)(1) - 706.02(l)(3) for applications not subject to examination under the first inventor to file provisions of the AIA. A terminal disclaimer must be signed in compliance with 37 CFR 1.321(b).
The USPTO Internet website contains terminal disclaimer forms which may be used. Please visit www.uspto.gov/forms/. The filing date of the application in which the form is filed determines what form (e.g., PTO/SB/25, PTO/SB/26, PTO/AIA/25, or PTO/AIA/26) should be used. A web-based eTerminal Disclaimer may be filled out completely online using web-screens. An eTerminal Disclaimer that meets all requirements is auto-processed and approved immediately upon submission. For more information about eTerminal Disclaimers, refer to http://www.uspto.gov/patents/process/file/efs/guidance/eTD-info-I.jsp.
Claims 16, 22-27 and 31-32 are rejected on the ground of nonstatutory obviousness-type double patenting as being unpatentable over claims 1, 5-10 and 12-13 of US Patent No. 12260062. Although the conflicting claims are not identical, they are not patentably distinct from each other for the following reasons.
Claim 16 of the Present Application
Claim 1 of US Patent No. 12260062
A display apparatus comprising:
at least one light source;
at least one tracking means; and
at least one processor configured to:
control the at least one light source to display a virtual user interface comprising a panel in a three-dimensional space;
process tracking data, collected by the at least one tracking means, to determine whether an interaction element is in proximity to a given segment of a virtual widget associated with the panel,
wherein the virtual widget comprises at least one interactive element positioned in relation to the panel;
when it is determined that the interaction element is in proximity to the given segment of the virtual widget, control the at least one light source to display the given segment in the three-dimensional space;
determine whether the given segment is activated;
when it is determined that the given segment is activated, process the tracking data to determine an interaction characteristic of the interaction element upon said activation and digitally manipulate the virtual user interface based on the interaction characteristic and a corresponding interaction effect associated with the given segment.
A display apparatus comprising:
at least one light source;
at least one tracking means; and
at least one processor configured to:
control the at least one light source to display a virtual user interface (200) in a three-dimensional space;
process tracking data, collected by the at least one tracking means, to determine whether an interaction element is in proximity to a given segment of a virtual widget that is invisible in the three-dimensional space,
wherein the virtual widget comprises at least a virtual border of the virtual user interface;
when it is determined that the interaction element is in proximity to the given segment of the virtual widget, control the at least one light source to display the given segment in the three-dimensional space;
determine whether the given segment is activated;
when it is determined that the given segment is activated, process the tracking data to determine a change in the position of the interaction element upon said activation and digitally manipulate the virtual user interface according to a visual effect associated with the given segment and the change in the position of the interaction element;
wherein the interaction element is a finger of a user, and wherein when processing the tracking data to determine whether the interaction element is in proximity to the given segment of the virtual widget, the at least one processor is configured to:
define a first origin point at a position of a tip of the finger, and an interaction volume around the first origin point;
determine whether the interaction volume intersects with at least one segment of the virtual widget;
when it is determined that the interaction volume intersects with the at least one segment of the virtual widget, identify a first point and a second point of said intersection;
determine whether the first point and the second point lie on two different segments of the virtual widget, when it is determined that the first point and the second point lie on the two different segments of the virtual widget, determine interaction lengths of the two different segments within the interaction volume, and select the given segment as that segment amongst the two different segments whose interaction length is higher than that of another segment amongst the two different segments;
when it is determined that the first point and the second point do not lie on the two different segments of the virtual widget, select the given segment as that segment on which both the first point and the second point lie; determine a first perpendicular distance from the first origin point to the given segment; and
determine whether the first perpendicular distance is less than or equal to a first predefined distance threshold, wherein it is determined that the finger is in proximity to the given segment of the virtual widget, when it is determined that the first perpendicular distance is less than or equal to the first predefined distance threshold.
Claim 22 of the Present Application
The display apparatus of claim 16, wherein when it is determined that the given segment is activated, the at least one processor is further configured to control the at least one light source to display a visual cue indicative of said activation, in the three-dimensional space,
wherein upon displaying of the visual cue, the position of the interaction element is changeable for digitally manipulating the virtual user interface according to the interaction characteristic of the interaction element and the corresponding interaction effect associated with the given segment.
Claim 5 of US Patent No. 12260062
The display apparatus of claim 1, wherein when it is determined that the given segment is activated, the at least one processor is further configured to control the at least one light source to display a visual cue indicative of said activation, in the three-dimensional space,
wherein upon displaying of the visual cue, the position of the interaction element is changeable for digitally manipulating the virtual user interface according to the visual effect associated with the given segment.
Claim 23 of the Present Application
The display apparatus of claim 16, wherein the interaction effect associated with the given segment of the virtual widget comprises one of:
a resizing effect, a movement effect,
or another effect corresponding to an interaction characteristic of the interaction element.
Claim 6 of US Patent No. 12260062
The display apparatus of claim 1, wherein the visual effect associated with the given segment of the virtual widget comprises one of:
a resizing effect, a movement effect.
Claim 24 of the Present Application
The display apparatus of claim 16, wherein the virtual widget is divided into a plurality of segments such that at least one first segment amongst the plurality of segments is associated with a different interaction effect than at least one second segment amongst the plurality of segments, wherein each interaction effect is determined based on an interaction characteristic of the interaction element.
Claim 7 of US Patent No. 12260062
The display apparatus of claim 1, wherein the virtual widget is divided into a plurality of segments such that at least one first segment amongst the plurality of segments is associated with a different visual effect than at least one second segment amongst the plurality of segments.
Claim 25 of the Present Application
The display apparatus of claim 24, wherein the plurality of segments comprise eight segments such that four first segments amongst the eight segments are arranged at four corners of the virtual user interface, and four second segments amongst the eight segments are arranged at four sides of the virtual user interface.
Claim 8 of US Patent No. 12260062
The display apparatus of claim 7, wherein the plurality of segments comprise eight segments such that four first segments amongst the eight segments are arranged at four corners of the virtual user interface, and four second segments amongst the eight segments are arranged at four sides of the virtual user interface.
Claim 26 of the Present Application
The display apparatus of claim 24, wherein a length of each segment amongst the plurality of segments depends on:
dimensions of the panel of the virtual user interface; and
dimension that is to be used for defining an interaction volume or an interaction area of the interaction element when processing the tracking data to determine whether the interaction element is in proximity to the given segment of the virtual widget.
Claim 9 of US Patent No. 12260062
The display apparatus of claim 7, wherein a length of each segment amongst the plurality of segments depends on:
dimensions of the virtual user interface; and
dimension that is to be used for defining an interaction volume or an interaction area of the interaction element when processing the tracking data to determine whether the interaction element is in proximity to the given segment of the virtual widget.
Claim 27 of the Present Application
A method for digital manipulation of a virtual user interface, the method comprising:
controlling at least one light source to display the virtual user interface comprising a panel in a three-dimensional space;
processing tracking data, collected by at least one tracking means, to determine whether an interaction element is in proximity to a given segment of a virtual widget associated with the panel, wherein the virtual widget comprises at least one interactive element positioned in relation to the panel;
when it is determined that the interaction element is in proximity to the given segment of the virtual widget, controlling the at least one light source to display the given segment in the three-dimensional space;
determining whether the given segment is activated; and
when it is determined that the given segment is activated, processing the tracking data to determine an interaction characteristic of the interaction element upon said activation and digitally manipulating the virtual user interface based on the interaction characteristic and a corresponding interaction effect associated with the given segment.
Claim 10 of US Patent No. 12260062
A method for digital manipulation of a virtual user interface, the method comprising:
controlling at least one light source to display the virtual user interface in a three-dimensional space;
processing tracking data, collected by at least one tracking means, to determine whether an interaction element is in proximity to a given segment of a virtual widget that is invisible in the three-dimensional space, wherein the virtual widget comprises at least a virtual border of the virtual user interface;
when it is determined that the interaction element is in proximity to the given segment of the virtual widget, controlling the at least one light source to display the given segment in the three-dimensional space;
determining whether the given segment is activated; and
when it is determined that the given segment is activated, processing the tracking data to determine a change in the position of the interaction element upon said activation and digitally manipulate the virtual user interface according to a visual effect associated with the given segment and the change in the position of the interaction element;
wherein the interaction element is a finger of a user, and wherein for processing the tracking data to determine whether the interaction element is in proximity to the given segment of the virtual widget, the method further comprises:
defining a first origin point at a position of a tip of the finger, and an interaction volume around the first origin point; determining whether the interaction volume intersects with at least one segment of the virtual widget;
when it is determined that the interaction volume intersects with the at least one segment of the virtual widget, identifying a first point and a second point of said intersection; determining whether the first point and the second point lie on two different segments of the virtual widget, when it is determined that the first point and the second point lie on the two different segments of the virtual widget, determining interaction lengths of the two different segments within the interaction volume, and select the given segment as that segment amongst the two different segments whose interaction length is higher than that of another segment amongst the two different segments;
when it is determined that the first point and the second point do not lie on the two different segments of the virtual widget, selecting the given segment as that segment on which both the first point and the second point lie;
determining a first perpendicular distance from the first origin point to the given segment; and
determining whether the first perpendicular distance is less than or equal to a first predefined distance threshold, wherein it is determined that the finger is in proximity to the given segment of the virtual widget, when it is determined that the first perpendicular distance is less than or equal to the first predefined distance threshold.
Claim 31 of the Present Application
The method of claim 27, wherein the interaction element is a pointer of a user-interaction controller, and wherein for processing the tracking data to determine whether the interaction element is in proximity to the given segment of the virtual widget, the method further comprises:
defining a second origin point at a position of an intersection point of the pointer with a plane in which the panel of the virtual user interface lies, and an interaction area around the second origin point;
determining whether the interaction area intersects with at least one segment of the virtual widget;
when it is determined that the interaction area intersects with the at least one segment of the virtual widget, identifying a third point and a fourth point of said intersection;
determining whether the third point and the fourth point lie on two different segments of the virtual widget,
when it is determined that the third point and the fourth point lie on the two different segments of the virtual widget, determining interaction lengths of the two different segments within the interaction volume, and select the given segment as that segment amongst the two different segments whose interaction length is higher than that of another segment amongst the two different segments;
when it is determined that the third point and the fourth point do not lie on the two different segments of the virtual widget, selecting the given segment as that segment on which both the third point and the fourth point lie;
determining a second perpendicular distance from the second origin point to the given segment; and
determining whether the second perpendicular distance is less than or equal to a first predefined distance threshold, wherein it is determined that the pointer is in proximity to the given segment of the virtual widget based on the interaction characteristic of the pointer, and wherein the interaction characteristic influences the corresponding interaction effect associated with the given segment.
Claim 12 of US Patent No. 12260062
The method of claim 10, wherein the interaction element is a pointer of a user-interaction controller, and wherein for processing the tracking data to determine whether the interaction element is in proximity to the given segment of the virtual widget, the method further comprises:
defining a second origin point at a position of an intersection point of the pointer with a plane in which the virtual user interface lies, and an interaction area around the second origin point;
determining whether the interaction area intersects with at least one segment of the virtual widget;
when it is determined that the interaction area intersects with the at least one segment of the virtual widget, identifying a third point and a fourth point of said intersection;
determining whether the third point and the fourth point lie on two different segments of the virtual widget,
when it is determined that the third point and the fourth point lie on the two different segments of the virtual widget, determining interaction lengths of the two different segments within the interaction volume, and select the given segment as that segment amongst the two different segments whose interaction length is higher than that of another segment amongst the two different segments;
when it is determined that the third point and the fourth point do not lie on the two different segments of the virtual widget, selecting the given segment as that segment on which both the third point and the fourth point lie;
determining a second perpendicular distance from the second origin point to the given segment; and
determining whether the second perpendicular distance is less than or equal to a first predefined distance threshold, wherein it is determined that the pointer is in proximity to the given segment of the virtual widget, when it is determined that the second perpendicular distance is less than or equal to the first predefined distance threshold.
Claim 32 of the Present Application
The method of claim 31, wherein when determining whether the given segment is activated, the method further comprises:
receiving, from the user-interaction controller, a user input provided by a user; and
processing the user input to determine whether the user input is indicative of activation of the given segment,
wherein the activation is based on the interaction characteristic of the user input and modifies the corresponding interaction effect associated with the given segment.
Claim 13 of US Patent No. 12260062
The method of claim 12, wherein when determining whether the given segment is activated, the method further comprises:
receiving, from the user-interaction controller, a user input provided by a user; and
processing the user input to determine whether the user input is indicative of activation of the given segment.
Although the conflicting claims are not identical, they are not patentably distinct from each other because claims 16, 22-27 and 31-32 of the present application are anticipated by claims 1, 5-10 and 12-13 of US Patent No. 12260062, since pending claims 16, 22-27 and 31-32 are a broader version of patented claims 1, 5-10 and 12-13 of US Patent No. 12260062.
Claim Rejections - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claims 19 and 30 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.
Claims 19 and 30 recite the limitation "the user’s fingers". There is insufficient antecedent basis for this limitation in the claims.
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claims 16-18 and 22-29 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Good (US 20230400957).
As per claim 16, Good discloses a display apparatus (Fig. 10, #1050; [0060]) comprising:
at least one light source ([0055]; [0060]);
at least one tracking means ([0029]-[0030]); and
at least one processor (#1020; [0058]-[0059]) configured to:
control the at least one light source to display a virtual user interface comprising a panel in a three-dimensional space ([0042]; [0046]; [0053]; [0055]);
process tracking data, collected by the at least one tracking means, to determine whether an interaction element is in proximity to a given segment of a virtual widget (Fig. 4, #401) associated with the panel ([0029]-[0031]), wherein the virtual widget comprises at least one interactive element positioned in relation to the panel ([0029]-[0031]);
when it is determined that the interaction element is in proximity to the given segment of the virtual widget, control the at least one light source to display the given segment in the three-dimensional space ([0042]; [0046]; [0053]; [0055]);
determine whether the given segment is activated ([0042]; [0046]; [0053]; [0051]-[0055]);
when it is determined that the given segment is activated, process the tracking data to determine an interaction characteristic of the interaction element upon said activation and digitally manipulate the virtual user interface based on the interaction characteristic and a corresponding interaction effect associated with the given segment ([0042]; [0046]; [0053]; [0051]-[0055]).
As per claims 17 and 28, Good discloses the display apparatus (method) of claim 16 (claim 27), wherein the (at least one) interactive element is configured for selection, manipulation, or activation of the virtual user interface ([0024]-[0027]; [0029]-[0030]).
As per claims 18 and 29, Good discloses the display apparatus (method) of claim 16 (claim 27), wherein the interaction characteristic comprises at least one of movement direction, movement speed, pressure, distance, or input type ([0029]; [0044]; [0057]).
As per claim 22, Good discloses the display apparatus of claim 16, wherein when it is determined that the given segment is activated, the at least one processor is further configured to control the at least one light source to display a visual cue indicative of said activation, in the three-dimensional space, wherein upon displaying of the visual cue, the position of the interaction element is changeable for digitally manipulating the virtual user interface according to the interaction characteristic of the interaction element and the corresponding interaction effect associated with the given segment ([0042]; [0045]-[0046]; [0053]; [0055]).
As per claim 23, Good discloses the display apparatus of claim 16, wherein the interaction effect associated with the given segment of the virtual widget comprises one of:
a resizing effect, a movement effect, or another effect corresponding to an interaction characteristic of the interaction element ([0024]-[0027]; [0029]-[0031]).
As per claim 24, Good discloses the display apparatus of claim 16, wherein the virtual widget is divided into a plurality of segments such that at least one first segment amongst the plurality of segments is associated with a different interaction effect than at least one second segment amongst the plurality of segments, wherein each interaction effect is determined based on an interaction characteristic of the interaction element ([0024]-[0027]; [0029]-[0033]).
As per claim 25, Good discloses the display apparatus of claim 24, wherein (Fig. 5 discloses) the plurality of segments comprise eight segments such that four first segments amongst the eight segments are arranged at four corners of the virtual user interface, and four second segments amongst the eight segments are arranged at four sides of the virtual user interface ([0032]-[0033]).
As per claim 26, Good discloses the display apparatus of claim 24, wherein a length of each segment amongst the plurality of segments depends on:
dimensions of the panel of the virtual user interface (Figs. 1-2; [0017]-[0022]); and
dimension that is to be used for defining an interaction volume or an interaction area of the interaction element when processing the tracking data to determine whether the interaction element is in proximity to the given segment of the virtual widget ([0017]-[0022]).
As per claim 27, Good discloses a method for digital manipulation of a virtual user interface (Abstract; [0014]), the method comprising:
controlling at least one light source to display the virtual user interface comprising a panel in a three-dimensional space ([0042]; [0046]; [0053]; [0055]);
processing tracking data, collected by the at least one tracking means, to determine whether an interaction element is in proximity to a given segment of a virtual widget (Fig. 4, #401) associated with the panel ([0029]-[0031]), wherein the virtual widget comprises at least one interactive element positioned in relation to the panel ([0029]-[0031]);
when it is determined that the interaction element is in proximity to the given segment of the virtual widget, controlling the at least one light source to display the given segment in the three-dimensional space ([0042]; [0046]; [0053]; [0055]);
determining whether the given segment is activated ([0042]; [0046]; [0053]; [0051]-[0055]); and
when it is determined that the given segment is activated, processing the tracking data to determine an interaction characteristic of the interaction element upon said activation and digitally manipulating the virtual user interface based on the interaction characteristic and a corresponding interaction effect associated with the given segment ([0042]; [0046]; [0053]; [0051]-[0055]).
Allowable Subject Matter
Claims 19 and 30 would be allowable if rewritten to overcome the rejection(s) under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, set forth in this Office action and to include all of the limitations of the base claim and any intervening claims.
Claims 20-21 and 31-32 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.
The following is a statement of reasons for the indication of allowable subject matter: The prior art of a display apparatus comprising control the at least one light source to display a virtual user interface comprising a panel in a three-dimensional space; process tracking data, collected by the at least one tracking means, to determine whether an interaction element is in proximity to a given segment of a virtual widget associated with the panel does not teach or fairly suggest wherein when determining whether the given segment is activated, the at least one processor is configured to: define a third origin point at a position of a tip of another finger of the user; determine a distance between the first origin point and the third origin point, wherein the given segment is determined to be activated when said distance is less than or equal to a second predefined distance threshold, and wherein the activation is associated with an interaction characteristic of the interaction element, wherein the interaction characteristic comprises at least one of movement speed, pressure, or relative positioning of the user’s fingers, wherein the interaction element is a pointer of a user-interaction controller, and wherein when processing the tracking data to determine whether the interaction element is in proximity to the given segment of the virtual widget, the at least one processor is configured to: define a second origin point at a position of an intersection point of the pointer with a plane in which the panel of the virtual user interface lies, and an interaction area around the second origin point; determine whether the interaction area intersects with at least one segment of the virtual widget; when it is determined that the interaction area intersects with the at least one segment of the virtual widget, identify a third point and a fourth point of said intersection; determine whether the third point and the fourth point lie on two different segments of the virtual widget, when it is determined that the third point and the fourth point lie on the two different segments of the virtual widget, determine interaction lengths of the two different segments within the interaction area, and select the given segment as that segment amongst the two different segments whose interaction length is higher than that of another segment amongst the two different segments; when it is determined that the third point and the fourth point do not lie on the two different segments of the virtual widget, select the given segment as that segment on which both the third point and the fourth point lie; determine a second perpendicular distance from the second origin point to the given segment; and determine whether the second perpendicular distance is less than or equal to a first predefined distance threshold, wherein it is determined that the pointer is in proximity to the given segment of the virtual widget based on the interaction characteristic of the pointer, and wherein the interaction characteristic influences the corresponding interaction effect associated with the given segment.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to Nelson Lam whose telephone number is (571)272-8044. The examiner can normally be reached 1pm-9pm.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Ke Xiao, can be reached at 571-272-7776. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/Nelson Lam/Examiner, Art Unit 2627
/KE XIAO/Supervisory Patent Examiner, Art Unit 2627