DETAILED ACTION
1. This action is responsive to the communication filed on 12/15/2023.
2. Claims 1-12 and 14-16 are pending in the case.
3. Claim 13 is not present in the claim set, and its status is unknown.
4. Claims 1, 12, 14 and 15 are independent claims.
Claim Objections
Claim 13 is objected to because of the following informalities:
Claim 13 is missing from the claim set, and it is unclear whether claim 13 has been cancelled.
Appropriate correction is required.
Drawings
The drawings are objected to as failing to comply with 37 CFR 1.84(p)(5) because they do not include the following reference sign(s) mentioned in the description: reference no. 20 (see sensor 20 of par. 146). Corrected drawing sheets in compliance with 37 CFR 1.121(d) are required in reply to the Office action to avoid abandonment of the application. Any amended replacement drawing sheet should include all of the figures appearing on the immediate prior version of the sheet, even if only one figure is being amended. Each drawing sheet submitted after the filing date of an application must be labeled in the top margin as either “Replacement Sheet” or “New Sheet” pursuant to 37 CFR 1.121(d). If the changes are not accepted by the examiner, the applicant will be notified and informed of any required corrective action in the next Office action. The objection to the drawings will not be held in abeyance.
The drawings are objected to as failing to comply with 37 CFR 1.84(p)(5) because they include the following reference character(s) not mentioned in the description: reference no. 22 of Figure 3. Corrected drawing sheets in compliance with 37 CFR 1.121(d), or amendment to the specification to add the reference character(s) in the description in compliance with 37 CFR 1.121(b) are required in reply to the Office action to avoid abandonment of the application. Any amended replacement drawing sheet should include all of the figures appearing on the immediate prior version of the sheet, even if only one figure is being amended. Each drawing sheet submitted after the filing date of an application must be labeled in the top margin as either “Replacement Sheet” or “New Sheet” pursuant to 37 CFR 1.121(d). If the changes are not accepted by the examiner, the applicant will be notified and informed of any required corrective action in the next Office action. The objection to the drawings will not be held in abeyance.
The drawings are objected to as failing to comply with 37 CFR 1.84(p)(4) because reference character “21” has been used to designate both user interface 21 and camera 21 (see par. 147). Corrected drawing sheets in compliance with 37 CFR 1.121(d) are required in reply to the Office action to avoid abandonment of the application. Any amended replacement drawing sheet should include all of the figures appearing on the immediate prior version of the sheet, even if only one figure is being amended. Each drawing sheet submitted after the filing date of an application must be labeled in the top margin as either “Replacement Sheet” or “New Sheet” pursuant to 37 CFR 1.121(d). If the changes are not accepted by the examiner, the applicant will be notified and informed of any required corrective action in the next Office action. The objection to the drawings will not be held in abeyance.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 2-5 are rejected under 35 U.S.C. 101 because the claimed invention is directed to a judicial exception (i.e., a law of nature, a natural phenomenon, or an abstract idea) without significantly more.
Step 1
Claims 1-11 are directed towards a method (i.e., process) for managing at least one display window of a virtual workstation in an immersive environment.
Claim 12 is directed towards a method (i.e., process) for managing an immersive environment comprising a virtual workstation.
Claim 14 is directed towards an apparatus (i.e., machine) for managing an immersive environment comprising a virtual workstation.
Claims 15-16 are directed towards an immersive reality system (i.e., machine) for managing at least one display window of a virtual workstation in an immersive environment.
Therefore, claims 1-12 and 14-16 fall within one of the enumerated statutory categories of eligible subject matter under 35 U.S.C. § 101.
Claim 1 recites: A method performed by a management device and comprising: managing at least one display window of a virtual workstation in an immersive environment, the display window being associated with a processing reproducing data produced by the associated processing, managing comprising:
setting a parameter for reproducing the display window, during a modification of the display window, to a value that is determined as a function of a current user context.
Claim 2 depends on claim 1:
the method comprising determining the value of the parameter as a function of the current user context.
Claim 3 depends on claim 1:
the method comprising determining the value of the parameter as a function of a set reproduction zone, in which a reproduction of a display window of the virtual workstation is subject to a reproduction rule, the set reproduction zone depending on the current user context.
Claim 4 depends on claim 3:
the method comprising determining the set reproduction zone as a function of the current user context prior to the determining of the value of the parameter.
Claim 5 depends on claim 3:
wherein the reproduction rule associated with the set reproduction zone is a rule from among the following rules: an authorization for reproducing display windows in the set reproduction zone, called authorized zone; a prohibition for reproducing display windows in the set reproduction zone, called prohibited zone; a prioritization for reproducing display windows in the set reproduction zone, called ideal zone, as long as the set reproduction zone has enough free space for reproducing the display window; a tolerance for reproducing display windows in the set reproduction zone, called tolerance zone, in which reproduction of a display window is authorized if the ideal zone does not have enough free space for reproducing the display window.
The bolded limitations, under their broadest reasonable interpretation, cover an abstract idea, which includes a mental process, because they recite concepts performed in the human mind (including an observation, evaluation, judgment, or opinion). Any limitations not identified above as part of the abstract idea are deemed "additional elements" and are underlined.
Step 2A-Prong 1
If a claim limitation, under its broadest reasonable interpretation, covers concepts performed in the human mind (including an observation, evaluation, judgment, or opinion), then it falls within the "mental process" grouping of abstract ideas. The independent claims recite methods and machines for managing display windows of a virtual workstation in an immersive environment. Specifically, the dependent claims 2-5 recite determining the value of the parameter as a function of the current user context; (see claim 2)
determining the value of the parameter as a function of a set reproduction zone, in which a reproduction of a display window of the virtual workstation is subject to a reproduction rule, the set reproduction zone depending on the current user context. (see claim 3)
determining the set reproduction zone as a function of the current user context prior to the determining of the value of the parameter. (see claim 4)
wherein the reproduction rule associated with the set reproduction zone is a rule from among the following rules: an authorization for reproducing display windows in the set reproduction zone, called authorized zone; a prohibition for reproducing display windows in the set reproduction zone, called prohibited zone; a prioritization for reproducing display windows in the set reproduction zone, called ideal zone, as long as the set reproduction zone has enough free space for reproducing the display window; a tolerance for reproducing display windows in the set reproduction zone, called tolerance zone, in which reproduction of a display window is authorized if the ideal zone does not have enough free space for reproducing the display window. (see claim 5)
These limitations, under their broadest reasonable interpretation, cover performance of the limitations in the human mind, or by a human using pen and paper. Therefore, these limitations are grouped within the "mental process" grouping (including an observation, evaluation, judgment, or opinion) of abstract ideas (see MPEP 2106.04(a)(2)(III)). Accordingly, dependent claims 2-5 recite an abstract idea.
Step 2A-Prong 2
Independent claim 1 merely uses computer elements as tools to perform the abstract idea and generally links the use of a judicial exception to a particular technological environment. The use of computer elements as tools to implement the abstract idea, and the general linking of the abstract idea to a particular technological environment, does not render the claim patent eligible because it requires no more than a computer performing functions that correspond to acts required to carry out the abstract idea.
Independent claim 1 recites the following combination of additional elements: 1) the management device is recited at a high level of generality such that it amounts to no more than a generic computer component used to apply the exception, or amounts to merely invoking a computer as a tool to perform the abstract idea.
2) managing at least one display window of a virtual workstation in an immersive environment, the display window being associated with a processing reproducing data produced by the associated processing, generally links the abstract idea to a particular technological environment or field of use. MPEP 2106.04(d)(I) indicates that generally linking an abstract idea to a particular technological environment or field of use cannot provide a practical application. Specifically, the computer elements may be any number of hardware architectures including processors, user devices, computer devices, and function to perform managing and reproducing data (see specification paras. 84-85, Figure 1).
3) managing comprising: setting a parameter for reproducing the display window, during a modification of the display window, to a value that is determined as a function of a current user context is merely data gathering, i.e., well-known, insignificant pre-solution extra-solution activity.
Integration into a practical application requires an additional element or a combination of additional elements in the claim to apply, rely on, or use the judicial exception in a manner that imposes a meaningful limit on the judicial exception, such that the claim is more than a drafting effort designed to monopolize the exception.
As explained above, the additional elements do not impose any meaningful limits on practicing the abstract idea, and the additional limitations are not indicative of integration into a practical application. Accordingly, the claims are directed to an abstract idea.
Step 2B
Independent claim 1 does not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above with respect to integration of the abstract idea into a practical application, the additional elements of 1) the management device and 2) managing at least one display window of a virtual workstation in an immersive environment, the display window being associated with a processing reproducing data produced by the associated processing, to perform the noted steps amount to no more than mere instructions to apply the exception using generic computer components. Generic computer elements recited as performing generic computer functions that are well-understood, routine, or conventional activities amount to no more than implementing the abstract idea with a computerized system. Mere instructions to apply an exception using a generic computer component cannot provide an inventive concept ("significantly more").
3) managing comprising: setting a parameter for reproducing the display window, during a modification of the display window, to a value that is determined as a function of a current user context is considered well-understood, routine, conventional activity in the field. (Symantec, 838 F.3d at 1321, 120 USPQ2d at 1362 (utilizing an intermediary computer to forward information); OIP Techs., Inc. v. Amazon.com, Inc., 788 F.3d 1359, 1363, 115 USPQ2d 1090, 1093 (Fed. Cir. 2015) (sending messages over a network); buySAFE, Inc. v. Google, Inc., 765 F.3d 1350, 1355, 112 USPQ2d 1093, 1096 (Fed. Cir. 2014) (computer receives and sends information over a network); Bancorp Services v. Sun Life, 687 F.3d 1266, 1278, 103 USPQ2d 1425, 1433 (Fed. Cir. 2012) ("The computer required by some of Bancorp's claims is employed only for its most basic function, the performance of repetitive calculations, and as such does not impose meaningful limits on the scope of those claims."); see MPEP 2106.05(d).) Here, the claim limitation of setting a parameter for reproducing the display window, during a modification of the display window, to a value that is determined as a function of a current user context is similar to the receiving, sending, and performing calculations stated above.
Therefore, the additional elements in the independent claims do not amount to significantly more than a judicial exception. Furthermore, there is no indication that the additional limitations alone or in combination improves the functioning of a computer or any other technology, improves another technology or technical field, or effects a transformation or reduction of a particular article to a different state or thing. Therefore, the claims are not patent eligible.
Remaining Claims
Claims 1, 6-12 and 14-16 do not recite an abstract idea and therefore are not rejected under 35 U.S.C. §101.
Claim Rejections - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claims 1-12 and 14-16 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention. The dependent claims included in the statement of rejection but not specifically addressed in the body of the rejection inherit the deficiencies of their parent claims and have not resolved those deficiencies. Therefore, they are rejected based on the same rationale as applied to their parent claims above.
Claims 1, 12, 14 and 15:
Claims 1, 12, 14 and 15 recite the limitation "the associated processing". There is insufficient antecedent basis for this limitation in the claims. The Examiner suggests amending to recite "the display window being associated with a processing reproducing data produced by the processing".
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claims 1-9, 11, 12 and 14-16 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Vranjes et al. (hereinafter “Vranjes”), U.S. Published Application No. 20160034155 A1.
Claim 1:
Vranjes teaches A method performed by a management device and comprising: managing at least one display window of a virtual workstation in an immersive environment, the display window being associated with a processing reproducing data produced by the associated processing, managing comprising:
(e.g., managing application windows in a virtual immersive environment par. 42; Thus, in some cases multi-application environment module 118 presents a multi-application environment as an immersive environment par. 54; Alternately or additionally, multi-application environment 202 may be implemented as a desktop, virtual or otherwise, and include a control area, which is shown as application management UI 210 or a start menu (not shown).)
setting a parameter for reproducing the display window, during a modification of the display window, to a value that is determined as a function of a current user context. (e.g., setting a new size or position (i.e., parameter) of an application window based on dragging input by a user (i.e., current user context such as declared input of a user interface as described by instant specification par. 85) par. 63; In the context of workspace 400, assume window 438 is being dragged into a corner region of workspace 400. Here, window manager 132 identifies an edge of application window 410, which is adjacent to the region into which application window 438 is moving. Par. 64; At 306, a size or a position is determined for the application window based on the edge of the other application window. The size or position of the application window may be determined such that the application window fills the region to the edge of the other application window. In some cases, the size or position is determined such that an edge of the application window aligns with a complimentary edge of an adjacent application window. In such cases, the application window and adjacent application window may have a same width or a same height. Par. 69; Placement of the application window may be responsive to input to add, switch, or move an application window in the multi-application environment. In some cases, the input is a gesture or edge trigger action in which an application window, or visual representation thereof, is dragged to or moved against an edge of the multi-application environment.)
Claim 2 depends on claim 1:
Vranjes teaches the method comprising determining the value of the parameter as a function of the current user context. (e.g., determining the value of size or position of application window based on user input (i.e., current user context) par. 63; In the context of workspace 400, assume window 438 is being dragged into a corner region of workspace 400. par. 64; At 306, a size or a position is determined for the application window based on the edge of the other application window. The size or position of the application window may be determined such that the application window fills the region to the edge of the other application window. In some cases, the size or position is determined such that an edge of the application window aligns with a complimentary edge of an adjacent application window. In such cases, the application window and adjacent application window may have a same width or a same height. par. 69; Placement of the application window may be responsive to input to add, switch, or move an application window in the multi-application environment. In some cases, the input is a gesture or edge trigger action in which an application window, or visual representation thereof, is dragged to or moved against an edge of the multi-application environment.)
Claim 3 depends on claim 1:
Vranjes teaches the method comprising determining the value of the parameter as a function of a set reproduction zone, in which a reproduction of a display window of the virtual workstation is subject to a reproduction rule, the set reproduction zone depending on the current user context. (e.g., determining the size of an application window based on a region of the workspace, which depends on the user input (i.e., current user context). Examiner considers "aligning with a complimentary edge of an adjacent application window" as an example of the recited "reproduction rule." par. 63; In the context of workspace 400, assume window 438 is being dragged into a corner region of workspace 400. Here, window manager 132 identifies an edge of application window 410, which is adjacent to the region into which application window 438 is moving. Par. 64; At 306, a size or a position is determined for the application window based on the edge of the other application window. The size or position of the application window may be determined such that the application window fills the region to the edge of the other application window. In some cases, the size or position is determined such that an edge of the application window aligns with a complimentary edge of an adjacent application window.)
Claim 4 depends on claim 3:
Vranjes teaches the method comprising determining the set reproduction zone as a function of the current user context prior to the determining of the value of the parameter. (e.g., determining user-defined regions of a workspace prior to determining the size of dragged application windows par. 58; In some cases, a user may define or configure particular areas (e.g., sections or strips of screen area) within the multi-application environment as user-defined regions. par. 59; The region may be fixed, predefined, or dynamic, such as a region that changes size or position due to an orientation of a display or type of input received. In some cases, a region may be associated with a corresponding operation, such as a “snap” operation, which fills the region with an application window at a predefined size or predefined position.)
Claim 5 depends on claim 3:
Vranjes teaches wherein the reproduction rule associated with the set reproduction zone is a rule from among the following rules: an authorization for reproducing display windows in the set reproduction zone, called authorized zone; a prohibition for reproducing display windows in the set reproduction zone, called prohibited zone; a prioritization for reproducing display windows in the set reproduction zone, called ideal zone, as long as the set reproduction zone has enough free space for reproducing the display window; a tolerance for reproducing display windows in the set reproduction zone, called tolerance zone, in which reproduction of a display window is authorized if the ideal zone does not have enough free space for reproducing the display window. (e.g., user-defined regions of a workspace that allow dragged application windows (i.e., authorized zone) par. 58; In some cases, selection of the region is received via an application window being added to, switched to, or moved within the multi-application environment. par. 59; The region may be fixed, predefined, or dynamic, such as a region that changes size or position due to an orientation of a display or type of input received.)
Claim 6 depends on claim 1:
Vranjes teaches wherein modifying the display window is a modification from among the following modifications: opening the display window; moving the display window; resizing the display window. (e.g., moving application window into a region of a workspace par. 63; In the context of workspace 400, assume window 438 is being dragged into a corner region of workspace 400. Here, window manager 132 identifies an edge of application window 410, which is adjacent to the region into which application window 438 is moving.)
Claim 7 depends on claim 1:
Vranjes teaches wherein the parameter for reproducing the display window is a parameter from among the following parameters: a reproduction position of the display window; a width of the display window; a height of the display window; a size of the display window. (e.g., position, width, height and size of application window are determined parameters par. 64; In some cases, the size or position is determined such that an edge of the application window aligns with a complimentary edge of an adjacent application window. In such cases, the application window and adjacent application window may have a same width or a same height.)
Claim 8 depends on claim 1:
Vranjes teaches wherein the method comprises: measuring the current user context by using at least one sensor. (e.g., touch based sensors par. 38; Input mechanisms 142 may include gesture-sensitive sensors and devices, such as touch-based sensors and movement-tracking sensors (e.g., camera-based), as well as mice (free-standing or integral with a keyboard), a stylus, touch pads, accelerometers, and microphones with accompanying voice recognition software, to name a few. Input mechanisms 136 may be separate or integral with displays 134; integral examples include gesture-sensitive displays with integrated touch-sensitive or motion-sensitive sensors.)
Claim 9 depends on claim 1:
Vranjes teaches wherein the method comprises requesting the current user context by using a user interface and receiving a declared user context from the user interface. (e.g., presenting user defined regions of workspace responsive to gesture input is considered implicitly requesting current user context via user interface and receiving the gesture input is an example of receiving a declared user context par. 38; Input mechanisms 142 may include gesture-sensitive sensors and devices, such as touch-based sensors and movement-tracking sensors (e.g., camera-based) par. 58; In some cases, a user may define or configure particular areas (e.g., sections or strips of screen area) within the multi-application environment as user-defined regions. par. 59; The region may be fixed, predefined, or dynamic, such as a region that changes size or position due to an orientation of a display or type of input received.)
Claim 11 depends on claim 1:
Vranjes teaches wherein the current user context comprises data relating to a user during the modification of the display window from among the following data:
declared data of the current user context; (e.g., gesture input; par. 38; Input mechanisms 142 may include gesture-sensitive sensors and devices, such as touch-based sensors and movement-tracking sensors (e.g., camera-based)
measured data of the current user context; (e.g., sensors measuring data of gesture input par. 38; Input mechanisms 142 may include gesture-sensitive sensors and devices, such as touch-based sensors and movement-tracking sensors (e.g., camera-based))
position data of the user; (e.g., sensors measuring position data of gesture input par. 38; Input mechanisms 142 may include gesture-sensitive sensors and devices, such as touch-based sensors and movement-tracking sensors (e.g., camera-based))
field of vision data of the user;
mobility data of the user;
data relating to a real environment in which the user is located.
Independent Claim 12:
Vranjes teaches A method implemented by a device and comprising:
managing an immersive environment comprising a virtual workstation,
the virtual workstation comprising a display window associated with a processing reproducing data produced by the associated processing,
(e.g., managing application windows in a virtual immersive environment par. 42; Thus, in some cases multi-application environment module 118 presents a multi-application environment as an immersive environment par. 54; Alternately or additionally, multi-application environment 202 may be implemented as a desktop, virtual or otherwise, and include a control area, which is shown as application management UI 210 or a start menu (not shown).)
the managing comprising: triggering a modification of a display window, the triggering of the modification of the display window triggering setting a parameter for reproducing a display window to a value that is determined as a function of a current user context prior to the reproduction of the modified display window. (e.g., triggering by dragging an application window to a region of a workspace allows the window manager to identify when to set a size and/or position parameter of the application window. Examiner notes that the dragging is prior to reproduction of the application window and that the dragging input by a user is considered "a function of a current user context" such as declared input of a user interface as described by instant specification par. 85. par. 63; In the context of workspace 400, assume window 438 is being dragged into a corner region of workspace 400. Here, window manager 132 identifies an edge of application window 410, which is adjacent to the region into which application window 438 is moving. Par. 64; At 306, a size or a position is determined for the application window based on the edge of the other application window. The size or position of the application window may be determined such that the application window fills the region to the edge of the other application window. In some cases, the size or position is determined such that an edge of the application window aligns with a complimentary edge of an adjacent application window. In such cases, the application window and adjacent application window may have a same width or a same height. Par. 69; Placement of the application window may be responsive to input to add, switch, or move an application window in the multi-application environment. In some cases, the input is a gesture or edge trigger action in which an application window, or visual representation thereof, is dragged to or moved against an edge of the multi-application environment.)
Independent Claim 14:
Claim 14 is substantially encompassed by claim 1; therefore, the Examiner relies on the same rationale set forth for claim 1 to reject claim 14.
Independent Claim 15:
Vranjes teaches An immersive reality system comprising:
an immersive environment reproduction device; (e.g., tablet device; par. 7; FIG. 2 illustrates an example tablet computing device having a touch-sensitive display presenting an immersive interface.)
a user context receiver; (e.g., sensors par. 38; FIG. 1 illustrates four example displays, which may be separate or integrated with computing device 102. Input mechanisms 142 may include gesture-sensitive sensors and devices, such as touch-based sensors and movement-tracking sensors (e.g., camera-based), as well as mice (free-standing or integral with a keyboard), a stylus, touch pads, accelerometers, and microphones with accompanying voice recognition software, to name a few. Input mechanisms 136 may be separate or integral with displays 134; integral examples include gesture-sensitive displays with integrated touch-sensitive or motion-sensitive sensors.)
and an immersive environment manager comprising a virtual workstation, the virtual workstation comprising a display window associated with a processing reproducing data produced by the associated processing, (e.g., managing application windows in a virtual immersive environment par. 42; Thus, in some cases multi-application environment module 118 presents a multi-application environment as an immersive environment par. 54; Alternately or additionally, multi-application environment 202 may be implemented as a desktop, virtual or otherwise, and include a control area, which is shown as application management UI 210 or a start menu (not shown).)
the immersive environment manager comprising a trigger for triggering a modification of the display window, the triggering of the modification of the display window triggering setting a parameter for reproducing the display window to a value that is determined as a function of a current user context prior to the reproduction of the modified display window. (e.g., triggering by dragging an application window to a region of a workspace allows the window manager to identify when to set a size and/or position parameter of the application window. Examiner notes that the dragging is prior to reproduction of the application window and that the dragging input by a user is considered “a function of a current user context,” such as declared input of a user interface as described by the instant specification at par. 85. Par. 63; In the context of workspace 400, assume window 438 is being dragged into a corner region of workspace 400. Here, window manager 132 identifies an edge of application window 410, which is adjacent to the region into which application window 438 is moving. Par. 64; At 306, a size or a position is determined for the application window based on the edge of the other application window. The size or position of the application window may be determined such that the application window fills the region to the edge of the other application window. In some cases, the size or position is determined such that an edge of the application window aligns with a complimentary edge of an adjacent application window. In such cases, the application window and adjacent application window may have a same width or a same height. Par. 
69; Placement of the application window may be responsive to input to add, switch, or move an application window in the multi-application environment. In some cases, the input is a gesture or edge trigger action in which an application window, or visual representation thereof, is dragged to or moved against an edge of the multi-application environment. )
Claim 16 depends on claim 15:
Vranjes teaches wherein the user context receiver is a device from among the following devices: a user interface of the immersive environment reproduction device; a sensor. (e.g., sensors, pars. 36 and 38; FIG. 1 illustrates four example displays, which may be separate or integrated with computing device 102. Input mechanisms 142 may include gesture-sensitive sensors and devices, such as touch-based sensors and movement-tracking sensors (e.g., camera-based), as well as mice (free-standing or integral with a keyboard), a stylus, touch pads, accelerometers, and microphones with accompanying voice recognition software, to name a few. Input mechanisms 136 may be separate or integral with displays 134; integral examples include gesture-sensitive displays with integrated touch-sensitive or motion-sensitive sensors.)
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claim 10 is rejected under 35 U.S.C. 103 as being unpatentable over Vranjes as cited above, in view of Gorur et al. (hereinafter “Gorur”), U.S. Published Application No. 20190026936 A1.
Claim 10 depends on claim 1:
Vranjes fails to expressly teach wherein the current user context comprises a user constraint.
However, Gorur teaches wherein the current user context comprises a user constraint. (e.g., determining viewpoint or visibility constraints of a user par. 24; the extended reality computing system or the mobile device can determine a portion of the virtual environment (e.g., a “scene”) currently visible to the user through the HMD or augmented reality eyewear. Par. 28; a determined visibility of a face of each of the other users to the virtual assistant disposed at the particular candidate position par. 89; Based on the relationships shown in FIGS. 6B-6D, position determination module 170 determines c.sub.vis, a value indicative of visibility of a face of the user 601 to the candidate virtual assistant 604A disposed at the candidate position. Par. 95; In view of the determined lack of visibility, position determination module 170 can assign a value of 100 to c.sub.vis for user 601 and the candidate position associated with candidate virtual assistant 604A. par. 98; The computed placement scores may reflect structural and physical constraints imposed on the candidate positions by the augmented reality environment and, further, may reflect the visibility and proximity of the virtual assistant to each user within the augmented reality environment when the virtual assistant is placed at each of the candidate positions.)
In the same field of endeavor, namely determining desired placement of virtual objects for a user, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the placement of virtual objects as taught by Vranjes to be based on the visibility constraints of a user as taught by Gorur, in order to provide the benefit of improving the interaction between virtual objects and the user (see Gorur, par. 2).
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
Sakai; Yusuke et al. US 10785447 B2
Col. 6 line 8; Further, the space information acquisition unit 102 recognizes a user on the basis of the acquired space information. To recognize a user, it is assumed to identify the individual user in that space, or recognize the position (where the user is in the room, or the like), attitude (whether the user is standing, sitting, lying, or the like), emotion (whether the user is happy, sad, or the like), action (the user is cooking dinner, watching television, reading a book, or the like), busyness degree (whether or not the user is busying, or the like) of the user.
Bae; Seok Hyung US 20200242844 A1
[0090] In response to a hand motion of the user 50, the controller 120 may move the selected virtual window 510 to a location of the virtual window 520. The controller 120 may scale up the virtual window 510 based on the locations of the eyes 70 and the location of the hand 60. At this time, the virtual window 510 and the virtual window 520 may be the same in absolute size as each other and may differ in apparent size from each other. For example, the virtual window 520 may be larger in apparent size than the virtual window 510. That is, the controller 120 may scale up the virtual window 510 to the virtual window 520.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to HENRY ORR whose telephone number is (571)270-1308. The examiner can normally be reached 9AM-5PM EST M-F.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Adam Queler, can be reached at (571)272-4140. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/HENRY ORR/Primary Examiner, Art Unit 2172