Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Specification
The disclosure is objected to because of the following informalities: page 9, paragraph 1 alternately refers to probability function 212 as either “function_1” or “funciton_1.” This function should be labeled consistently throughout the specification.
Appropriate correction is required.
Claim Rejections - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claims 1 and 14 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.
The term “uncertain” in claim 1 is a relative term which renders the claim indefinite. The term “uncertain” is not defined by the claim, the specification does not provide a standard for ascertaining the requisite degree, and one of ordinary skill in the art would not be reasonably apprised of the scope of the invention. It is unclear which physical properties, if any, may be considered “uncertain,” or how certainty can be determined. For instance, the specification describes a simplified example of a tree in which the shape of the tree body is certain, but the coloration or presence of leaves is uncertain. However, the trunk of a tree can grow as a function of time, or be chopped down or deformed by lightning. The examiner asserts that no collected virtual data can be 100% representative of a real object in any situation where data is not being collected continuously.
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claims 1-4 and 13-17 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Van Dusen (US 11080336 B2).
Regarding claim 1, Van Dusen teaches a method comprising:
storing constant data in a non-volatile computer storage (col. 526, lines 65-67: “A database is required, in one embodiment, to save much of the work that the collective set of users have accomplished. This database holds the CMMDB.”),
wherein the constant data is a portion of a data (col. 35, lines 2-6: “As used herein, the term “Common Mental Map Database” (CMMDB) refers to a stored collection of explanatory constructs making up a CMM, and all structural control and website data necessary for establishing and controlling the system.”),
wherein the constant data corresponds to physical properties of a plurality of virtual objects (col. 48, lines 9-12: “In one embodiment, Priority and Marking Filters mark displayed objects for importance or priority or other purpose utilizing shape enhancement, colors, fonts, shading, modified dimensions, etc.”),
wherein the physical properties of virtual objects remain constant when the data is read (col. 239, lines 52-55: “The data reader will store the locations and types of the original data sources, and the Query and Result Set Manager will record the actions of the user on that data.”),
wherein the constant data comprises at least one constant element(s) (col. 477, lines 15-26: “Category cnxpts prescribe their sub graph constraints. Each cnxpt is sized appropriately according to its relative importance and fit within its parent category cnxpt or grouping. Cnxpts are drawn with appropriate predefined shapes based upon their type. Where possible, positions previously calculated for a cnxpt, relative to its parent category, and secondarily relative to the elastic surface, are retained where no changes have occurred to the base information for the cnxpt. Cnxpts are kept from overlapping.”),
wherein the at least one constant element(s) is representative of physical properties of at least one of the plurality of virtual objects (col. 48, lines 9-12, as above); and
storing variable data in a non-volatile computer storage (col. 527, lines 47-53: “All of this will be transparent to the user through the use of a Data Abstraction Layer. This device will be able to keep track of internal and external data and present it to the user as one single data source. Users will still be able to re-import data that has changed or change the data in a linked data source, but the Data Abstraction Layer will show the data as if it comes from the same database.”),
wherein the variable data is a portion of the data (col. 527, lines 54-60: “The application will contain several plug-ins that will allow it to communicate with the various data sources. Additional plug-ins can be developed in the future by Patent Professionals or by a third party. The plug-ins will know how to open a particular type of data source and how to query it, and can thus manage the application's relationship with that given data source.”),
wherein the variable data corresponds to physical properties of the plurality of virtual objects (col. 48, lines 9-12, as above),
wherein the physical properties of virtual objects are uncertain during storing the data (col. 381, lines 57-59: “Generate a set of txo property vote summary items calculated for this cnxpt to generate a ‘fuzzy’ value for a single property of the cnxpt.”),
wherein the variable data comprises at least one variable element(s) (col. 381, lines 57-59, as above),
wherein the at least one variable element(s) is representative of uncertain physical properties of the at least one of the plurality of virtual objects (col. 48, lines 9-12, as above),
wherein each variable element comprises a range of values and a probability function for the range of values (col. 557, lines 47-48: “xx. assigning said cnxpt to a fuzzy cnxpt class with a fuzziness given by a probability density function;”).
Regarding claim 2, Van Dusen teaches the method of claim 1, wherein one of the constant data and the variable data corresponds to a 3D scene description (col. 4, lines 45-53: “the Representation step comprising: calculating the similarities of ttxs; summarizing fxxt calculation specifications to extract pertinent ttxs and relationships; forming representative scene graph maps; distributing the scene graphs to a user computing system; generating the visualization on the user computing system; accepting user navigation of and interaction with the visualization; accepting votes for refinement; accumulating user interest information; reforming the visualization”).
Regarding claim 3, Van Dusen teaches the method of claim 1, wherein the probability function for a first variable element of the at least one variable element(s) is chosen from the group consisting of a Gaussian distribution, a Poisson distribution, a Delta function, a discrete probability distribution, a continuous probability distribution, or a conditional probability distribution (col. 590, lines 23-28: “c. locating, for the set of second cnxpts, sample distributions having a significant variance indicating a multimodal probability density function for the test of whether a single belief is held by those voting on the association between the first cnxpt and one second cnxpt”).
Regarding claim 4, Van Dusen teaches the method of claim 1,
wherein the range of values is based on a user-defined range for each variable element (col. 78, lines 7-16: “The formulas specified will generally follow the style used for spreadsheet formulas, where relationship infxtypx reference iterators are similar to range specifications and specify, including, but not limited to a: relationship infxtypx, fxxts, scopxs, relationship role, relationship list; and cnxpt infxtypx references are similar to cell specifications and refer to, including, but not limited to: characteristic references, scopxs, cnxpt ranges, cnxpt lists, cnxpt characteristics, fxxts, txos, infxtypxs, txo characteristics, txo lists, and qualifications by txo characteristic.”),
wherein the probability function is based on a user-defined function for each variable element (col. 338, lines 4-15: “Fxxt calculation step templates provide, including but not limited to… …custom commonalities, such as: common text string; common specific value or range for some characteristic (attribute or txo property)”).
Regarding claim 13, Van Dusen teaches a computer program stored on a non-transitory medium, wherein the computer program when executed on a processor performs the method as claimed in claim 1 (col. 552, lines 22-29: “The Infrastructure described here provides a distributed framework and process for deployment, update, and administration of the CMM and CMMDB and the devices it is provided through. This framework encompasses the apparatus and process for implementing access, provisioning, and configuration policies, called ‘CMMSYS information packages’. A CMMSYS information package is a body of computer program code.”).
Claim 14 is substantially similar to claim 1, and differs primarily in that it recites an apparatus rather than a method. As such, it is rejected on a similar basis as claim 1.
Claim 15 is substantially similar to claim 2, and differs only in that it depends from claim 14 rather than claim 1. As such, it is rejected on a similar basis as claim 2.
Claim 16 is substantially similar to claim 3, and differs only in that it depends from claim 14 rather than claim 1. As such, it is rejected on a similar basis as claim 3.
Claim 17 is substantially similar to claim 4, and differs only in that it depends from claim 14 rather than claim 1. As such, it is rejected on a similar basis as claim 4.
Claims 5-9 and 18 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Bradski (US 20190094981 A1).
Regarding claim 5, Bradski teaches a method for controlling a sensor system based on data, wherein the data is representative of a plurality of virtual objects (par. 0176: “Data used to create an environment of a virtual world (including virtual objects) may include, for example, atmospheric data, terrain data, weather data, temperature data, location data, and other data used to define and/or describe a virtual environment.”), wherein the data comprises constant data and variable data (par. 0177: “Objects may be any type of animate or inanimate object, including but not limited to, buildings, plants, vehicles, people, animals, creatures, machines, data, video, text, pictures, and other users. Objects may also be defined in a digital world for storing information about items, behaviors, or conditions actually present in the physical world. The data that describes or defines the entity, object or item, or that stores its current state, is generally referred to herein as object data.”), the method comprising:
controlling the sensor system to search for real objects (par. 0608: “Based on this information, the AR system may retrieve data (e.g., specific geometries of real objects in the room, map points for the room, geometric information of the room, etc.) for that room to appropriately display virtual content in relation to the real objects of the identified room.”),
wherein the real objects correspond to the plurality of virtual objects (par. 0782: “Based on the recognized real objects and/or other information conveyed to the AR system, the desired virtual scene may be accordingly displayed to the user of the wearable AR system (5010).”),
wherein the plurality of virtual objects are represented in the variable data based on a range of values and/or probability functions of at least one variable element(s) (par. 0641: “One approach to find new points that avoids such a large search operation is by render rather than search. In other words, assuming the position of M keyframes are known and each of them has N points, the AR system may project lines (or cones) from N features to the M keyframes to triangulate a 3D position of the various 2D points. Referring now to FIG. 37, in this particular example, there are 6 keyframes 3702, and lines or rays are rendered (using a graphics card) from the 6 keyframes to the points 3704 derived from the respective keyframe. In one or more embodiments, new 3D map points may be determined based on the intersection of the rendered lines. In other words, when two rendered lines intersect, the pixel coordinates of that particular map point in a 3D space may be 2 instead of 1 or 0. Thus, the higher the intersection of the lines at a particular point, the higher the likelihood is that there is a map point corresponding to a particular feature in the 3D space. In one or more embodiments, this intersection approach, as shown in FIG. 37 may be used to find new map points in a 3D space.”); and
obtaining sensor data from the sensor system (par. 0842: “The system may generate various types of data and metadata from the collected sensor data.”),
wherein the sensor data is representative of physical properties of the real objects (par. 0183: “User devices may include additional components that enable user interaction such as sensors, wherein the objects and information (including gestures) detected by the sensors may be provided as input representing user interaction with the virtual world using the user device.”),
wherein the real objects correspond to the at least one variable element(s) (par. 0837: “This is implemented, for example, by having multiple cameras identify the same point from the projection (either from the textured light or from a real-world object), and then triangulating the correct location and depth information for that identified point through a texture extraction module 11508. This may be advantageous over the structured light and patterned light approaches because the texture pattern does not have to be known. Rather, the texture pattern is just triangulated from two more cameras. This is more robust to ambient light conditions.”).
Regarding claim 6, Bradski teaches the method of claim 5, further comprising:
converting the at least one variable element(s) to at least one temporary element(s) based on a portion of sensor data (par. 0203: “in some embodiments, the environment-sensing system (36) may include image-based 3D reconstruction software embedded in a local computing system (e.g., gateway component 14 or processor 38) and operable to digitally reconstruct one or more objects or information detected by the sensors 32.”),
wherein the at least one temporary element(s) are representative of physical properties of at least one of the plurality of virtual objects (par. 0203, as above); and
displaying the plurality of virtual objects based on the at least one constant element(s) and the at least one temporary element(s) (par. 0782: “if the information is new, object recognizers may run on the new data, and the data may be transmitted to one or more wearable AR systems (5008). Based on the recognized real objects and/or other information conveyed to the AR system, the desired virtual scene may be accordingly displayed to the user of the wearable AR system (5010). For example, the desired virtual scene (e.g., the walk with the user in San Francisco) may be displayed accordingly (e.g., comprising a set of real objects at the appropriate orientation, position, etc.) in relation to the various objects and other surroundings of the user in New York.”).
Regarding claim 7, Bradski teaches the method of claim 6, wherein the sensor system is arranged to update sensor data periodically, wherein the converting is repeated each time the sensor data is updated, wherein the displaying is repeated each time the sensor data is updated (par. 0191: “The part of the dynamic object that is changing can be updated by a real-time, threaded high priority data stream from a server 11, through computing network 5, managed by the gateway component 14.”).
Regarding claim 8, Bradski teaches the method of claim 7, wherein the range of values and/or the probability distribution of a first variable element is based on at least one item selected from the group consisting of previous values of the first variable element, values of a second variable element, or previous values of the second variable element (par. 0756: “The approach described herein provides a very complex artificial intelligence (AI) property by performing deterministic acts with completely deterministic globally visible mechanisms for transitioning from one state to another. These actions are implicitly map-able to a behavior that a user cares about. Constant insight through monitoring of these global values of an overall state of the system is required, which allows the insertion of other states or changes to the current state.”).
Regarding claim 9, Bradski teaches the method of claim 5, wherein the sensor data comprises at least one of visual sensor data (par. 0200: “The user sensing system 34 may also include one or more infrared camera sensors, one or more visible spectrum camera sensors, structured light emitters and/or sensors, infrared light emitters, coherent light emitters and/or sensors, gyros, accelerometers, magnetometers, proximity sensors, GPS sensors, ultrasonic emitters and detectors and haptic interfaces.”), infrared sensor data (par. 0200, as above), microwave sensor data, ultrasound sensor data, audio sensor data (par. 0205: “in some embodiments, the environment-sensing system 36 may include a microphone for receiving audio from the local environment.”), position sensor data, accelerometer sensor data (par. 0200, as above), or global positioning system data (par. 0200, as above).
Regarding claim 18, Bradski teaches a computer program stored on a non-transitory medium, wherein the computer program when executed on a processor performs the method as claimed in claim 5 (par. 0170: “The one or more servers 11 each comprise one or more processors for executing program instructions. The servers may also include memory for storing the program instructions and data that is used and/or generated by processes being carried out by the servers 11 under direction of the program instructions.”).
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 10-12 are rejected under 35 U.S.C. 103 as being unpatentable over Van Dusen (US 11080336 B2) as applied to claim 1 above, and further in view of Bradski (US 20190094981 A1).
Regarding claim 10, Van Dusen teaches the method of claim 1, but fails to teach obtaining a historic sensor data log from a sensor system,
wherein the historic sensor data log comprises the physical properties, and changes of the physical properties,
wherein the physical properties, and changes of the physical properties correspond to the at least one variable element(s); and
modifying the range of values and/or probability functions of the at least one variable element(s) based on historic data of the range of values and/or probability functions.
Bradski teaches obtaining a historic sensor data log from a sensor system (par. 0847: “In some embodiments, as the user utilizes the wearable device, historical data about the user is being acquired and maintained, e.g., to reflect location, activity, and copies of sensor data for that user over a period of time.”),
wherein the historic sensor data log comprises the physical properties, and changes of the physical properties (par. 0847, as above),
wherein the physical properties, and changes of the physical properties correspond to the at least one variable element(s) (par. 0847, as above); and
modifying the range of values and/or probability functions of the at least one variable element(s) based on historic data of the range of values and/or probability functions (par. 0939: “To estimate a pose at n, the wearable system may use historical data gathered from S-poses and O-poses (n−1, n−2, n−3, etc.). The pose at n is then used to project fiducials into the image captured at n to create an image mask from the projection. The wearable system extracts points from the masked regions and calculates the O-pose from the extracted points and mature world fiducials.”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate Bradski’s historic sensor data logging from AR/VR systems into Van Dusen’s fuzzy concept mapping and technology prediction system, as both are in the same field of endeavor of machine learning-based predictive technology. Combining Bradski’s historic data log with Van Dusen’s crowd-sourced data collection method would allow the system to collect far more information with which to make more accurate predictions.
Regarding claim 11, Van Dusen and Bradski teach the method of claim 10. Bradski further teaches wherein modifying the range of values and/or probability functions is based on the output of a machine learning algorithm (par. 0844: “One or more remote servers can be used to perform the processing 11602 (e.g., machine learning processing) to analyze sensor data and to identify/generate the relevant semantic map data.”),
wherein the machine learning algorithm is trained to identify patterns between the historic data of the range of values and/or probability functions and historic sensor data logs (par. 0844, as above).
Regarding claim 12, Van Dusen and Bradski teach the method of claim 10. Bradski further teaches modifying the range of values and/or the probability function based on the maximum a posteriori estimate of the historic sensor data log (par. 0939: “To estimate a pose at n, the wearable system may use historical data gathered from S-poses and O-poses (n−1, n−2, n−3, etc.). The pose at n is then used to project fiducials into the image captured at n to create an image mask from the projection.”).
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to RYAN A BARHAM whose telephone number is (571)272-4338. The examiner can normally be reached Mon-Fri, 8:30am-5pm EST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Xiao Wu, can be reached at (571) 272-7761. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/RYAN ALLEN BARHAM/Examiner, Art Unit 2613
/XIAO M WU/Supervisory Patent Examiner, Art Unit 2613