DETAILED ACTION
This communication is in response to the amendment/remarks filed 17 September 2025.
Claims 6, 18, and 20 have been canceled. Claims 23-25 have been added. Claims 1-5, 7-12, 15-17, and 19 have been amended.
Claims 1-5, 7-17, 19, and 23-25 are currently pending.
Claims 1-5, 7-17, 19, and 23-25 are rejected.
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Amendment/Remarks
Regarding 35 USC § 101, Applicant’s arguments have been fully considered but are not persuasive. Applicant argues that the limitation added to the independent claims, determining dwell duration, “does not fall within one of the enumerated categories of organizing human activity.” Remarks at 11. Examiner respectfully disagrees. Determining the dwell duration of a user in front of an advertisement is an advertising activity. Exposure probability is a metric often utilized in advertising; thus, even with these added limitations, the claims continue to recite an abstract idea.
Applicant relies on the SiRF case, arguing that “a human mind cannot practically perform calculations based upon [GPS data, accelerometer data, and/or gyroscope data].” Remarks at 12. Given GPS, accelerometer, or gyroscope data, a human can certainly perform a calculation based on that data. Of note, the claims in SiRF recited a GPS receiver, and the court determined that the receiver played a “significant part in permitting the claimed method to be performed.” In contrast, the current claims merely utilize received data; the GPS data received is not a significant part of permitting the claimed method to be performed. This argument is not persuasive.
Regarding 35 USC § 102/103, these arguments are moot in view of the rejections below.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1-5, 7-17, 19, and 23-25 are rejected under 35 U.S.C. 101 because the claimed invention is directed to a judicial exception (i.e., a law of nature, a natural phenomenon, or an abstract idea) without significantly more.
Step 1
The claims recite a series of steps and are, therefore, directed to a process.
Step 2A-Prong One
(Claims 1, 3, 15, and 19) The “compute exposure probabilities based on a comparison of the received device location information to the accessed object placement information” step, as drafted, is a process that, under its broadest reasonable interpretation, covers performance of the limitation in the mind but for the recitation of generic computer components. For example, but for the “server computer” (claim 1), “computer-implemented” (claim 3), “module” (claim 15), or “computing system” (claim 19) language, the claim encompasses a user manually determining an exposure probability based on known information.
Claims 1-5, 7-17, 19, and 23-25 recite the concept of determining, based on a user’s location and a billboard’s location, the probability that a user actually viewed the advertisement on the billboard. This concept falls into the certain methods of organizing human activity grouping including advertising activities. Thus, the claims recite an abstract idea.
The dependent claims further limit the abstract idea by narrowing the object placement information and location information and by narrowing how the probability is determined. The dependent claims do not remove the claims from any of the abstract idea groupings.
The mere nominal recitation of a generic server apparatus does not take the claim limitations out of the mental processes grouping. Thus, the claims recite a mental process.
Step 2A-Prong Two
This judicial exception is not integrated into a practical application. The claims recite the additional elements of a system comprising a server computer and memory (claims 1, 2, and 25), a computer (claims 3-5 and 7-14), a system comprising modules (claims 15-17), or a computer-readable storage medium (claims 19, 23, and 24), and include no more than mere instructions to apply the exception using these generic computer components. These generic computer components do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea.
Additionally, the step of “receive device location information” is mere data gathering. This step is considered insignificant extra-solution activity and does not integrate the abstract idea into a practical application.
Step 2B
The claims do not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed previously with respect to Step 2A-Prong Two, the additional element in the claim amounts to no more than mere instructions to apply the exception using a generic computer component. The same analysis applies here in Step 2B, i.e., mere instructions to apply an exception using a generic computer component cannot integrate a judicial exception into a practical application at Step 2A or provide an inventive concept in Step 2B. See MPEP 2106.05(f). The claims do not provide an inventive concept (significantly more than the abstract idea). The claims are ineligible.
Receiving device location information is considered routine, conventional, and well-understood. See MPEP 2106.05(d)(II)(i) (receiving or transmitting data over a network).
Further, Claims 19, 23, and 24 are rejected under 35 U.S.C. 101 because the broadest reasonable interpretation of “computer-readable storage medium” in light of the specification, as it would be interpreted by one of ordinary skill in the art, encompasses transitory forms of signal transmission, which are non-statutory subject matter.
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale or otherwise available to the public before the effective filing date of the claimed invention.
Claims 15 and 17 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by U.S. 2011/0161163 (“Carlson”).
Regarding Claim 15, Carlson teaches a system comprising:
a device location module that receives device location information associated with a mobile device positioned at a certain geographic location (See “the data processing apparatus or system 100 can be configured as a device such as a Smart Phone or other computing device. System 100 as depicted in FIG. 1 can include, in some embodiments, a wearable brain activity-sensing device or module 102 and an eye tracking device or module 103 that can monitor the relative direction of an eyeball of an individual with respect to a relative direction of signage that is in the field of view of the eyeball” in ¶ 0023, “System 100 can additionally include processor 112 for processing and correlating the user, content, location, surveillance, direction, velocity and field of view data collected by the system sensors such as from the surveillance module 104, eye tracking module 103, brain activity-sensing module 102 and the location detection system 116. The location detection system 116 can include at least one sensor that detects the location and velocity of the system 100 or the user, relative to the earth” in ¶ 0027, “Such systems can detect location and velocity with a cellular communications system using tower directions and a process know[n] as triangulation in a network location system, such as a wireless local area network location system, a GPS system, or a beacon type system, for example” in ¶ 0028, and “The location detection module 116 can continue collecting data regarding the individual's location as that individual moves in relation to signs and specific media content. The location information can be collected and stored in the individual's mobile communications device, which may then be transmitted to a server 124 operated by a service provider at a remote location for processing” in ¶ 0034.);
a placement information module that receives object placement information associated with content presented by a physical object at the certain geographic location (See “A database can store the locus of points that define the plane of the sign as it relates to coordinates, or a street addressed on the earth” in ¶ 0058 and “Because the signage locations can be stored in a sign location database as a locus of points, in addition to street addresses of the signs, the location detection system 116 of the system 100 can locate and transmit the signage location using data contained in the aforementioned sign location database. A remote computer server can then compute the locus of points that define the sign and the locus of coordinates that define the user field of view. The server can then determine if the sign is within the user's field of view. A sign's elevation data can be determined using many different methods, including using elevation data acquired by the Shuttle Radar Topography Mission (SRTM), the Aster Global Digital Elevation Model (Aster GDEM), or the GTOPO30 Global Digital Elevation Model (DEM), and so forth” in ¶ 0059.); and
an exposure determination module that determines an exposure probability that is based on a comparison of the received device location information to the received object placement information (See “The user's impressions or exposure to signage content can then be determined by server 124 to correlate information obtained from the eye-tracking module 100 and from sign location/content database 128. The sign location/content can track what content is on what sign during a specific time period. The eye-tracking module 103 can be co-located with many different location, direction and velocity sensors. Information provided by these sensors can be correlated with a signage location detection system or data in the database 128 to correlate and provide positive information about the signage information viewed by the user” in ¶ 0031 and “determining a probability that the user views the at least one sign utilizing parameters comprising the location of the sign(s), the location of the user, the field of view of the user; and/or the direction of the field of view of the user” in ¶ 0074.),
wherein determining the exposure probability comprises: determining a dwell duration based upon the viewshed characteristics of the physical object at least upon one or more of GPS data, accelerometer data, or gyroscope data collected from the mobile device; and when the dwell duration exceeds a threshold time period, increasing the exposure probability (See “determining when the locus of points defining the sign(s) and the locus of points defined by the field of view intersect, determining a time frame of the intersection of the locus of points, and determining the content present on the sign(s) during the time frame” in ¶ 0077, “the location of the field of view can be detected utilizing one or more of the following: a Global Position Satellite receiver” in ¶ 0080, and “determining a probability, by executing a program instruction in a wearable data-processing apparatus, that the user views the at least one sign utilizing parameters comprising: a dwell time; the location of the at least one sign; the location of the user; the field of view of the user; and the direction of the field of view of the user” in claim 3.).
Regarding Claim 17, Carlson further teaches that the exposure determination module determines the exposure probability as a function of the received device location information and the received object placement information, as L_exposure = F(EXPOSURE | Pu, Bi); where L_exposure is the exposure probability, Pu is a set of location trace characteristics for the mobile device, and Bi is a set of placement characteristics of the content presented by the physical object (See “The user's impressions or exposure to signage content can then be determined by server 124 to correlate information obtained from the eye-tracking module 100 and from sign location/content database 128. The sign location/content can track what content is on what sign during a specific time period. The eye-tracking module 103 can be co-located with many different location, direction and velocity sensors. Information provided by these sensors can be correlated with a signage location detection system or data in the database 128 to correlate and provide positive information about the signage information viewed by the user” in ¶ 0031 and “determining a probability that the user views the at least one sign utilizing parameters comprising the location of the sign(s), the location of the user, the field of view of the user; and/or the direction of the field of view of the user” in ¶ 0074.).
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102 of this title, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-3, 5, 8-10, 13, 14, 19, and 23-25 are rejected under 35 U.S.C. 103 as being unpatentable over Carlson in view of U.S. 2008/0140479 (“Mello”).
Regarding Claim 1, Carlson teaches a system, comprising: at least one server computer; at least one memory storing instructions for execution by the at least one server computer (See “Memory can 114 also store operating instructions for execution by the processor 112. Note that the processor 112 can be implemented as, for example, a CPU (Central Processing Unit) or the portion of a data-processing system/apparatus (e.g., a computer, server, etc) that carries out the instructions of a computer program as the primary element for carrying out the data-processing functions of such a computer” in ¶ 0032.), wherein the instructions are configured to cause the at least one server computer to perform operations comprising:
receive device location information associated with one or more mobile devices positioned at certain geographic locations, wherein a mobile device provides, directly or indirectly, a series of location information to the server computer via a network (See “the data processing apparatus or system 100 can be configured as a device such as a Smart Phone or other computing device. System 100 as depicted in FIG. 1 can include, in some embodiments, a wearable brain activity-sensing device or module 102 and an eye tracking device or module 103 that can monitor the relative direction of an eyeball of an individual with respect to a relative direction of signage that is in the field of view of the eyeball” in ¶ 0023, “System 100 can additionally include processor 112 for processing and correlating the user, content, location, surveillance, direction, velocity and field of view data collected by the system sensors such as from the surveillance module 104, eye tracking module 103, brain activity-sensing module 102 and the location detection system 116. The location detection system 116 can include at least one sensor that detects the location and velocity of the system 100 or the user, relative to the earth” in ¶ 0027, “Such systems can detect location and velocity with a cellular communications system using tower directions and a process know[n] as triangulation in a network location system, such as a wireless local area network location system, a GPS system, or a beacon type system, for example” in ¶ 0028, and “The location detection module 116 can continue collecting data regarding the individual's location as that individual moves in relation to signs and specific media content. The location information can be collected and stored in the individual's mobile communications device, which may then be transmitted to a server 124 operated by a service provider at a remote location for processing” in ¶ 0034. Coordinates are an inherent output of a GPS system.);
access object placement information associated with visually-perceptible content presented by physical objects at the certain geographic locations, wherein the object placement information for each physical object includes viewshed characteristics for the physical object (See “A database can store the locus of points that define the plane of the sign as it relates to coordinates, or a street addressed on the earth” in ¶ 0058 and “Because the signage locations can be stored in a sign location database as a locus of points, in addition to street addresses of the signs, the location detection system 116 of the system 100 can locate and transmit the signage location using data contained in the aforementioned sign location database. A remote computer server can then compute the locus of points that define the sign and the locus of coordinates that define the user field of view. The server can then determine if the sign is within the user's field of view. A sign's elevation data can be determined using many different methods, including using elevation data acquired by the Shuttle Radar Topography Mission (SRTM), the Aster Global Digital Elevation Model (Aster GDEM), or the GTOPO30 Global Digital Elevation Model (DEM), and so forth” in ¶ 0059. A viewshed characteristic is interpreted to mean any characteristic having an impact on a viewshed (e.g., elevation).); and
compute exposure probabilities based on a comparison of the received device location information to the accessed object placement information, wherein a computed exposure probability represents a probability that a user of one of the one or more mobile devices visually perceived content presented by one of the physical objects (See “The user's impressions or exposure to signage content can then be determined by server 124 to correlate information obtained from the eye-tracking module 100 and from sign location/content database 128. The sign location/content can track what content is on what sign during a specific time period. The eye-tracking module 103 can be co-located with many different location, direction and velocity sensors. Information provided by these sensors can be correlated with a signage location detection system or data in the database 128 to correlate and provide positive information about the signage information viewed by the user” in ¶ 0031 and “determining a probability that the user views the at least one sign utilizing parameters comprising the location of the sign(s), the location of the user, the field of view of the user; and/or the direction of the field of view of the user” in ¶ 0074.),
wherein determining the exposure probability comprises: determining a dwell duration based upon the viewshed characteristics of the physical object at least upon one or more of GPS data, accelerometer data, or gyroscope data collected from the mobile device; and when the dwell duration exceeds a threshold time period, increasing the exposure probability; wherein the exposure probability is used to determine a conversion event associated with the mobile device (See “determining when the locus of points defining the sign(s) and the locus of points defined by the field of view intersect, determining a time frame of the intersection of the locus of points, and determining the content present on the sign(s) during the time frame” in ¶ 0077, “the location of the field of view can be detected utilizing one or more of the following: a Global Position Satellite receiver” in ¶ 0080, and “determining a probability, by executing a program instruction in a wearable data-processing apparatus, that the user views the at least one sign utilizing parameters comprising: a dwell time; the location of the at least one sign; the location of the user; the field of view of the user; and the direction of the field of view of the user” in claim 3.).
Carlson does not expressly teach providing the exposure probability.
However, Mello teaches providing the exposure probability (See “example, for each valid location identifier, the data analyzer 208 can determine if that location is sufficiently proximate to an advertisement at a fixed or stationary location such that the monitored person 32 would have been exposed to the advertisement so that the monitored person 32 could be influenced by the advertisement to perform some action (e.g., request information, purchase a product/service, visit an advertised retail establishment, etc.) related to the advertisement. An example implementation of the advertisement exposure analysis based on location information is described below in connection with the example process of FIG. 7. In the illustrated example, the data file interface 206 stores information indicative of advertisements to which the monitored person 32 was exposed in an advertisement exposure data structure 624 (e.g., a file, a database, a table, etc.)” in ¶ 0090 and Fig. 2 showing a transmission/providing step must be performed between analyzer 208 and file interface 206.).
It would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Carlson and Mello to provide the exposure probability to an attribution system. The motivation, as shown in the Mello flowchart of Fig. 6, is to perform a next step of correlating ad exposure and retail visits with query information and ultimately crediting query results as having influenced the user.
Regarding Claim 2, Carlson further teaches that the object placement information for each physical object includes viewshed characteristics for content on a facing surface of the physical object (See “This locus of points can be compared to and coordinated with the latitude and longitude information of the corners of the sign (defining a plane) in two dimensional space to determine if the user is being, or has been, exposed to the content on the sign” in ¶ 0029. The conclusion that the user has been exposed to the content indicates that the reference is describing a facing surface rather than a back/unoccupied surface.), and wherein the received device location information includes estimated heading information for the one or more mobile devices (See “Part of the system 300 can be collocated with earphones 302 and comprised of a number of eye tracking sensors, location sensors, direction sensors, and brain activity sensors” in ¶ 0047.).
Regarding Claim 3, Carlson teaches a computer-implemented method of providing content exposure information to an attribution system, the method comprising:
receiving device location information associated with a target mobile device positioned at a certain geographic location (See “the data processing apparatus or system 100 can be configured as a device such as a Smart Phone or other computing device. System 100 as depicted in FIG. 1 can include, in some embodiments, a wearable brain activity-sensing device or module 102 and an eye tracking device or module 103 that can monitor the relative direction of an eyeball of an individual with respect to a relative direction of signage that is in the field of view of the eyeball” in ¶ 0023, “System 100 can additionally include processor 112 for processing and correlating the user, content, location, surveillance, direction, velocity and field of view data collected by the system sensors such as from the surveillance module 104, eye tracking module 103, brain activity-sensing module 102 and the location detection system 116. The location detection system 116 can include at least one sensor that detects the location and velocity of the system 100 or the user, relative to the earth” in ¶ 0027, “Such systems can detect location and velocity with a cellular communications system using tower directions and a process know[n] as triangulation in a network location system, such as a wireless local area network location system, a GPS system, or a beacon type system, for example” in ¶ 0028, and “The location detection module 116 can continue collecting data regarding the individual's location as that individual moves in relation to signs and specific media content. The location information can be collected and stored in the individual's mobile communications device, which may then be transmitted to a server 124 operated by a service provider at a remote location for processing” in ¶ 0034.);
receiving object placement information associated with content presented by a physical object at the certain geographic location (See “A database can store the locus of points that define the plane of the sign as it relates to coordinates, or a street addressed on the earth” in ¶ 0058 and “Because the signage locations can be stored in a sign location database as a locus of points, in addition to street addresses of the signs, the location detection system 116 of the system 100 can locate and transmit the signage location using data contained in the aforementioned sign location database. A remote computer server can then compute the locus of points that define the sign and the locus of coordinates that define the user field of view. The server can then determine if the sign is within the user's field of view. A sign's elevation data can be determined using many different methods, including using elevation data acquired by the Shuttle Radar Topography Mission (SRTM), the Aster Global Digital Elevation Model (Aster GDEM), or the GTOPO30 Global Digital Elevation Model (DEM), and so forth” in ¶ 0059.);
determining an exposure probability that is based on a comparison of the received device location information to the received object placement information (See “The user's impressions or exposure to signage content can then be determined by server 124 to correlate information obtained from the eye-tracking module 100 and from sign location/content database 128. The sign location/content can track what content is on what sign during a specific time period. The eye-tracking module 103 can be co-located with many different location, direction and velocity sensors. Information provided by these sensors can be correlated with a signage location detection system or data in the database 128 to correlate and provide positive information about the signage information viewed by the user” in ¶ 0031 and “determining a probability that the user views the at least one sign utilizing parameters comprising the location of the sign(s), the location of the user, the field of view of the user; and/or the direction of the field of view of the user” in ¶ 0074.),
wherein determining the exposure probability comprises: determining a dwell duration based upon the viewshed characteristics of the physical object at least upon one or more of GPS data, accelerometer data, or gyroscope data collected from the mobile device; and when the dwell duration exceeds a threshold time period, increasing the exposure probability (See “determining when the locus of points defining the sign(s) and the locus of points defined by the field of view intersect, determining a time frame of the intersection of the locus of points, and determining the content present on the sign(s) during the time frame” in ¶ 0077, “the location of the field of view can be detected utilizing one or more of the following: a Global Position Satellite receiver” in ¶ 0080, and “determining a probability, by executing a program instruction in a wearable data-processing apparatus, that the user views the at least one sign utilizing parameters comprising: a dwell time; the location of the at least one sign; the location of the user; the field of view of the user; and the direction of the field of view of the user” in claim 3.).
Carlson does not expressly teach providing the determined exposure probability to the attribution system.
However, Mello teaches providing the determined exposure probability to the attribution system (See “example, for each valid location identifier, the data analyzer 208 can determine if that location is sufficiently proximate to an advertisement at a fixed or stationary location such that the monitored person 32 would have been exposed to the advertisement so that the monitored person 32 could be influenced by the advertisement to perform some action (e.g., request information, purchase a product/service, visit an advertised retail establishment, etc.) related to the advertisement. An example implementation of the advertisement exposure analysis based on location information is described below in connection with the example process of FIG. 7. In the illustrated example, the data file interface 206 stores information indicative of advertisements to which the monitored person 32 was exposed in an advertisement exposure data structure 624 (e.g., a file, a database, a table, etc.)” in ¶ 0090 and Fig. 2 showing a transmission/providing step must be performed between analyzer 208 and file interface 206.).
It would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Carlson and Mello to provide the exposure probability to an attribution system. The motivation, as shown in the Mello flowchart of Fig. 6, is to perform a next step of correlating ad exposure and retail visits with query information and ultimately crediting query results as having influenced the user.
Regarding Claim 5, Carlson further teaches determining an exposure probability that is based on a comparison of the received device location information to the received object placement information includes determining an exposure probability as a function of the received device location information and the received object placement information, as L_exposure= F(EXPOSURE |Pu, Bi); where L_exposure is the exposure probability, EXPOSURE is a state of exposure, Pu is a set of location trace characteristics for the mobile device, and Bi is a set of placement characteristics of the content presented by the physical object (See “The user's impressions or exposure to signage content can then be determined by server 124 to correlate information obtained from the eye-tracking module 100 and from sign location/content database 128. The sign location/content can track what content is on what sign during a specific time period. The eye-tracking module 103 can be co-located with many different location, direction and velocity sensors. Information provided by these sensors can be correlated with a signage location detection system or data in the database 128 to correlate and provide positive information about the signage information viewed by the user” in ¶ 0031 and “determining a probability that the user views the at least one sign utilizing parameters comprising the location of the sign(s), the location of the user, the field of view of the user; and/or the direction of the field of view of the user” in ¶ 0074.).
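For illustration only, the recited function L_exposure = F(EXPOSURE | Pu, Bi), together with the dwell-duration adjustment recited in the independent claims, can be sketched as follows. The threshold value, probability increment, and all function and variable names are hypothetical and are not taken from the claims or the cited references:

```python
# Illustrative sketch only: L_exposure = F(EXPOSURE | Pu, Bi), where Pu is a
# location trace for the mobile device and Bi is the physical object's
# placement (viewshed) information. Threshold and increment are hypothetical.

DWELL_THRESHOLD_S = 5.0   # hypothetical threshold time period, in seconds
DWELL_INCREMENT = 0.25    # hypothetical increase applied to the probability

def dwell_duration(trace, in_viewshed):
    """Total time the device's location trace stays inside the viewshed.

    trace: list of (timestamp_s, lat, lon) samples, e.g., derived from GPS data.
    in_viewshed: callable (lat, lon) -> bool for the object's viewshed.
    """
    total = 0.0
    for (t0, lat0, lon0), (t1, lat1, lon1) in zip(trace, trace[1:]):
        # Count an interval only when both endpoints lie inside the viewshed.
        if in_viewshed(lat0, lon0) and in_viewshed(lat1, lon1):
            total += t1 - t0
    return total

def exposure_probability(trace, in_viewshed, base_probability):
    """F(EXPOSURE | Pu, Bi): raise the probability when dwell exceeds the threshold."""
    probability = base_probability
    if dwell_duration(trace, in_viewshed) > DWELL_THRESHOLD_S:
        probability = min(1.0, probability + DWELL_INCREMENT)
    return probability
```

Under this sketch, a location trace that remains within the object's viewshed longer than the threshold time period receives an increased exposure probability, capped at 1.0, consistent with the claimed "when the dwell duration exceeds a threshold time period, increasing the exposure probability."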
Regarding Claim 8, Carlson further teaches that determining an exposure probability that is based on a comparison of the received device location information to the received object placement information further includes determining an exposure probability between 0 percent and 100 percent, inclusive (See “determining a probability that the user views the at least one sign utilizing parameters comprising the location of the sign(s), the location of the user, the field of view of the user; and/or the direction of the field of view of the user” in ¶ 0074.).
Regarding Claim 9, Carlson further teaches determining an exposure probability that is based on a comparison of the received device location information to the received object placement information further comprises determining an exposure probability as a binary value indicative of a confirmed exposure (See “determining a probability that the user views the at least one sign utilizing parameters comprising the location of the sign(s), the location of the user, the field of view of the user; and/or the direction of the field of view of the user” in ¶ 0074.).
Regarding Claim 10, Carlson further teaches receiving device location information associated with a mobile device positioned at a certain geographic location comprises information from one or more GPS sensors of the target mobile device (See “Such systems can detect location and velocity with a cellular communications system using tower directions and a process know[n] as triangulation in a network location system, such as a wireless local area network location system, a GPS system, or a beacon type system, for example” in ¶ 0028.).
Regarding Claim 13, Carlson further teaches the content presented by the physical object at the certain geographic location is an advertisement presented by a billboard (See “Billboards and poster content is static if printed on paper or other material. Recently, however, the advertising industry began using electronic billboards and digital displays, generally. These digital billboards appear primarily on major highways, expressways or principal arterials, and command high-density consumer exposure, mostly to vehicular traffic. Digital billboards provide greater visibility to viewers than static, content-based display because of the digital billboard's commanding size and the advertiser's ability to perform customizable updates on the digital billboard. Not only can many advertisers share a single digital billboard each month, but an advertiser can change their content in near real-time. These new types of displays are also capable of imparting different advertisements on a single digital display, with each display changing content advertisements over a period of a few minutes” in ¶ 0005.).
Regarding Claim 14, Carlson further teaches the content presented by the physical object at the certain geographic location is an advertisement presented by a vehicle traveling within the certain geographic location (See “Smaller displays of electronic and printed matter are also present … in taxis” in ¶ 0004.).
Regarding Claim 19, this claim is sufficiently similar to claim 3 and is rejected similarly.
Regarding Claim 23, Carlson further teaches the content presented by the physical object is a stationary advertisement (See “Billboards and poster content is static if printed on paper or other material. Recently, however, the advertising industry began using electronic billboards and digital displays, generally. These digital billboards appear primarily on major highways, expressways or principal arterials, and command high-density consumer exposure, mostly to vehicular traffic. Digital billboards provide greater visibility to viewers than static, content-based display because of the digital billboard's commanding size and the advertiser's ability to perform customizable updates on the digital billboard. Not only can many advertisers share a single digital billboard each month, but an advertiser can change their content in near real-time. These new types of displays are also capable of imparting different advertisements on a single digital display, with each display changing content advertisements over a period of a few minutes” in ¶ 0005.).
Regarding Claim 24, Carlson further teaches the content presented by the physical object is an advertisement presented by a vehicle traveling within the certain geographic location (See “Smaller displays of electronic and printed matter are also present … in taxis” in ¶ 0004.).
Regarding Claim 25, Carlson further teaches the content presented by the physical object is one of a stationary advertisement or an advertisement presented by a vehicle traveling within the certain geographic location (See “Smaller displays of electronic and printed matter are also present … in taxis” in ¶ 0004 and “Billboards and poster content is static if printed on paper or other material. Recently, however, the advertising industry began using electronic billboards and digital displays, generally. These digital billboards appear primarily on major highways, expressways or principal arterials, and command high-density consumer exposure, mostly to vehicular traffic. Digital billboards provide greater visibility to viewers than static, content-based display because of the digital billboard's commanding size and the advertiser's ability to perform customizable updates on the digital billboard. Not only can many advertisers share a single digital billboard each month, but an advertiser can change their content in near real-time. These new types of displays are also capable of imparting different advertisements on a single digital display, with each display changing content advertisements over a period of a few minutes” in ¶ 0005.).
Claim 7 is rejected under 35 U.S.C. 103 as being unpatentable over Carlson in view of Mello, and further in view of U.S. Patent Application Publication No. 2009/0265215 (“Lindstrom”).
Regarding Claim 7, Carlson teaches receiving survey information associated with the content presented by the physical object, wherein the survey information includes information from users of other mobile devices that viewed the content presented by the physical object (See “In some embodiments, the system 200 can be passive and acquire user related data in an entirely automated process. In other embodiments, the system 200 can be interactive where a user can be queried to see if the user viewed specific content of the advertisement. The interactive system can further query the user to determine if the user wants additional information, coupons, or some sort of an upgrade, deal or promotion. The advertising facilitator 202, via the rewards program, may also provide an incentive for the user to provide feedback as to whether the user viewed specific content and the user's reaction to such content, such as a "thumbs up" and a "thumbs down" selection, or a question to see if the user knows exactly what is being advertised. A user could also request additional information that is related to the advertisement's content through the system 200” in ¶ 0046.).
Carlson does not expressly teach wherein determining the exposure probability further comprises determining the exposure probability based on a comparison of the received device location information to the received object placement information and the received survey information.
However, Lindstrom teaches wherein determining the exposure probability further comprises determining the exposure probability based on a comparison of the received device location information to the received object placement information and the received survey information (See “In some instances, a media research company can recruit panel members that are surveyed or tracked to determine advertisement/informational media to which each panel member was exposed. For example, if a panel member indicates that he or she visited a particular area, it may be concluded that the panel member was exposed to an advertisement or signage displayed in that area. The survey results or location tracking information can then be processed to determine the number of exposure instances for each advertisement or signage that is part of a media research study. The panel member exposures can then be used to infer the number of exposures to the generic public for each advertisement or signage. These exposure numbers can be used by product manufacturers, service providers, and advertisers to better market their products” in ¶ 0003 and “To determine the duration of exposure to each of the media 110a-h, a metering entity may provide a predetermined typical duration of stay or dwell time for a typical person that visits the fitness environment 100. In some example implementations, different predetermined typical durations or dwell times may be provided for each of the different areas 102, 104, 106, and 108 of the fitness environment. For example, a predetermined typical dwell time of a person in the foyer 102 may be thirty seconds, while a predetermined typical dwell time of a person in the cardio area 104 may be thirty minutes. 
The predetermined typical durations of stay or dwell times can be determined based on responses to survey questionnaires via which people are asked to provide the amounts of times they spent in particular ones of the areas 102, 104, 106, and 108 during one or more typical exercise sessions. This technique for determining predetermined typical durations of stay or dwell times may be used in connection with any other environments described below in connection with FIGS. 2-4 and or any other environment for which the example methods and apparatus described herein are used to monitor audience exposure to media. In the illustrated example of FIG. 1, the predetermined typical dwell times or durations of stay can be used in connection with transaction data collected using the card swipe station 112, the people counters 116a-d, and/or survey questionnaires to determine durations of exposure to the media 110a-h” in ¶ 0029.).
It would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Carlson and Lindstrom to utilize survey data from others to calculate exposure probability. The motivation, as shown in Lindstrom, is to “infer the number of exposures to the generic public for each advertisement or signage.”
Claims 11 and 12 are rejected under 35 U.S.C. 103 as being unpatentable over Carlson in view of Mello, and further in view of U.S. Patent Application Publication No. 2013/0339140 (“Pokorny”).
Regarding Claim 11, Carlson teaches receiving object placement information associated with content presented by a physical object at the certain geographic location comprises receiving information identifying lat-long coordinates of the physical object (See “defining a sign utilizing a locus of points in three-dimensional space, wherein the locus of points defines a plane, the plane having latitudinal and longitudinal coordinates” in claim 5.).
Carlson does not expressly teach receiving information identifying a facing angle of the content presented by the physical object.
However, Pokorny teaches receiving information identifying a facing angle of the content presented by the physical object (See “Similarly, the orientation of the asset 102 informs an analysis of which lines of sight are more valuable based on angles of view for a consumer” in ¶ 0024, “From a planning perspective, different location angles may be tested to determine the effect on visibility of the asset 102 to potential consumers. The exposure value assessment system enables the identification of the azimuth of the asset 102 by permits quick angle to travel segment 104 analysis” in ¶ 0029, and “For example, the transport information and the viewshed 108 may be filtered such that the viewshed 108 is restricted to a certain or optimum viewing angle of the asset 102 by a consumer” in ¶ 0035.).
It would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Carlson and Pokorny to receive the facing angle of the content. The motivation, as shown in Pokorny, is to determine the effect on visibility of the asset to potential consumers.
Regarding Claim 12, Carlson teaches receiving object placement information associated with content presented by a physical object at the certain geographic location comprises receiving information identifying lat-long coordinates of the physical object (See “defining a sign utilizing a locus of points in three-dimensional space, wherein the locus of points defines a plane, the plane having latitudinal and longitudinal coordinates” in claim 5.).
Carlson does not expressly teach receiving information identifying an estimated viewshed for the content presented by the physical object.
However, Pokorny teaches receiving information identifying an estimated viewshed for the content presented by the physical object (See “A viewshed is generated based on the location information and the digital elevation model data. Exposure value information is generated by intersecting the viewshed with transport information” in the abstract and “From the digital elevation model and the location information for the OOH asset, a viewshed is created. Stated differently, using location information for the OOH asset, the digital elevation model may be processed to determine where the location point connects to all other points in the digital elevation model, as limited by a radius. A line of sight is the viewshed along "one-line," while the viewshed represents the combination of all lines of sight for the OOH asset” in ¶ 0017.).
It would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Carlson and Pokorny to receive a viewshed for the content. The motivation, as shown in Pokorny, is to utilize viewshed information to determine exposure value information.
Double Patenting
The nonstatutory double patenting rejection is based on a judicially created doctrine grounded in public policy (a policy reflected in the statute) so as to prevent the unjustified or improper timewise extension of the “right to exclude” granted by a patent and to prevent possible harassment by multiple assignees. A nonstatutory double patenting rejection is appropriate where the conflicting claims are not identical, but at least one examined application claim is not patentably distinct from the reference claim(s) because the examined application claim is either anticipated by, or would have been obvious over, the reference claim(s). See, e.g., In re Berg, 140 F.3d 1428, 46 USPQ2d 1226 (Fed. Cir. 1998); In re Goodman, 11 F.3d 1046, 29 USPQ2d 2010 (Fed. Cir. 1993); In re Longi, 759 F.2d 887, 225 USPQ 645 (Fed. Cir. 1985); In re Van Ornum, 686 F.2d 937, 214 USPQ 761 (CCPA 1982); In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970); In re Thorington, 418 F.2d 528, 163 USPQ 644 (CCPA 1969).
A timely filed terminal disclaimer in compliance with 37 CFR 1.321(c) or 1.321(d) may be used to overcome an actual or provisional rejection based on nonstatutory double patenting provided the reference application or patent either is shown to be commonly owned with the examined application, or claims an invention made as a result of activities undertaken within the scope of a joint research agreement. See MPEP § 717.02 for applications subject to examination under the first inventor to file provisions of the AIA as explained in MPEP § 2159. See MPEP § 2146 et seq. for applications not subject to examination under the first inventor to file provisions of the AIA. A terminal disclaimer must be signed in compliance with 37 CFR 1.321(b).
The USPTO Internet website contains terminal disclaimer forms which may be used. Please visit www.uspto.gov/patent/patents-forms. The filing date of the application in which the form is filed determines what form (e.g., PTO/SB/25, PTO/SB/26, PTO/AIA/25, or PTO/AIA/26) should be used. A web-based eTerminal Disclaimer may be filled out completely online using web-screens. An eTerminal Disclaimer that meets all requirements is auto-processed and approved immediately upon submission. For more information about eTerminal Disclaimers, refer to www.uspto.gov/patents/process/file/efs/guidance/eTD-info-I.jsp.
Claims 1 and 2 are rejected on the ground of nonstatutory double patenting as being unpatentable over claim 1 of U.S. Patent No. 10,817,898. Although the claims at issue are not identical, they are not patentably distinct from each other because the reference claim anticipates the claims under examination.
Claims 3, 4, and 15 are rejected on the ground of nonstatutory double patenting as being unpatentable over claim 2 of U.S. Patent No. 10,817,898. Although the claims at issue are not identical, they are not patentably distinct from each other because the reference claim anticipates the claims under examination.
Claims 5 and 17 are rejected on the ground of nonstatutory double patenting as being unpatentable over claim 3 of U.S. Patent No. 10,817,898. Although the claims at issue are not identical, they are not patentably distinct from each other because the reference claim anticipates the claims under examination.
Claim 7 is rejected on the ground of nonstatutory double patenting as being unpatentable over claim 5 of U.S. Patent No. 10,817,898. Although the claims at issue are not identical, they are not patentably distinct from each other because the reference claim anticipates the claim under examination.
Claim 8 is rejected on the ground of nonstatutory double patenting as being unpatentable over claim 6 of U.S. Patent No. 10,817,898. Although the claims at issue are not identical, they are not patentably distinct from each other because the reference claim anticipates the claim under examination.
Claims 9 and 16 are rejected on the ground of nonstatutory double patenting as being unpatentable over claim 7 of U.S. Patent No. 10,817,898. Although the claims at issue are not identical, they are not patentably distinct from each other because the reference claim anticipates the claims under examination.
Claim 10 is rejected on the ground of nonstatutory do