DETAILED ACTION
This Office Action is in response to the Request for Continued Examination and Applicant’s Amendment and Remarks filed on 12/23/2025.
Claim 21 has been added.
Claims 1-21 are pending for examination.
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Continued Examination Under 37 CFR 1.114
A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 12/23/2025 has been entered.
Information Disclosure Statement
The information disclosure statement (IDS) submitted on 08/25/2025 is in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement is being considered by the examiner.
Response to Arguments
Applicant’s arguments, see pages 8-12, filed 11/13/2025, with respect to the rejections of claims 1-20 under 35 U.S.C. 103 have been fully considered and are persuasive. Therefore, the rejections have been withdrawn. However, upon further consideration, a new ground of rejection is made over Maki et al. (JP-2022035771; hereafter Maki) in view of Shin et al. (KR 20180076583 A; hereafter Shin) and further in view of Alon et al. (US 8549028 B1; hereafter Alon).
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 1-3, 7-10, and 14-17 are rejected under 35 U.S.C. 103 as being unpatentable over Maki in view of Shin and further in view of Alon.
Maki and Shin were cited in the previous Office action.
Regarding claim 1, Maki discloses:
A method, implemented by programmed one or more processors in a server, for collecting evidence to investigate an event, the method comprising:
obtaining a time, a location, to be investigated ([0022]; “When a request for information is received from the police or other authorities, the service provider PC 100 or the customer PC 30 can search for video data that meets the conditions (retention conditions) within the recorded data of the vehicle-mounted device 10 installed in each vehicle 90, using the date (time information), location information (latitude and longitude information), and radius information as conditions.”);
transmitting a signal including the time, the location, for requesting sensor data corresponding to the event from a plurality of vehicles; ([0040]; “The communication unit 32 transmits, via the server 80, to the vehicle-mounted device 10, a data storage command (specification information) for specifying video data corresponding to the associated information that meets the storage conditions.”);
receiving the sensor data corresponding to the event from at least one of the plurality of vehicles based on the time, the location; and
([0061]; “Furthermore, according to this embodiment, the service provider PC 100 or the customer PC 30 searches for accompanying information that meets the retention conditions in the vehicle image list (stored data list 300 transmitted from multiple vehicle-mounted devices 10) stored in the database 85.”)
([0062]; “The data may be transmitted from the vehicle-mounted device 10 to the service provider PC 100, and then transmitted from the service provider PC 100 to a communication terminal of an external organization.”)
providing the received sensor data for investigating the event. ([0062]; “The data may be transmitted from the vehicle-mounted device 10 to the service provider PC 100, and then transmitted from the service provider PC 100 to a communication terminal of an external organization.”)
Although Maki discloses requesting event data using parameters entered by the user as discussed above, Maki does not explicitly disclose using an event type as one of the parameters.
However, Shin, within the same field of endeavor, teaches:
Obtaining a type of event ([0022]; “In a preferred embodiment, the application accesses the accident risk image database through the cloud server and outputs an accident search screen to the terminal for searching information, and the accident search screen includes an accident occurrence date and time and accident type input window for selecting and searching accident occurrence date and time and accident type information”)
Transmitting a signal including the type of the event ([0020]; “In a preferred embodiment, the embedded board sends user information of the black box device, accident type information which is the type of the accident risk situation, accident occurrence date information, and accident occurrence location information to the cloud server along with accident risk image information.”)
Receiving the sensor data corresponding to the event from one of the plurality of vehicles based on the type of event of the signal ([0022]; “an accident occurrence list output window for outputting a list of accident occurrences searched through the accident occurrence date and time and accident type input window.”)
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Maki with Shin. This modification would have been obvious because both Maki and Shin cover subject matter within the same field of endeavor (vehicle event detection and analysis), and it would have been beneficial for the user to specify an event type when requesting data in order to filter out unrelated data and receive only data relevant to the event being investigated.
Although Maki in combination with Shin teaches requesting event data using an event type as a parameter entered by the user as discussed above, Maki in combination with Shin does not explicitly teach the use of both an event category and a sub-category.
However, Alon within the same field of endeavor does teach:
Wherein the type of event includes both a category and a sub-category (col. 16, lines 47-49; “The sub-category section 544 provides information related to a sub-category within the selected main category that is selected for the currently displayed incident.”)
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Maki with Shin and Alon. This modification would have been obvious because Maki, Shin, and Alon all cover subject matter within the same field of endeavor (vehicle event detection and analysis), and it would have been beneficial for the user to specify an event type, including both a category and a sub-category, when requesting data in order to filter out unrelated data and receive only data relevant to the event being investigated.
Regarding claim 2, Maki in combination with Shin and Alon teaches all of the limitations of claim 1. Additionally, Maki discloses the time comprises a time range corresponding to the event and/or the location comprises a location range corresponding to the event. ([0055]; “Based on the information request, the service provider PC 100 inputs the latitude, longitude, time, and radius (xx m) as search keys (step S3), and executes a search (step S4). The service provider PC 100 searches the recorded data in DB 85 (vehicle image list, stored data list 300 collected from each vehicle-mounted device 10) for data equivalent to the value entered in step S3 or this value with an error added (step S5). In step S5, the service providing PC 100 searches for data in which the recording start time 301 and the GPS 310 information (accompanying information) match the inputted value (+ error).”)
Regarding claim 3, Maki in combination with Shin and Alon teaches all of the limitations of claim 1. Additionally, Maki discloses the sensor data comprises at least one of image data, LiDAR sensor data, accelerometer data, audio data, and infrared image data captured by onboard sensors of the vehicle. ([0026]-[0029]; “the camera I/F 16 has a function of taking in the image signals output by the cameras 23A and 23B, converting them into predetermined digital image data suitable for computer processing, and acquiring the image data.”)
Regarding claim 7, Maki in combination with Shin and Alon teaches all of the limitations of claim 1. Additionally, Maki discloses performing, on the received sensor data, one or more of: pre-processing, converting, collating, and filtering. ([0028]; “…the camera I/F 16 has a function of taking in the image signals output by the cameras 23A and 23B, converting them into predetermined digital image data suitable for computer processing, and acquiring the image data.”)
Regarding claim 8, Maki discloses:
A system for collecting evidence to investigate an event, the system comprising:
a memory storing instructions ([0010]; “a recording unit that records the vehicle operation information including the video data and the time information and the location information in a memory unit, and a management unit that deletes the vehicle operation information recorded in the memory unit in accordance with the specified conditions”); and
at least one programmed processor configured to execute the instructions ([0030]; “A control unit 11 that realizes the main functions of the vehicle-mounted device 10 is composed of electronic circuits mainly including a processor of a microcomputer (CPU). This microcomputer executes a program stored in advance in the non-volatile memory 26A or the like, thereby realizing a control function of the vehicle-mounted device 10, which will be described later.”) to:
obtain a time, a location, for the event to be investigated ([0022]; “When a request for information is received from the police or other authorities, the service provider PC 100 or the customer PC 30 can search for video data that meets the conditions (retention conditions) within the recorded data of the vehicle-mounted device 10 installed in each vehicle 90, using the date (time information), location information (latitude and longitude information), and radius information as conditions.”);
transmit a signal including the time, the location, for requesting sensor data corresponding to the event from a plurality of vehicles ([0022]; “When a request for information is received from the police or other authorities, the service provider PC 100 or the customer PC 30 can search for video data that meets the conditions (retention conditions) within the recorded data of the vehicle-mounted device 10 installed in each vehicle 90, using the date (time information), location information (latitude and longitude information), and radius information as conditions.”);
receive the sensor data corresponding to the event from at least one of the plurality of vehicles based on the time, the location; and
([0061]; “Furthermore, according to this embodiment, the service provider PC 100 or the customer PC 30 searches for accompanying information that meets the retention conditions in the vehicle image list (stored data list 300 transmitted from multiple vehicle-mounted devices 10) stored in the database 85.”)
([0062]; “The data may be transmitted from the vehicle-mounted device 10 to the service provider PC 100, and then transmitted from the service provider PC 100 to a communication terminal of an external organization.”)
provide the received sensor data for investigating the event. ([0062]; “The data may be transmitted from the vehicle-mounted device 10 to the service provider PC 100, and then transmitted from the service provider PC 100 to a communication terminal of an external organization.”)
Although Maki discloses requesting event data using parameters entered by the user as discussed above, Maki does not explicitly disclose using an event type as one of the parameters.
However, Shin, within the same field of endeavor, teaches:
Obtaining a type of event ([0022]; “In a preferred embodiment, the application accesses the accident risk image database through the cloud server and outputs an accident search screen to the terminal for searching information, and the accident search screen includes an accident occurrence date and time and accident type input window for selecting and searching accident occurrence date and time and accident type information”)
Transmitting a signal including the type of the event ([0020]; “In a preferred embodiment, the embedded board sends user information of the black box device, accident type information which is the type of the accident risk situation, accident occurrence date information, and accident occurrence location information to the cloud server along with accident risk image information.”)
Receiving the sensor data corresponding to the event from one of the plurality of vehicles based on the type of event of the signal ([0022]; “an accident occurrence list output window for outputting a list of accident occurrences searched through the accident occurrence date and time and accident type input window.”)
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Maki with Shin. This modification would have been obvious because both Maki and Shin cover subject matter within the same field of endeavor (vehicle event detection and analysis), and it would have been beneficial for the user to specify an event type when requesting data in order to filter out unrelated data and receive only data relevant to the event being investigated.
Although Maki in combination with Shin teaches requesting event data using an event type as a parameter entered by the user as discussed above, Maki in combination with Shin does not explicitly teach the use of both an event category and a sub-category.
However, Alon within the same field of endeavor does teach:
Wherein the type of event includes both a category and a sub-category (col. 16, lines 47-49; “The sub-category section 544 provides information related to a sub-category within the selected main category that is selected for the currently displayed incident.”)
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Maki with Shin and Alon. This modification would have been obvious because Maki, Shin, and Alon all cover subject matter within the same field of endeavor (vehicle event detection and analysis), and it would have been beneficial for the user to specify an event type, including both a category and a sub-category, when requesting data in order to filter out unrelated data and receive only data relevant to the event being investigated.
Regarding claim 9, Maki in combination with Shin and Alon teaches all of the limitations of claim 8. Additionally, Maki discloses the time comprises a time range corresponding to the event and/or the location comprises a location range corresponding to the event. ([0055]; “Based on the information request, the service provider PC 100 inputs the latitude, longitude, time, and radius (xx m) as search keys (step S3), and executes a search (step S4). The service provider PC 100 searches the recorded data in DB 85 (vehicle image list, stored data list 300 collected from each vehicle-mounted device 10) for data equivalent to the value entered in step S3 or this value with an error added (step S5). In step S5, the service providing PC 100 searches for data in which the recording start time 301 and the GPS 310 information (accompanying information) match the inputted value (+ error).”)
Regarding claim 10, Maki in combination with Shin and Alon teaches all of the limitations of claim 8. Additionally, Maki discloses the sensor data comprises at least one of image data, LiDAR sensor data, accelerometer data, audio data, and infrared image data captured by onboard sensors of the vehicle. ([0026]-[0029]; “the camera I/F 16 has a function of taking in the image signals output by the cameras 23A and 23B, converting them into predetermined digital image data suitable for computer processing, and acquiring the image data.”)
Regarding claim 14, Maki in combination with Shin and Alon teaches all of the limitations of claim 8. Additionally, Maki discloses at least one programmed processor is further configured to execute the instructions to perform, on the received sensor data, one or more of: pre-processing, converting, collating, and filtering. ([0028]; “…the camera I/F 16 has a function of taking in the image signals output by the cameras 23A and 23B, converting them into predetermined digital image data suitable for computer processing, and acquiring the image data.”)
Regarding claim 15, Maki discloses:
A non-transitory computer-readable recording medium having recorded thereon instructions executable by at least one programmed processor to cause the at least one programmed processor to perform a method for collecting evidence to investigate an event ([0030]; “A control unit 11 that realizes the main functions of the vehicle-mounted device 10 is composed of electronic circuits mainly including a processor of a microcomputer (CPU). This microcomputer executes a program stored in advance in the non-volatile memory 26A or the like, thereby realizing a control function of the vehicle-mounted device 10, which will be described later.”), the method comprising:
obtaining a time, a location, for the event to be investigated; ([0022]; “When a request for information is received from the police or other authorities, the service provider PC 100 or the customer PC 30 can search for video data that meets the conditions (retention conditions) within the recorded data of the vehicle-mounted device 10 installed in each vehicle 90, using the date (time information), location information (latitude and longitude information), and radius information as conditions.”);
transmitting a signal including the time, the location, for requesting sensor data corresponding to the event from a plurality of vehicles; ([0022]; “When a request for information is received from the police or other authorities, the service provider PC 100 or the customer PC 30 can search for video data that meets the conditions (retention conditions) within the recorded data of the vehicle-mounted device 10 installed in each vehicle 90, using the date (time information), location information (latitude and longitude information), and radius information as conditions.”);
receiving the sensor data corresponding to the event from at least one of the plurality of vehicles based on the time, the location; and
([0061]; “Furthermore, according to this embodiment, the service provider PC 100 or the customer PC 30 searches for accompanying information that meets the retention conditions in the vehicle image list (stored data list 300 transmitted from multiple vehicle-mounted devices 10) stored in the database 85.”)
([0062]; “The data may be transmitted from the vehicle-mounted device 10 to the service provider PC 100, and then transmitted from the service provider PC 100 to a communication terminal of an external organization.”)
providing the received sensor data for investigating the event. ([0062]; “The data may be transmitted from the vehicle-mounted device 10 to the service provider PC 100, and then transmitted from the service provider PC 100 to a communication terminal of an external organization.”)
Although Maki discloses requesting event data using parameters entered by the user as discussed above, Maki does not explicitly disclose using an event type as one of the parameters.
However, Shin, within the same field of endeavor, teaches:
Obtaining a type of event ([0022]; “In a preferred embodiment, the application accesses the accident risk image database through the cloud server and outputs an accident search screen to the terminal for searching information, and the accident search screen includes an accident occurrence date and time and accident type input window for selecting and searching accident occurrence date and time and accident type information”)
Transmitting a signal including the type of the event ([0020]; “In a preferred embodiment, the embedded board sends user information of the black box device, accident type information which is the type of the accident risk situation, accident occurrence date information, and accident occurrence location information to the cloud server along with accident risk image information.”)
Receiving the sensor data corresponding to the event from one of the plurality of vehicles based on the type of event of the signal ([0022]; “an accident occurrence list output window for outputting a list of accident occurrences searched through the accident occurrence date and time and accident type input window.”)
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Maki with Shin. This modification would have been obvious because both Maki and Shin cover subject matter within the same field of endeavor (vehicle event detection and analysis), and it would have been beneficial for the user to specify an event type when requesting data in order to filter out unrelated data and receive only data relevant to the event being investigated.
Although Maki in combination with Shin teaches requesting event data using an event type as a parameter entered by the user as discussed above, Maki in combination with Shin does not explicitly teach the use of both an event category and a sub-category.
However, Alon within the same field of endeavor does teach:
Wherein the type of event includes both a category and a sub-category (col. 16, lines 47-49; “The sub-category section 544 provides information related to a sub-category within the selected main category that is selected for the currently displayed incident.”)
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Maki with Shin and Alon. This modification would have been obvious because Maki, Shin, and Alon all cover subject matter within the same field of endeavor (vehicle event detection and analysis), and it would have been beneficial for the user to specify an event type, including both a category and a sub-category, when requesting data in order to filter out unrelated data and receive only data relevant to the event being investigated.
Regarding claim 16, Maki in combination with Shin and Alon teaches all of the limitations of claim 15. Additionally, Maki discloses the time comprises a time range corresponding to the event and/or the location comprises a location range corresponding to the event. ([0055]; “Based on the information request, the service provider PC 100 inputs the latitude, longitude, time, and radius (xx m) as search keys (step S3), and executes a search (step S4). The service provider PC 100 searches the recorded data in DB 85 (vehicle image list, stored data list 300 collected from each vehicle-mounted device 10) for data equivalent to the value entered in step S3 or this value with an error added (step S5). In step S5, the service providing PC 100 searches for data in which the recording start time 301 and the GPS 310 information (accompanying information) match the inputted value (+ error).”)
Regarding claim 17, Maki in combination with Shin and Alon teaches all of the limitations of claim 15. Additionally, Maki discloses the sensor data comprises at least one of image data, LiDAR sensor data, accelerometer data, audio data, and infrared image data captured by onboard sensors of the vehicle. ([0026]-[0029]; “the camera I/F 16 has a function of taking in the image signals output by the cameras 23A and 23B, converting them into predetermined digital image data suitable for computer processing, and acquiring the image data.”)
Claims 4, 11, and 18 are rejected under 35 U.S.C. 103 as being unpatentable over Maki in view of Shin and Alon as applied to claims 1, 8, and 15 above, and further in view of Walsh et al. (CA 3065731 A1; hereafter Walsh).
Walsh was cited in the previous Office action.
Regarding claim 4, Maki in combination with Shin and Alon teaches all of the limitations of claim 1. Additionally, Walsh in the same field of endeavor teaches the receiving the sensor data comprises receiving the sensor data from another server that anonymizes the sensor data. ([0020]; “Secure data transmission protocols and/or encryption may be used in file transfers to protect the integrity of the data, for example, File Transfer Protocol (FTP), Secure File Transfer Protocol (SFTP), and/or Pretty Good Privacy (PGP) encryption.”)
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the combination of Maki, Shin, and Alon with Walsh. This modification would have been obvious because Maki, Shin, Alon, and Walsh all cover subject matter within the same field of endeavor (vehicle event detection and analysis), and it would have been beneficial to add the additional step of anonymizing the vehicle data to help protect it from data breaches and to ensure that the vehicle data is viewed only by intended individuals. SFTP and PGP are also encryption methods well known to those of ordinary skill in the art.
Regarding claim 11, Maki in combination with Shin and Alon teaches all of the limitations of claim 8. Additionally, Walsh in the same field of endeavor teaches the receiving the sensor data comprises receiving the sensor data from another server that anonymizes the sensor data. ([0020]; “Secure data transmission protocols and/or encryption may be used in file transfers to protect the integrity of the data, for example, File Transfer Protocol (FTP), Secure File Transfer Protocol (SFTP), and/or Pretty Good Privacy (PGP) encryption.”)
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the combination of Maki, Shin, and Alon with Walsh. This modification would have been obvious because Maki, Shin, Alon, and Walsh all cover subject matter within the same field of endeavor (vehicle event detection and analysis), and it would have been beneficial to add the additional step of anonymizing the vehicle data to help protect it from data breaches and to ensure that the vehicle data is viewed only by intended individuals. SFTP and PGP are also encryption methods well known to those of ordinary skill in the art.
Regarding claim 18, Maki in combination with Shin and Alon teaches all of the limitations of claim 15. Additionally, Walsh in the same field of endeavor teaches the receiving the sensor data comprises receiving the sensor data from another server that anonymizes the sensor data. ([0020]; “Secure data transmission protocols and/or encryption may be used in file transfers to protect the integrity of the data, for example, File Transfer Protocol (FTP), Secure File Transfer Protocol (SFTP), and/or Pretty Good Privacy (PGP) encryption.”)
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the combination of Maki, Shin, and Alon with Walsh. This modification would have been obvious because Maki, Shin, Alon, and Walsh all cover subject matter within the same field of endeavor (vehicle event detection and analysis), and it would have been beneficial to add the additional step of anonymizing the vehicle data to help protect it from data breaches and to ensure that the vehicle data is viewed only by intended individuals. SFTP and PGP are also encryption methods well known to those of ordinary skill in the art.
Claims 5, 12, and 19 are rejected under 35 U.S.C. 103 as being unpatentable over Maki in view of Shin and Alon as applied to claims 1, 8, and 15 above, and further in view of Moeller et al. (US 2022/0161760; hereafter Moeller).
Moeller was cited in the previous Office action.
Regarding claim 5, Maki in combination with Shin and Alon teaches all of the limitations of claim 1. Additionally, Moeller in the same field of endeavor teaches processing the received sensor data, ([0028]; “sensor comprises an infrared camera and the method further comprises: A) reading, via the at least one computing device, a plurality of infrared frames from the infrared camera; B) identifying, via the at least one computing device, an area with a thermal variance in at least one of the plurality of infrared frames;”)
wherein the received sensor data comprises infrared image data ([0028]; “sensor comprises an infrared camera and the method further comprises: A) reading, via the at least one computing device, a plurality of infrared frames from the infrared camera; B) identifying, via the at least one computing device, an area with a thermal variance in at least one of the plurality of infrared frames;”), and
wherein the processing the received sensor data comprises determining whether the infrared image data includes a shape of an object with a temperature greater than a predetermined threshold. ([0028]; “C) monitor a temperature of the area with the thermal variance, wherein determining that the particular event has occurred further comprises determining that the temperature of the area exceeds a temperature threshold.”)
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the combination of Maki, Shin, and Alon with Moeller. This modification would have been obvious because Maki, Shin, Alon, and Moeller all cover subject matter within the same field of endeavor (vehicle event detection and analysis), and it would have been beneficial to include infrared image data and processing with the vehicle data and processing found in Maki in order to gather more information regarding the event. Adding additional data will provide the investigator with more evidence that may be crucial to the investigation.
Regarding claim 12, Maki in combination with Shin and Alon teaches all of the limitations of claim 8. Additionally, Moeller in the same field of endeavor teaches process the received sensor data, ([0028]; “sensor comprises an infrared camera and the method further comprises: A) reading, via the at least one computing device, a plurality of infrared frames from the infrared camera; B) identifying, via the at least one computing device, an area with a thermal variance in at least one of the plurality of infrared frames;”)
wherein the received sensor data comprises infrared image data ([0028]; “sensor comprises an infrared camera and the method further comprises: A) reading, via the at least one computing device, a plurality of infrared frames from the infrared camera; B) identifying, via the at least one computing device, an area with a thermal variance in at least one of the plurality of infrared frames;”), and
wherein the processing the received sensor data comprises determining whether the infrared image data includes a shape of an object with a temperature greater than a predetermined threshold. ([0028]; “C) monitor a temperature of the area with the thermal variance, wherein determining that the particular event has occurred further comprises determining that the temperature of the area exceeds a temperature threshold.”)
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the combination of Maki, Shin, and Alon with Moeller. This modification would have been obvious because Maki, Shin, Alon, and Moeller all cover subject matter within the same field of endeavor (vehicle event detection and analysis), and it would have been beneficial to include infrared image data/processing with the vehicle data/processing found in Maki in order to gather more information regarding the event. Adding additional data would provide the investigator with more evidence that may be crucial to their investigation.
Regarding claim 19, Maki in combination with Shin and Alon teaches all of the limitations of claim 15. Additionally, Moeller in the same field of endeavor teaches processing the received sensor data, ([0028]; “sensor comprises an infrared camera and the method further comprises: A) reading, via the at least one computing device, a plurality of infrared frames from the infrared camera; B) identifying, via the at least one computing device, an area with a thermal variance in at least one of the plurality of infrared frames;”)
wherein the received sensor data comprises infrared image data, and ([0028]; “sensor comprises an infrared camera and the method further comprises: A) reading, via the at least one computing device, a plurality of infrared frames from the infrared camera; B) identifying, via the at least one computing device, an area with a thermal variance in at least one of the plurality of infrared frames;”)
wherein the processing the received sensor data comprises determining whether the infrared image data includes a shape of an object with a temperature greater than a predetermined threshold. ([0028]; “C) monitor a temperature of the area with the thermal variance, wherein determining that the particular event has occurred further comprises determining that the temperature of the area exceeds a temperature threshold.”)
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the combination of Maki, Shin, and Alon with Moeller. This modification would have been obvious because Maki, Shin, Alon, and Moeller all cover subject matter within the same field of endeavor (vehicle event detection and analysis), and it would have been beneficial to include infrared image data/processing with the vehicle data/processing found in Maki in order to gather more information regarding the event. Adding additional data would provide the investigator with more evidence that may be crucial to their investigation.
Claims 6, 13, and 20 are rejected under 35 U.S.C. 103 as being unpatentable over Maki as evidenced by Shin and Alon as applied to claims 1, 8, and 15 above, and further in view of Castano et al. (US 2021/0097784; hereafter Castano).
Castano was cited in the previous Office action.
Regarding claim 6, Maki in combination with Shin and Alon teaches all of the limitations of claim 1. Additionally, Castano in the same field of endeavor teaches processing the received sensor data, ([0027]; “The image sensor 108 may detect image data corresponding to light in one or more of the visible spectrum, the infrared spectrum, the ultraviolet spectrum, or any other light spectrum.”)
wherein the received sensor data comprises LiDAR sensor data ([0027]; “The image sensor 108 may include… ranging (LIDAR) sensor”), and
wherein the processing the received sensor data comprises determining whether the event is captured based on the LiDAR sensor data. ([0095]; “…the server may identify that image data immediately before and immediately after the event would assist in analyzing the vehicle event.”)
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the combination of Maki, Shin, and Alon with Castano. This modification would have been obvious because Maki, Shin, Alon, and Castano all cover subject matter within the same field of endeavor (vehicle event detection and analysis), and it would have been beneficial to include LiDAR sensor data/processing with the vehicle data/processing found in Maki in order to gather more information regarding the event. Adding additional data would provide the investigator with more evidence that may be crucial to their investigation.
Regarding claim 13, Maki in combination with Shin and Alon teaches all of the limitations of claim 8. Additionally, Castano in the same field of endeavor teaches the at least one programmed processor is further configured to execute the instructions to:
process the received sensor data, ([0027]; “The image sensor 108 may detect image data corresponding to light in one or more of the visible spectrum, the infrared spectrum, the ultraviolet spectrum, or any other light spectrum.”)
wherein the received sensor data comprises LiDAR sensor data, and ([0027]; “The image sensor 108 may include… ranging (LIDAR) sensor”)
wherein the processing the received sensor data comprises determining whether the event is captured based on the LiDAR sensor data. ([0095]; “…the server may identify that image data immediately before and immediately after the event would assist in analyzing the vehicle event.”)
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the combination of Maki, Shin, and Alon with Castano. This modification would have been obvious because Maki, Shin, Alon, and Castano all cover subject matter within the same field of endeavor (vehicle event detection and analysis), and it would have been beneficial to include LiDAR sensor data/processing with the vehicle data/processing found in Maki in order to gather more information regarding the event. Adding additional data would provide the investigator with more evidence that may be crucial to their investigation.
Regarding claim 20, Maki in combination with Shin and Alon teaches all of the limitations of claim 15. Additionally, Castano in the same field of endeavor teaches processing the received sensor data, ([0027]; “The image sensor 108 may detect image data corresponding to light in one or more of the visible spectrum, the infrared spectrum, the ultraviolet spectrum, or any other light spectrum.”)
wherein the received sensor data comprises LiDAR sensor data, and ([0027]; “The image sensor 108 may include… ranging (LIDAR) sensor”)
wherein the processing the received sensor data comprises determining whether the event is captured based on the LiDAR sensor data. ([0095]; “…the server may identify that image data immediately before and immediately after the event would assist in analyzing the vehicle event.”)
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the combination of Maki, Shin, and Alon with Castano. This modification would have been obvious because Maki, Shin, Alon, and Castano all cover subject matter within the same field of endeavor (vehicle event detection and analysis), and it would have been beneficial to include LiDAR sensor data/processing with the vehicle data/processing found in Maki in order to gather more information regarding the event. Adding additional data would provide the investigator with more evidence that may be crucial to their investigation.
Claim 21 is rejected under 35 U.S.C. 103 as being unpatentable over Maki as evidenced by Shin and Alon as applied to claim 1 above, and further in view of Toda (JP2009020774A; hereafter Toda).
Regarding claim 21, Maki in combination with Shin and Alon teaches all of the limitations of claim 1. Maki briefly mentions, by reference to Toda, that in the event of an accident data is requested and received from vehicles near the incident. ([0003]; “The drive recorder of Patent Document 2, in the event of an accident, not only stores the video data collected by the vehicle's own vehicle's camera, but also requests video data from drive recorders installed in nearby vehicles and from surveillance cameras on the road, and stores the received video data.”
[0005]; “JP 2009-20774 A”)
Toda, within the same field of endeavor, explicitly teaches that the plurality of vehicles are vehicles that were located within a threshold distance of the location of the event at the time of the event. ([0037]; “The control unit 33 sounds the buzzer 32 when the emergency switch 31 is pressed, and also outputs a request signal REQ to the in-vehicle device 10 or the fixed device 20 present in the DSRC area to request the transmission of their radio call number WCN and captured video data DATA, and has the function of writing the radio call number WCN etc. received from the in-vehicle device 10 or the fixed device 20 to the video data memory unit 36.”
Note: DSRC (Dedicated Short Range Communication) is known to send signals within a set range.)
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the combination of Maki, Shin, and Alon with Toda. This modification would have been obvious because Maki, Shin, Alon, and Toda all cover subject matter within the same field of endeavor (vehicle event detection and analysis), and it would have been obvious that, when receiving data from a plurality of vehicles, those vehicles would be within a set range, as only vehicles within a certain range would have relevant data.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to BRANDON SUNG EUN LEE whose telephone number is (571)272-5684. The examiner can normally be reached Monday - Friday 9:00 am - 5:00 pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, James Lee can be reached on (571) 270-5965. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/B.S.L./Examiner, Art Unit 3668
/JAMES J LEE/Supervisory Patent Examiner, Art Unit 3668