Prosecution Insights
Last updated: April 19, 2026
Application No. 18/609,305

Map Data Processing Method and Apparatus

Final Rejection §103
Filed: Mar 19, 2024
Examiner: WU, MING HAN
Art Unit: 2618
Tech Center: 2600 — Communications
Assignee: Shenzhen Yinwang Intelligent Technologies Co., Ltd.
OA Round: 2 (Final)
Grant Probability: 76% (Favorable)
Expected OA Rounds: 3-4
Time to Grant: 2y 8m
With Interview: 99%

Examiner Intelligence

Career Allow Rate: 76% (282 granted / 370 resolved; +14.2% vs TC avg) — above average
Interview Lift: +23.3% (resolved cases with interview vs without)
Typical Timeline: 2y 8m average prosecution; 35 applications currently pending
Career History: 405 total applications across all art units
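As a sanity check, the headline percentages above can be reproduced from the raw counts. A minimal sketch; the rule for combining the base rate with the interview lift (adding the reported +23.3% to the rounded base rate, capped at 100%) is an assumption about how the dashboard derives its "with interview" figure:

```python
# Career allow rate from the raw counts shown above
granted, resolved = 282, 370
base = round(granted / resolved * 100)   # 76 (76.2% before rounding)

# Interview-adjusted probability: base rate plus the reported +23.3%
# interview lift, capped at 100% (assumed combination rule)
with_interview = min(base + 23.3, 100.0)

print(base)                   # 76
print(round(with_interview))  # 99
```

Both printed values match the dashboard's "76%" and "99% With Interview" figures under that assumption.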

Statute-Specific Performance

§101: 7.8% (-32.2% vs TC avg)
§103: 68.3% (+28.3% vs TC avg)
§102: 2.1% (-37.9% vs TC avg)
§112: 12.6% (-27.4% vs TC avg)
Tech Center average values are estimates. Based on career data from 370 resolved cases.
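One way to read the per-statute deltas: assuming each delta is simply the examiner's rate minus the Tech Center average, the implied baseline can be recovered by subtraction — and it comes out to 40.0% for every statute, suggesting the deltas were computed against a single aggregate baseline rather than per-statute averages:

```python
# Examiner's per-statute rejection rates and their deltas vs TC average,
# taken from the table above
stats = {
    "101": (7.8, -32.2),
    "103": (68.3, 28.3),
    "102": (2.1, -37.9),
    "112": (12.6, -27.4),
}

# Implied TC average = examiner rate - delta (assumed definition)
implied = {s: round(rate - delta, 1) for s, (rate, delta) in stats.items()}
print(implied)  # every statute implies the same 40.0% baseline
```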

Office Action

§103
DETAILED ACTION

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102 of this title, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries set forth in Graham v. John Deere Co., 383 U.S. 1, 148 USPQ 459 (1966), that are applied for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claims 1-18, 19, and 21 are rejected under 35 U.S.C. 103 as being unpatentable over Kim (US 2011/0234631 A1) in view of Krumm et al. (US 2007/0006098 A1) and Markey et al. (US 9,734,722 B1).

Regarding claim 1, see the rejection of claim 21.

Regarding claim 2, see the rejection of claim 12.

Regarding claim 3, see the rejection of claim 13.
Regarding claim 4, see the rejection of claim 14.

Regarding claim 5, see the rejection of claim 15.

Regarding claim 6, see the rejection of claim 16.

Regarding claim 7, Kim in view of Krumm and Markey discloses all the limitations of claim 6. Kim discloses receiving a shadow display trigger instruction ([0055] - In block 835, a VO registration unit of the AR system selects and registers a virtual object (VO) with the real world image, and in block 840, a shadow image registration unit of the AR system generates a shadow image(s) for the selected VO based on at least one of the light source information, the pose information, the weather information, and/or the geographical information.); and superimposing, based on the shadow display trigger instruction, the shadow information on other information ([0055] - In block 835, a VO registration unit of the AR system selects and registers a virtual object (VO) with the real world image, and in block 840, a shadow image registration unit of the AR system generates a shadow image(s) for the selected VO based on at least one of the light source information, the pose information, the weather information, and/or the geographical information. In block 845, the AR image generating unit of the AR system generates an AR image by superimposing the captured real world image with the image(s) of the virtual object(s) and its shadow image(s).). Krumm discloses an instruction from a user ([0069] - At 712, device operation and the presentation of data can be initiated according to manual selections of the user.) and information of a map for display ([0050] - which GPS fix node 322 also receives data from the GPS shadow node 306. Processing this received information, the GPS fix node 322 can compute the user location also considering the GPS shadow information. The GPS fix node 322 also passes data to the GPS reliability node 312. See Fig. 9.)
Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to modify Kim in view of Krumm with an instruction from a user and information of a map for display, as taught by Krumm. The motivation for doing so is to enable obtaining information/data more efficiently.

Regarding claim 8, Kim in view of Krumm and Markey discloses all the limitations of claim 6. Kim discloses wherein a first display region on a display interface corresponding to the shadow region has a different grayscale relative to a second display region on the display interface, a different color relative to the second display region, a different saturation relative to the second display region, or a different superimposed pattern relative to the second display region ([0044] - the shadow image registration unit may generate darker and more clearly-defined shadow image(s) for real-world images captured in rural areas and lighter and blurry shadow image(s) for real-world images captured in downtown areas. Clouds in cloudy weather and high-storey buildings of downtown areas may scatter the rays from the sun, thereby preventing casting of a clearly-defined dark shadow. The shadow image registration unit may further consider the weather information and/or the geographical information in performing shading operations on a registered image of a virtual object(s). [0017] - AR generator 120 may be configured to generate a virtual shadow image(s) of the virtual object(s) whose image(s) are to be overlaid onto the real-world image captured by image capture unit 110, "first display region, second display region". [0014] - FIG. 1, an AR system 100 may include an image capture unit 110 configured to capture a real-world image, an AR generator 120 configured to generate an AR image by overlaying the captured real-world image with the image(s) of one or more virtual object(s) and their respective virtual shadow images, and a display unit 130 configured to display the augmented reality image generated by AR generator 120.).

Regarding claim 9, Kim in view of Krumm and Markey discloses all the limitations of claim 6. Kim discloses wherein the shadow information further ([0019] - to generate the virtual shadow images based on the estimated location. In one embodiment, AR generator 120 may be configured to estimate the position of the real-world light source based on the location, the time, and the date the real-world image was captured by image capture unit 110. AR generator 120 may at least partially obtain such information on the location, the time, and/or the date from camera unit 110 and/or an external device (e.g., a server). The technical details on (a) estimating the location of the sun and (b) generating virtual shadow images and AR images. [0045] - there may be instances where the pose (and thus, the point of view) of an image capture unit is changed by a user or by some other means. An AR generator may track such changes in the pose of an image capture unit (e.g., 110) and re-register a registered virtual object (e.g., update the relationship between a camera reference frame (e.g., x.sub.c, y.sub.c, and z.sub.c) and a real-world reference frame (e.g., x.sub.w, y.sub.w, and z.sub.w)). A shadow image registration unit (e.g., 520) of the AR generator may generate a new virtual shadow image based on the re-registration. The VO registration unit may perform tracking by periodically or intermittently receiving pose information updates from a pose detection unit (e.g., 320) installed in the image capture unit, "display a change of the shadow". [0014] - FIG. 1, an AR system 100 may include an image capture unit 110 configured to capture a real-world image, an AR generator 120 configured to generate an AR image by overlaying the captured real-world image with the image(s) of one or more virtual object(s) and their respective virtual shadow images, and a display unit 130 configured to display the augmented reality image generated by AR generator 120.).

Regarding claim 10, see the rejection of claim 20.

Regarding claim 11, see the rejection of claim 21.

Regarding claim 12, Kim in view of Krumm and Markey discloses all the limitations of claim 11. Kim discloses to cause the apparatus to: generate the shadow information; or ([0002] - The device further includes a shadow image registration unit that receives the light source information generated from the light source information generating unit. The shadow image registration unit generates a shadow image of a virtual object overlaid onto the real-world image based on the light source information generated from the light source information generating unit.).

Regarding claim 13, Kim in view of Krumm and Markey discloses all the limitations of claim 11. Kim discloses wherein the shadow information further comprises at least one of: shadow degree information indicating a shadow degree of the shadow region; confidence level information indicating a reliability degree of the shadow region; or ([0002] - the shadow image registration unit generates a shadow image of a virtual object overlaid onto the real-world image based on the light source information generated from the light source information generating unit. [0055] - In block 835, a VO registration unit of the AR system selects and registers a virtual object (VO) with the real world image, and in block 840, a shadow image registration unit of the AR system generates a shadow image(s) for the selected VO based on at least one of the light source information, the pose information, the weather information, and/or the geographical information. [0017] - AR generator 120 may be configured to generate a virtual shadow image(s) of the virtual object(s) whose image(s) are to be overlaid onto the real-world image captured by image capture unit 110. The virtual object(s) may be pre-stored in AR generator 120, or may be received by AR generator 120 from an external device (e.g., a server). In one embodiment, AR generator 120 may be configured to generate virtual shadow images whose size, shape, direction, and/or intensity conform to or are consistent with the real-world shadow images of real objects in the real-world image.).

Regarding claim 14, Kim in view of Krumm and Markey discloses all the limitations of claim 11. Krumm discloses to further store the information in a data structure for storing an event in the map ([0048] - The location node 304 connects to provide lat/long/alt data to a GPS shadow node 306, which data can be used to access GPS shadow data (or maps) for associated lat/long locations, and structures at those locations. The GPS shadow mapping data is stored in a GPS shadow log store 307.). Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to modify Kim in view of Krumm to store the information in a data structure for storing an event in the map, as taught by Krumm. The motivation for doing so is to enable obtaining information/data more efficiently.

Regarding claim 15, Kim in view of Krumm and Markey discloses all the limitations of claim 11. Kim discloses that the processor is further configured to execute the instructions to cause the apparatus to send the shadow information ([0044] - wherein the shadow image registration unit is further configured to receive at least one of weather information and geographical information for the real-world image from a server).

Regarding claim 16, Kim in view of Krumm and Markey discloses all the limitations of claim 11. Kim discloses to display the shadow region based on the shadow information ([0014] - FIG. 1, an AR system 100 may include an image capture unit 110 configured to capture a real-world image, an AR generator 120 configured to generate an AR image by overlaying the captured real-world image with the image(s) of one or more virtual object(s) and their respective virtual shadow images, and a display unit 130 configured to display the augmented reality image generated by AR generator 120.).

Regarding claim 17, Kim in view of Krumm and Markey discloses all the limitations of claim 16. Kim discloses receive a shadow display trigger instruction ([0055] - In block 835, a VO registration unit of the AR system selects and registers a virtual object (VO) with the real world image, and in block 840, a shadow image registration unit of the AR system generates a shadow image(s) for the selected VO based on at least one of the light source information, the pose information, the weather information, and/or the geographical information.); and superimpose ([0055] - In block 835, a VO registration unit of the AR system selects and registers a virtual object (VO) with the real world image, and in block 840, a shadow image registration unit of the AR system generates a shadow image(s) for the selected VO based on at least one of the light source information, the pose information, the weather information, and/or the geographical information. In block 845, the AR image generating unit of the AR system generates an AR image by superimposing the captured real world image with the image(s) of the virtual object(s) and its shadow image(s).). Krumm discloses an instruction from the user ([0069] - At 712, device operation and the presentation of data can be initiated according to manual selections of the user.) and information of the map for display ([0050] - which GPS fix node 322 also receives data from the GPS shadow node 306. Processing this received information, the GPS fix node 322 can compute the user location also considering the GPS shadow information. The GPS fix node 322 also passes data to the GPS reliability node 312. See Fig. 9.) Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to modify Kim in view of Krumm with an instruction from a user and information of a map for display, as taught by Krumm. The motivation for doing so is to enable obtaining information/data more efficiently.

Regarding claim 18, Kim in view of Krumm and Markey discloses all the limitations of claim 16. Kim discloses wherein a first display region on a display interface corresponding to the shadow region has a different grayscale relative to a second display region on the display interface, a different color relative to the second display region, a different saturation relative to the second display region, or a different superimposed pattern relative to the second display region ([0044] - the shadow image registration unit may generate darker and more clearly-defined shadow image(s) for real-world images captured in rural areas and lighter and blurry shadow image(s) for real-world images captured in downtown areas. Clouds in cloudy weather and high-storey buildings of downtown areas may scatter the rays from the sun, thereby preventing casting of a clearly-defined dark shadow. The shadow image registration unit may further consider the weather information and/or the geographical information in performing shading operations on a registered image of a virtual object(s). [0017] - AR generator 120 may be configured to generate a virtual shadow image(s) of the virtual object(s) whose image(s) are to be overlaid onto the real-world image captured by image capture unit 110, "first display region, second display region". [0014] - FIG. 1, an AR system 100 may include an image capture unit 110 configured to capture a real-world image, an AR generator 120 configured to generate an AR image by overlaying the captured real-world image with the image(s) of one or more virtual object(s) and their respective virtual shadow images, and a display unit 130 configured to display the augmented reality image generated by AR generator 120.), wherein the shadow information further comprises time information indicating a time period in which the shadow region exists, and wherein the processor is further configured to execute the instructions to cause the apparatus to display the change of the shadow region based on the time information ([0019] - to generate the virtual shadow images based on the estimated location. In one embodiment, AR generator 120 may be configured to estimate the position of the real-world light source based on the location, the time, and the date the real-world image was captured by image capture unit 110. AR generator 120 may at least partially obtain such information on the location, the time, and/or the date from camera unit 110 and/or an external device (e.g., a server). The technical details on (a) estimating the location of the sun and (b) generating virtual shadow images and AR images. [0045] - there may be instances where the pose (and thus, the point of view) of an image capture unit is changed by a user or by some other means. An AR generator may track such changes in the pose of an image capture unit (e.g., 110) and re-register a registered virtual object (e.g., update the relationship between a camera reference frame (e.g., x.sub.c, y.sub.c, and z.sub.c) and a real-world reference frame (e.g., x.sub.w, y.sub.w, and z.sub.w)). A shadow image registration unit (e.g., 520) of the AR generator may generate a new virtual shadow image based on the re-registration. The VO registration unit may perform tracking by periodically or intermittently receiving pose information updates from a pose detection unit (e.g., 320) installed in the image capture unit, "display a change of the shadow". [0014] - FIG. 1, an AR system 100 may include an image capture unit 110 configured to capture a real-world image, an AR generator 120 configured to generate an AR image by overlaying the captured real-world image with the image(s) of one or more virtual object(s) and their respective virtual shadow images, and a display unit 130 configured to display the augmented reality image generated by AR generator 120.).

Claim 19: (Cancelled)

Regarding claim 20, Kim in view of Krumm and Markey discloses all the limitations of claim 11. Krumm discloses to obtain travel information ([0080] - Referring now to FIG. 15, there is illustrated a methodology of context mapping. A third application builds a map and clickable links to help automatically annotate a trip based on GPS coordinates. At 1500, a GPS-enabled computing device is provided. It is not required that the GPS capability be integrated into the computing device; however, such combined capabilities provide a much more convenient application of the subject invention. At 1502, the computing device continually monitors its location using GPS, and processes the GPS signals to generate lat/long coordinates. At 1504, the device user is identified and associated with the location data. At 1506, the computing device is employed to access a network of existing websites (e.g., the Internet), using the lat/long coordinates as search terms to find websites that process the coordinates. User preferences, as accessed, can also be used to filter and further define the user context at this location. In other words, for each set of coordinates, the user context can be defined in terms of nearby streets, nearby businesses, environment, and so on. At 1508, the context information is used to generate a map of where the user has been, and in more robust implementations, predictions on where the user is likely to head. At 1510, the map can be annotated according to user preferences, and stored.). Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to modify Kim in view of Krumm to obtain travel information, as taught by Krumm.

Regarding claim 21, Kim discloses non-transitory computer-executable instructions that, when executed by a processor, cause an apparatus to ([0059], [0048], Fig. 7A - AR systems, instructions stored in memory and executed by the processor to perform:): obtain shadow information that indicates a shadow region of a location and that comprises location information, wherein the location information indicates a geographical location of the shadow region ([0045] - In addition, there may be instances where the pose (and thus, the point of view) of an image capture unit is changed by a user or by some other means. In one embodiment, an AR generator may track such changes in the pose of an image capture unit (e.g., 110) and re-register a registered virtual object (e.g., update the relationship between a camera reference frame (e.g., x.sub.c, y.sub.c, and z.sub.c) and a real-world reference frame (e.g., x.sub.w, y.sub.w, and z.sub.w)). A shadow image registration unit (e.g., 520) of the AR generator may generate a new virtual shadow image based on the re-registration. [0055] - a shadow image registration unit of the AR system generates a shadow image(s) for the selected VO based on at least one of the light source information, the pose information, the weather information, and/or the geographical information. In block 845, the AR image generating unit of the AR system generates an AR image by superimposing the captured real world image with the image(s) of the virtual object(s) and its shadow image(s).).
Kim does not disclose storing the information as map data. However, Krumm discloses storing the information as map data ([0048] - The location node 304 connects to provide lat/long/alt data to a GPS shadow node 306, which data can be used to access GPS shadow data (or maps) for associated lat/long locations, and structures at those locations. The GPS shadow mapping data is stored in a GPS shadow log store 307.). Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to modify Kim with a shadow region on a map and storing the shadow information as map data, as taught by Krumm. The motivation for doing so is to enable obtaining information/data more efficiently.

Kim in view of Krumm does not disclose a shadow region on a map. However, Markey discloses a shadow region on a map (column 17, lines 15-45 - the graphics rendering engine 230 and the airplane modeling engine 210 operate in concert to generate an image representative of a shadow of the airplane 102 on the map image, "shadow". Shadow on a map.); store the shadow information (column 17, lines 15-45 - the graphics rendering engine 230 and the airplane modeling engine 210 operate in concert to generate an image representative of a shadow of the airplane 102 on the map image, "shadow". Shadow on a map stored in computer memory, implied by the display, Fig. 14A.); and dynamically display, on the map, a change of the shadow region over a predefined time period, wherein a playback progress of the change of the shadow region is controlled by a user of the map (column 17, lines 15-45 - the graphics rendering engine 230 and the airplane modeling engine 210 operate in concert to generate an image representative of a shadow of the airplane 102 on the map image, "shadow". Column 9, lines 10-15 and column 28, lines 34-60 - the graphics rendering engine 230 may operate in concert with the progress slider engine 226 to generate an image of a progress indicator bar with a slider as illustrated by elements 1408, 1410, and 1428 of FIGS. 14A and 14D. In particular, a highlighted portion 1090 of the progress indicator bar 1408 may indicate a percentage of travel that has been completed by the airplane 102. The slider may be used to view a past position, a current position, and/or a future position of the airplane that is in-flight. So the user can use the slider to view a past position of the airplane, "wherein a playback progress of the change of the shadow region is controlled by a user of the map"; further, a predefined time period can be read because the user can use the slider to slide back to a predefined time period to view the changes of the shadow region. The slider enables the dynamic display as the user slides left and right.).

Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to modify Kim in view of Krumm with a shadow region on a map; storing the shadow information; and dynamically displaying, on the map, a change of the shadow region over a predefined time period, wherein a playback progress of the change of the shadow region is controlled by a user of the map, as taught by Markey. The motivation for doing so is to enhance the user experience.

Response to Arguments

Claim Rejections Under 35 U.S.C. 103

Applicant asserts: "Claims 1-19 and 21 are allowable over the combination of Kim and Krumm because the combination of Kim and Krumm fails to dynamically display, on the map, a change of the shadow region with time over a predefined time period, wherein a playback progress of the change of the shadow region is controlled by a user of the map. Claim 1 reads: 1. A method comprising: obtaining shadow information that indicates a shadow region of a location, and that comprises location information, wherein the location information indicates a geographical location of the shadow region on a map; storing the shadow information as map data; and dynamically displaying, on the map, a change of the shadow region with time over a predefined time period, wherein a playback progress of the change of the shadow region is controlled by a user of the map."

The argument has been fully considered and is persuasive. Therefore, the rejection has been withdrawn. However, upon further consideration, a new ground(s) of rejection is made in view of the Markey reference.

Regarding claims 2-10, 12-18, and 20, Applicant asserts that they are not obvious based on their dependency from independent claims 1 and 11, respectively. The examiner respectfully cannot concur with Applicant, for the same reasons noted in the examiner's response to the arguments asserted for claims 1 and 11, respectively.

Conclusion

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any extension fee pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action.
In no event, however, will the statutory period for reply expire later than SIX MONTHS from the date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to Ming Wu, whose telephone number is (571) 270-0724. The examiner can normally be reached Monday-Thursday and alternate Fridays, 9:30 am - 6:00 pm EST.

Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Devona Faulk, can be reached at 571-272-7515. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/Ming Wu/
Primary Examiner, Art Unit 2616
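For context on the technology at issue: the cited references turn on estimating the sun's position from location, date, and time, and then casting a shadow for an object accordingly. The idea can be sketched with the standard declination/hour-angle approximation; this is an illustrative simplification, not code from any cited reference:

```python
import math

def solar_elevation_deg(lat_deg, day_of_year, solar_hour):
    """Approximate solar elevation (degrees) for a latitude, day of
    year, and local solar time, using the standard declination /
    hour-angle approximation (accurate to roughly a degree)."""
    decl = -23.44 * math.cos(math.radians(360.0 / 365.0 * (day_of_year + 10)))
    hour_angle = 15.0 * (solar_hour - 12.0)  # degrees away from solar noon
    lat, dec, ha = (math.radians(x) for x in (lat_deg, decl, hour_angle))
    sin_el = (math.sin(lat) * math.sin(dec)
              + math.cos(lat) * math.cos(dec) * math.cos(ha))
    return math.degrees(math.asin(sin_el))

def shadow_length(object_height, elevation_deg):
    """Length of the shadow cast on flat ground by a vertical object."""
    if elevation_deg <= 0:
        return math.inf  # sun at or below the horizon: no cast shadow
    return object_height / math.tan(math.radians(elevation_deg))

# A 10 m object at 40°N at solar noon near the June solstice (day 172):
# the sun is high (~73°), so the shadow is short (~3 m)
el = solar_elevation_deg(40.0, 172, 12.0)
print(f"elevation {el:.1f} deg, shadow {shadow_length(10.0, el):.2f} m")
```

A shadow region on a map, as claimed, would then follow from projecting such lengths along the solar azimuth for each obstruction; the time-series of these projections is what a playback slider would animate.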

Prosecution Timeline

Mar 19, 2024
Application Filed
Apr 09, 2024
Response after Non-Final Action
Sep 19, 2025
Non-Final Rejection — §103
Nov 26, 2025
Response Filed
Jan 27, 2026
Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12597109
SYSTEMS AND METHODS FOR GENERATING THREE-DIMENSIONAL MODELS USING CAPTURED VIDEO
Granted Apr 07, 2026 (2y 5m to grant)
Patent 12579702
METHOD AND SYSTEM FOR ADAPTING A DIFFUSION MODEL
Granted Mar 17, 2026 (2y 5m to grant)
Patent 12579623
IMAGE PROCESSING METHOD AND APPARATUS, ELECTRONIC DEVICE, AND READABLE STORAGE MEDIUM
Granted Mar 17, 2026 (2y 5m to grant)
Patent 12567185
Method and system of creating and displaying a visually distinct rendering of an ultrasound image
Granted Mar 03, 2026 (2y 5m to grant)
Patent 12548202
TEXTURE COORDINATE COMPRESSION USING CHART PARTITION
Granted Feb 10, 2026 (2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 76%
With Interview: 99% (+23.3%)
Median Time to Grant: 2y 8m
PTA Risk: Moderate
Based on 370 resolved cases by this examiner. Grant probability derived from career allow rate.
