Prosecution Insights
Last updated: April 19, 2026
Application No. 18/292,408

INFORMATION PROCESSING DEVICE, INFORMATION PROCESSING METHOD, AND PROGRAM

Final Rejection — §101, §103
Filed: Jan 26, 2024
Examiner: DHINGRA, PAWANDEEP
Art Unit: 2683
Tech Center: 2600 — Communications
Assignee: Sony Group Corporation
OA Round: 2 (Final)

Grant Probability: 60% (Moderate)
Expected OA Rounds: 3-4
Time to Grant: 3y 4m
With Interview: 77%

Examiner Intelligence

Grants 60% of resolved cases.

Career Allow Rate: 60% (289 granted / 485 resolved; -2.4% vs TC avg)
Interview Lift: +17.0% (strong; allowance with vs. without an interview, among resolved cases with an interview)
Avg Prosecution (typical timeline): 3y 4m
Total Applications: 505 across all art units; 20 currently pending
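The headline figures in this panel are simple ratios. A minimal Python sketch, using only the counts and the 77% with-interview rate shown on this page, reproduces how they line up:

```python
# Reproduce the examiner's headline metrics from the counts shown above.
# The 77% with-interview allowance rate is the page's own figure; the
# "interview lift" is just its difference from the displayed career rate.
granted, resolved = 289, 485

career_allow_rate = granted / resolved   # 0.5959... -> displayed as 60%
with_interview_rate = 0.77               # page value for interviewed cases

interview_lift = with_interview_rate - round(career_allow_rate, 2)

print(f"Career allow rate: {career_allow_rate:.1%}")  # 59.6%
print(f"Interview lift:    {interview_lift:+.1%}")    # +17.0%
```

The career rate rounds to the 60% shown in the panel, and 77% minus 60% gives the +17.0% lift quoted above.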

Statute-Specific Performance

§101: 11.1% (-28.9% vs TC avg)
§103: 62.7% (+22.7% vs TC avg)
§102: 9.1% (-30.9% vs TC avg)
§112: 11.9% (-28.1% vs TC avg)
Black line = Tech Center average estimate • Based on career data from 485 resolved cases
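If the "vs TC avg" deltas above are read as plain percentage-point differences (an assumption; the page does not define them), each statute row implies the same Tech Center baseline, which is presumably the chart's black line:

```python
# Back out the implied Tech Center average from each statute's rate and its
# "vs TC avg" delta, assuming the delta is a simple percentage-point
# difference. All input values are taken from the panel above.
rates  = {"101": 11.1, "103": 62.7, "102": 9.1, "112": 11.9}
deltas = {"101": -28.9, "103": 22.7, "102": -30.9, "112": -28.1}

implied_tc_avg = {s: round(rates[s] - deltas[s], 1) for s in rates}
print(implied_tc_avg)
# {'101': 40.0, '103': 40.0, '102': 40.0, '112': 40.0}
```

Every row back-computes to a 40.0% baseline, consistent with the panel's note that the black line is an estimate rather than a per-statute figure.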

Office Action

Grounds: §101, §103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Status of Claims

Claims 1-10 are pending.

Claim Rejections - 35 USC § 101

The previous §101 rejection(s) have been withdrawn in view of the amendments made by the applicant.

Response to Arguments

Applicant's arguments filed 02/09/2026 have been fully considered but are not persuasive. Applicant argues on pages 8-11 of the remarks that Katsu and Teppei each fail, as individual references, to teach all the limitations of newly amended claim 1 (Katsu having only a moisture sensor, and Teppei only detecting the cross section of cut ingredients), and that they therefore do not teach all the limitations of claim 1 alone or in combination. In reply, the examiner notes that this is a §103 combination rejection, not a §102 rejection, and that the combination of Katsu with Teppei teaches all the limitations of claim 1. Katsu teaches a moisture content sensor 7A for detecting the moisture content of the food by detecting the characteristic of a section of the food (paragraphs 34, 36), with the heating time and the finish set temperature determined according to the moisture content of the food. In addition, the microwave oven 1 determines not only the time and temperature but also the type of food (things to be baked from dough or those that are only browned) and the amount, based on the amount of water in the food (paragraph 65).
Teppei, in turn, teaches the ingredients being actually cut by the cook, where the cut size is the size of the food piece produced by the cook actually cutting the food, based on the amount of ingredients actually prepared by the cook and on "information regarding cooking in progress," and the cooking time is determined based on the amount of the ingredients actually prepared by the cook and the size of the ingredient pieces produced by the cutting (paragraph 21). Applicant's remaining arguments regarding the dependent claims are moot, as they rely on the same points. Moreover, claim 4 is now rejected on new grounds of rejection.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1-3, 5-6 and 8-10 are rejected under 35 U.S.C.
103 as being unpatentable over Noda Katsu, JP 2004-084992, in view of Sakamoto Teppei et al., JP 2020-166557.

Regarding claim 1, Katsu discloses an information processing device (microwave oven 1, paragraph 31) comprising: a moisture sensor configured to sense a section of an ingredient (moisture content sensor 7A for detecting the moisture content of the food by detecting the characteristic of the section of food, paragraphs 34, 36); and a processor unit (control circuit 25, which controls the entire operation of microwave oven 1, paragraph 33) configured to detect a moisture amount of the ingredient based on a sensing result of the ingredient (the moisture content of the food ingredient is detected by sensing the characteristic of the food in the heating chamber 10, paragraphs 34, 36), and calculate a heating cooking time and a heating temperature of the ingredient based on the moisture amount of the ingredient (the heating time and the finish set temperature are determined according to the moisture content of the food; in addition, the microwave oven 1 determines not only the time and temperature but also the type of food (things to be baked from dough or those that are only browned) and the amount, based on the amount of water in the food, paragraph 65).

Katsu fails to explicitly teach sensing/determining a cross section of a cut ingredient, wherein the cut ingredient is cut based on progress of a cooking process; detecting the cut ingredient based on the cross section of the cut ingredient; and calculating a heating cooking time of the cut ingredient based on the cut ingredient.
However, Teppei teaches sensing/determining a cross section of a cut ingredient, wherein the cut ingredient is cut based on progress of a cooking process (the ingredients are actually cut by the cook, where the cut size is the size of the food piece produced by the cook actually cutting the food, based on the amount of ingredients actually prepared by the cook and on "information regarding cooking in progress," paragraph 21); detecting the cut ingredient based on the cross section of the cut ingredient (see paragraph 21, as explained above); and calculating a heating cooking time of the cut ingredient based on the number/size of the cut ingredient (the cooking time is determined based on the amount of the ingredients actually prepared by the cook and the size of the ingredient pieces produced by the cutting, paragraph 21).

Katsu and Teppei are combinable because both are in the same field of endeavor, dealing with heating cooking time based on ingredients. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the application to combine the teachings of Katsu with the teachings of Teppei for the benefit of user convenience, such that an appropriate cooking time can be estimated according to the actual cooking situation, as taught by Teppei at paragraph 13.
Regarding claim 2, the combination of Katsu and Teppei further teaches wherein the processor unit is further configured to calculate the heating cooking time and the heating temperature based on the moisture amount immediately before the cut ingredient (Teppei, cut ingredient, paragraph 21) is heated and cooked (Katsu: the microwave oven 1 determines time and temperature for things to be baked from dough, i.e., before cooking has even started, based on the amount of water in the food, to determine the actual cooking menu when cooking starts, paragraph 65). Katsu and Teppei are combinable because both are in the same field of endeavor, dealing with heating cooking time based on ingredients. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the application to combine the teachings of Katsu with the teachings of Teppei for the benefit of user convenience, such that an appropriate cooking time can be estimated according to the actual cooking situation, as taught by Teppei at paragraph 13.
Regarding claim 3, the combination of Katsu and Teppei further teaches wherein the processor unit is further configured to: detect a size of the cut ingredient based on a cross-sectional image and a cut width of the cut ingredient (Teppei: the cut size is the size of the food piece produced by the cook actually cutting the food; the cook takes a picture of a set of ingredients placed on a cutting board with camera 7, and cut size specifying unit 252 applies the identification model to the image data generated by camera 7 to detect each food piece and specify the cut size for the food material, paragraphs 21, 31-34); and calculate the heating cooking time and the heating temperature based on the size of the cut ingredient (Katsu: microwave oven 1 determines time and temperature based on the ingredients (type of food), paragraph 65; Teppei: the cooking time is determined based on the size of the ingredient pieces produced by the cook actually cutting the ingredients, paragraph 21). Katsu and Teppei are combinable because both are in the same field of endeavor, dealing with heating cooking time based on ingredients. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the application to combine the teachings of Katsu with the teachings of Teppei for the benefit of user convenience, such that an appropriate cooking time can be estimated according to the actual cooking situation, as taught by Teppei at paragraph 13.
Regarding claim 5, the combination of Katsu and Teppei further teaches wherein the processor unit is further configured to calculate the size of the cut ingredient based on calibration data of the image sensor (Teppei, camera) as reference information (Teppei: cut size specifying unit 252 applies the identification model to the image data generated by camera 7 to detect each food piece and calculates the size of the cut food pieces by using the ratio of the average number of pixels N1 calculated for the food piece to the actual average length L1 of the food piece, and the ratio of the number of pixels N2 calculated for the index finger to the actual length L2 of the index finger, such that the average length L1 of the food piece is calculated; the cut size specifying unit 252 stores the calculated average length L1 as the cut size in the work table T4 in association with the name of the cut food material, for specifying the reference value of the cut size of the food material with reference to the cooking process table T3, paragraphs 31-34). Katsu and Teppei are combinable because both are in the same field of endeavor, dealing with heating cooking time based on ingredients. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the application to combine the teachings of Katsu with the teachings of Teppei for the benefit of user convenience, such that an appropriate cooking time can be estimated according to the actual cooking situation, as taught by Teppei at paragraph 13.
Regarding claim 6, the combination of Katsu and Teppei further teaches wherein the processor unit is further configured to: determine a kind of a foodstuff based on an image of the foodstuff before being cut into the cut ingredient (Teppei: after shooting, the foodstuff quantity specifying unit 23 applies an identification model generated in advance by machine learning to the image data generated by camera 7 to specify the foodstuff name before cutting or cooking, paragraphs 28, 45); and calculate the heating cooking time and the heating temperature based on the kind of the foodstuff (Teppei: the ingredients to be cooked in the cooking process are identified, and, using the specified estimation model with the specified amount and cut size as inputs, the optimum cooking time for the specified foodstuff is estimated, paragraph 36; Katsu: the heating time and the finish set temperature are determined according to the type of food (things to be baked from dough or those that are only browned) to determine a cooking menu, paragraph 65). Katsu and Teppei are combinable because both are in the same field of endeavor, dealing with heating cooking time based on ingredients. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the application to combine the teachings of Katsu with the teachings of Teppei for the benefit of user convenience, such that an appropriate cooking time can be estimated according to the actual cooking situation, as taught by Teppei at paragraph 13.
Regarding claim 8, the combination of Katsu and Teppei further teaches wherein the processor unit is further configured to: receive a sensing result of a weight of the cut ingredient (Teppei: the food ingredient quantity specifying unit 23 specifies the quantity of meat among the ingredients prepared for cooking by the cook (for example, "200 g of beef"); the foodstuff name and quantity (weight, 200 g of beef) are stored in association with each other in work table T4; and cooking time estimation unit 27 refers to work table T4, identifies the ingredients to be cooked in the cooking process, and, using the specified estimation model, estimates the optimum cooking time for the specified foodstuff, paragraphs 28, 36); and calculate the heating cooking time and the heating temperature (Katsu: the heating time and the finish set temperature are determined according to the type of food to determine a cooking menu, paragraph 65) based on the sensing result of the weight of the cut ingredient (Teppei, paragraphs 28, 36, as cited above). Katsu and Teppei are combinable because both are in the same field of endeavor, dealing with heating cooking time based on ingredients.
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the application to combine the teachings of Katsu with the teachings of Teppei for the benefit of user convenience, such that an appropriate cooking time can be estimated according to the actual cooking situation, as taught by Teppei at paragraph 13.

Regarding claim 9, which is a method version of claim 1 and recites similar features, it is rejected on the same rationale. Regarding claim 10, which recites a non-transitory computer-readable medium version performing the method/steps of claim 1, see the rationale as applied above; note that execution of a non-transitory computer-readable medium performing a method is taught by Teppei at paragraph 25.

Claim 4 is rejected under 35 U.S.C. 103 as being unpatentable over Noda Katsu, JP 2004-084992, in view of Sakamoto Teppei et al., JP 2020-166557, as applied to claims 1 and 3 above, and further in view of Suemasu et al., US 2022/0354313, further in view of Pryor, US 2010/0182136.
Regarding claim 4, the combination of Katsu and Teppei further teaches wherein the processor unit is further configured to acquire, from an image sensor (Teppei, camera 7), information related to the cross-sectional image and the cut width of the cut ingredient (Teppei: cut size specifying unit 252 applies the identification model to the image data generated by camera 7 to detect each food piece and specify the cutting method and cut size of the foodstuffs produced by cutting, paragraphs 33, 49), wherein the image sensor is attached next to a surface of a measurement board and the cut ingredient is cut on the measurement board (Teppei: camera 7 is arranged so that the direction of the cook's line of sight can be photographed when the smart glasses 1 are worn; the cook takes a picture of a set of ingredients placed on a cutting board with camera 7, which is positioned next to the (back) surface of the cutting board; and the photographed image data is generated, with the contour coordinates of the foodstuff to be cut in the camera shooting area determined based on the image data generated by camera 7, paragraphs 16, 66). Katsu and Teppei are combinable because both are in the same field of endeavor, dealing with heating cooking time based on ingredients. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the application to combine the teachings of Katsu with the teachings of Teppei for the benefit of user convenience, such that an appropriate cooking time can be estimated according to the actual cooking situation, as taught by Teppei at paragraph 13. The combination of Katsu and Teppei fails to further teach that the sensor is attached to a back surface of a transparent measurement board and that the cut ingredient is cut on the transparent measurement board.
However, Suemasu teaches a sensor attached to a back surface of a measurement board, with the cut ingredient cut on the measurement board (the cooking board has pressure sensors attached to its back surface, and food is cut on the cutting board, paragraphs 103, 123, 232). Katsu and Teppei are combinable with Suemasu because all are in the same field of endeavor, dealing with providing cooking assistance for cooking food ingredients. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the application to combine the teachings of Katsu and Teppei with the teachings of Suemasu for the benefit of providing effective cooking assistance dependent upon the cut of the ingredients, as taught by Suemasu at paragraphs 7, 9.

The combination of Katsu, Teppei, and Suemasu fails to further teach that the measurement board is a transparent measurement board for cutting the food ingredient. However, Pryor teaches a camera attached next to a transparent measurement board (an LCD or other flat panel display can be used for the work board, preferably with a protective cover glass, and an apparatus 130 having an overhead camera is used to obtain information from the work board region, paragraph 81), with the cut ingredient cut on the transparent measurement board (food related information on a work surface on which food is cut, claim 2). Katsu, Teppei, and Suemasu are combinable with Pryor because all are in the same field of endeavor, dealing with cooking food in the kitchen. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the application to combine the teachings of Katsu, Teppei, and Suemasu with the teachings of Pryor for the benefit of improving the safety of operation of home systems by providing easier-to-see, easier-to-operate, and less distracting controls, as taught by Pryor at paragraph 7.

Claim 7 is rejected under 35 U.S.C.
103 as being unpatentable over Noda Katsu, JP 2004-084992, in view of Sakamoto Teppei et al., JP 2020-166557, further in view of Suemasu et al., US 2022/0354313, further in view of Pryor, US 2010/0182136, as applied to claim 4 above, and further in view of Kubotani et al., US 2018/0218219.

Regarding claim 7, the combination of Katsu, Teppei, Suemasu, and Pryor fails to explicitly teach wherein the processor unit is further configured to: acquire a gesture video from the image sensor; detect a gesture of a cook based on the gesture video; and execute processing corresponding to the gesture. However, Kubotani teaches these limitations (based on images captured by camera 200, i.e., a series of images of a plurality of recipes representing the cook holding a hand over a recipe, the device judges the recipe among the projected recipes over which the cook holds a hand, to determine the recipe for preparation to be performed by the cook among the registered recipes, such that second-information acquirer 103 may determine the recipe based on the cook's gesture, paragraph 61). Katsu, Teppei, Suemasu, and Pryor are combinable with Kubotani because all are in the same field of endeavor, dealing with cooking. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the application to combine the teachings of Katsu, Teppei, Suemasu, and Pryor with the teachings of Kubotani for the benefit of convenience, such that the effort required of a cook can be reduced, as taught by Kubotani at paragraph 5.

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
Vaupot et al., US 2022/0202236
Shigeshiro et al., JP 2020-153774
Ose et al., US 2008/0236404
Nelson et al., US 2022/0283135

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to PAWANDEEP DHINGRA, whose telephone number is (571) 270-1231. The examiner can normally be reached 9:00-5:00. Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Abderrahim Merouan, can be reached at (571) 270-5254. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center.
Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/PAWAN DHINGRA/
Examiner, Art Unit 2683

/ABDERRAHIM MEROUAN/
Supervisory Patent Examiner, Art Unit 2683

Prosecution Timeline

Jan 26, 2024
Application Filed
Dec 13, 2025
Non-Final Rejection — §101, §103
Feb 09, 2026
Response Filed
Mar 04, 2026
Final Rejection — §101, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12603963
INFORMATION PROCESSING APPARATUS, NON-TRANSITORY COMPUTER READABLE MEDIUM, AND METHOD
2y 5m to grant • Granted Apr 14, 2026

Patent 12592999
INFORMATION PROCESSING APPARATUS, DISTRIBUTED PROCESSING SYSTEM, NON-TRANSITORY COMPUTER READABLE MEDIUM STORING PROGRAM, AND DISTRIBUTED PROCESSING METHOD
2y 5m to grant • Granted Mar 31, 2026

Patent 12578908
INFORMATION PROCESSING DEVICE, INFORMATION PROCESSING SYSTEM, INFORMATION PROCESSING METHOD, AND NON-TRANSITORY COMPUTER READABLE MEDIUM FOR SETTING DEVICE NAME FOR NETWORK SERVICE FROM SETTING SCREEN
2y 5m to grant • Granted Mar 17, 2026

Patent 12562258
METHODS AND SYSTEMS FOR ASSESSING TREATMENT OF A DISEASE BASED ON LESION FEATURES
2y 5m to grant • Granted Feb 24, 2026

Patent 12548324
METHODS AND SYSTEMS FOR DYNAMICALLY MODIFYING METADATA WHILE SERVING IMAGES
2y 5m to grant • Granted Feb 10, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 60%
With Interview: 77% (+17.0%)
Median Time to Grant: 3y 4m
PTA Risk: Moderate

Based on 485 resolved cases by this examiner. Grant probability derived from career allow rate.
