Prosecution Insights
Last updated: April 19, 2026
Application No. 18/875,798

INFORMATION PROCESSING DEVICE, INFORMATION PROCESSING METHOD, AND PROGRAM

Non-Final OA: §101, §102, §103
Filed
Dec 17, 2024
Examiner
AHMED, MASUD
Art Unit
3657
Tech Center
3600 — Transportation & Electronic Commerce
Assignee
Sony Group Corporation
OA Round
1 (Non-Final)
82%
Grant Probability
Favorable
1-2
OA Rounds
2y 10m
To Grant
96%
With Interview

Examiner Intelligence

Grants 82% — above average
82%
Career Allow Rate
969 granted / 1178 resolved
+30.3% vs TC avg
Moderate +13% lift
+13.2%
Interview Lift
resolved cases with interview
Typical timeline
2y 10m
Avg Prosecution
27 currently pending
Career history
1205
Total Applications
across all art units

Statute-Specific Performance

§101
10.9%
-29.1% vs TC avg
§103
36.5%
-3.5% vs TC avg
§102
21.7%
-18.3% vs TC avg
§112
10.4%
-29.6% vs TC avg
Black line = Tech Center average estimate • Based on career data from 1178 resolved cases

Office Action

§101 §102 §103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows: Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claim 16 is rejected under 35 U.S.C. § 101 because the claimed invention is directed to non-statutory subject matter: the claim does not fall within at least one of the four categories of patent-eligible subject matter, for the following reasons. The claim recites “a program that causes an information processing device to execute information processing.” The claim does not recite that the program is embodied in a non-transitory computer-readable medium, stored in memory, or otherwise tied to a statutory category of invention. A “program,” standing alone, constitutes software per se or a set of instructions and does not fall within any of the four statutory categories of patent-eligible subject matter (process, machine, manufacture, or composition of matter). Because the claimed program is not recited as being embodied in a tangible medium or integrated into a statutory class of invention, it is directed to non-statutory subject matter. Accordingly, claim 16 is rejected under 35 U.S.C. § 101.

Claim Rejections - 35 USC § 102

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action: A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

Claims 1-3 and 6-16 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Koyama et al. (US 2021/0031378).

Regarding claims 1, 15, and 16: An information processing device comprising (para [0053]: “The autonomous moving body 10 according to an embodiment of the present disclosure is an information processor that estimates circumstances on the basis of collected sensor information and autonomously selects and executes various motions according to circumstances.”): an environment/body feature extraction unit that receives an input of a detection value of a sensor attached to a robot (para [0091]: “The input unit 110 has a function of collecting various types of information regarding a user and a surrounding environment… The input unit 110 includes various sensors illustrated in FIG. 1.” Para [0092]: “The recognition unit 120 has a function of performing various recognitions of the user, the surrounding environment, and the state of the autonomous moving body 10 on the basis of various types of information collected by the input unit 110.”), and extracts at least any one feature amount of an environment feature amount that is a feature amount related to an environment around the robot (para [0094]: “Further, the recognition unit 120 has a function of estimating and understanding the surrounding environment and circumstances in which the autonomous moving body 10 is located, on the basis of the above-mentioned recognized information.” Para [0065]: “A distance measuring sensor 535 has a function of acquiring circumstances of a floor surface of the front of the autonomous moving body 10.”), and a body feature amount that is a feature amount related to the robot itself (para [0093]: “In addition, the recognition unit 120 is able to recognize… a posture of the autonomous moving body 10, and the like.” Para [0069]: “An inertia sensor 555 is a six-axis sensor that detects physical amounts such as velocities, accelerations, and rotations of the head and the torso.”); and a behavior control unit that determines and executes a behavior to be executed by the robot based on the feature amount extracted by the environment/body feature extraction unit (para [0096]: “The action planning unit 140 has a function of planning an action to be performed by the autonomous moving body 10 on the basis of the circumstance estimated by the recognition unit 120 and the knowledge learned by the learning unit 130.” Para [0097]: “The operation control unit 150 has a function of controlling operations of the drive unit 160 and the output unit 170 on the basis of action planning performed by the action planning unit 140.”).
Regarding claim 2: The information processing device according to claim 1, wherein the information processing device causes the robot to execute a behavior for acquiring the detection value of the sensor, and executes active sensor detection value acquisition processing (para [0056]: “The autonomous moving body 10 according to an embodiment of the present disclosure comprehensively judges its own state, the surrounding environment, and the like similarly to animals including humans, to thereby determine and execute autonomous motions.” Para [0066]: “According to the touch sensor 540, it is possible to detect a contact action such as touching, stroking, tapping, or pushing by the user, thus making it possible to perform a motion corresponding to the contact action.”).

Regarding claim 3: The information processing device according to claim 1, wherein the environment/body feature extraction unit applies a machine learning model, and extracts the environment feature amount that is the feature amount related to the environment around the robot or the body feature amount that is the feature amount related to the robot itself based on the detection value of the sensor (para [0095]: “The learning unit 130 has a function of learning an environment (circumstance) and an action as well as an effect of the action on the environment. The learning unit 130 implements the learning described above using, for example, a machine learning algorithm such as deep learning.” Para [0096]: “The action planning unit 140 has a function of planning an action to be performed by the autonomous moving body 10 on the basis of the circumstance estimated by the recognition unit 120 and the knowledge learned by the learning unit 130.”).
Regarding claim 6: The information processing device according to claim 1, wherein the behavior control unit determines at least any one behavior of a movement behavior, a gripping behavior, an environment adaptation behavior, feeling expression, and a conduct of the robot based on the feature amount extracted by the environment/body feature extraction unit (para [0054]: “The autonomous moving body 10 … executes various motions dependent on the direction … such as turning a head or a line of sight … or running (moving).” Para [0081]: “The display 510 has a function of visually expressing movements of eyes and emotions of the autonomous moving body 10.”).

Regarding claim 7: The information processing device according to claim 1, wherein the behavior control unit determines a walking mode of the robot based on the feature amount extracted by the environment/body feature extraction unit (para [0055]: “The autonomous moving body 10 … may take an action such as moving to a location … depending on circumstances.” Para [0096]: “The action planning unit 140 has a function of planning an action to be performed … on the basis of the circumstance estimated.”).

Regarding claim 8: The information processing device according to claim 1, wherein the behavior control unit determines at least any one of a walking speed and a stride of the robot based on the feature amount extracted by the environment/body feature extraction unit (para [0097]: “The operation control unit 150 performs rotational control of the actuators 570 … on the basis of the above-mentioned action plan.”).
Regarding claim 9: The information processing device according to claim 1, wherein the behavior control unit determines an action that the robot needs to execute, based on the feature amount extracted by the environment/body feature extraction unit, and causes the robot to execute the determined action (para [0096]: “The action planning unit 140 has a function of planning an action to be performed by the autonomous moving body 10.” Para [0097]: “The operation control unit 150 has a function of controlling operations of the drive unit 160 and the output unit 170.”).

Regarding claim 10: The information processing device according to claim 1, wherein the behavior control unit executes processing of determining a facial expression of the robot based on the feature amount extracted by the environment/body feature extraction unit, and causing the robot to change the facial expression thereof into the determined facial expression (para [0081]: “The display 510 has a function of visually expressing movements of eyes and emotions of the autonomous moving body 10.” Para [0099]: “The output unit 170 has a function of outputting visual information … under the control of the operation control unit 150.”).

Regarding claim 11: The information processing device according to claim 1, wherein the behavior control unit includes a behavior determination unit that determines a behavior to be executed by the robot based on the feature amount extracted by the environment/body feature extraction unit, and a behavior execution unit that causes the robot to execute the behavior determined by the behavior determination unit (para [0096]: “The action planning unit 140 has a function of planning an action to be performed.” Para [0097]: “The operation control unit 150 has a function of controlling operations … on the basis of action planning.”).
Regarding claim 12: The information processing device according to claim 11, wherein the behavior control unit further includes a behavior switching control unit that performs behavior switching control when the behavior determined by the behavior determination unit is different from a behavior currently executed by the robot (para [0056]: “The autonomous moving body 10 … determines and executes autonomous motions that are presumed to be optimal for each circumstance.”).

Regarding claim 13: The information processing device according to claim 1, further comprising a sensing database that holds the detection value of the sensor, wherein the environment/body feature extraction unit compares a sensor detection value at a current point of time and a past sensor detection value stored in the sensing database, and determines whether the environment around the robot is a previously experienced environment or a new environment (para [0110]: “The evaluation result holding section 350 holds results of the reliability evaluation … The results … are used for the action planning.”).
Regarding claim 14: The information processing device according to claim 1, further comprising an environment/body feature amount database that stores at least any one of the environment feature amount and the body feature amount extracted by the environment/body feature extraction unit, wherein the behavior control unit compares an environment feature amount or a body feature amount at a current point of time, and a past environment feature amount or body feature amount stored in the environment/body feature amount database, and detects at least any one change of a change of the environment around the robot and a change of the robot itself based on a comparison result, and determines and executes an optimal behavior matching the detected change (para [0115]: “The accumulation unit 210 collects motion history and evaluation achievements … and accumulates such information.” Para [0117]: “It is possible … to achieve a more flexible and efficient action plan.”).

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claim 4 is rejected under 35 U.S.C. 103 as being unpatentable over Koyama et al. (US 2021/0031378) in view of KR20150006592A (hereinafter KR’592).

Regarding claim 4: The information processing device according to claim 1, wherein the environment/body feature extraction unit extracts as the environment feature amount a feature amount of at least any one of material quality, hardness, and slipperiness of a floor surface on which the robot exists (para [0065]: “A distance measuring sensor 535 has a function of acquiring circumstances of a floor surface of the front of the autonomous moving body 10.” Para [0068]: “According to the sole button 550, it is possible to detect contact or non-contact between the autonomous moving body 10 and the floor surface.”). Koyama, however, is not explicit as to floor type or condition, such as slipperiness. KR’592 teaches extracting a floor-surface characteristic by estimating friction: it discloses a mobile robot including a unit “estimating the friction coefficient between the wheel and the ground” (para [0004]), and a friction coefficient is a quantitative indicator of surface slipperiness. It would therefore have been obvious to modify Koyama to further extract a floor-surface feature corresponding to slipperiness (e.g., a friction coefficient) as taught by KR’592, because both references concern mobile-robot locomotion and environmental sensing, and determining friction would predictably improve motion planning on different floor conditions.

Claim 5 is rejected under 35 U.S.C. 103 as being unpatentable over Koyama et al. (US 2021/0031378) in view of Swarup et al. (US 8224484).
Regarding claim 5: The information processing device according to claim 1, wherein the environment/body feature extraction unit extracts as the body feature amount a feature amount indicating whether or not an accessory is equipped by the robot (para [0093]: “In addition, the recognition unit 120 is able to recognize … a posture of the autonomous moving body 10, and the like.” Para [0070]: “The autonomous moving body 10 may further include, aside from the above configuration, various communication devices including, for example, a temperature sensor, a geomagnetic sensor, or a GNSS signal receiver.”). Koyama, however, is not explicit on detecting an object or load attached to the robot. Swarup teaches detecting whether a tool (accessory) is attached to a robotic system: US 8224484 discloses that “The one or more integrated circuits 426 may be used to identify the type of robotic surgical tool coupled to the robotic arm” (col. 7, lines 47-50), and further teaches that “Each surgical tool includes an identification device that communicates information to the robotic system when coupled.” Thus, Swarup expressly teaches determining whether an accessory is equipped. It would therefore have been obvious to modify Koyama’s robot to additionally extract a body feature indicating whether an accessory is equipped, as taught by Swarup, because knowledge of attached tools directly affects robot operation and control, and incorporating accessory detection into Koyama’s sensor-based circumstance estimation would have been a predictable improvement.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to MASUD AHMED, whose telephone number is (571) 270-1315. The examiner can normally be reached M-F 9:00-8:30 PM PST. Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool.
To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Abby Lin, can be reached at (571) 270-3976. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/MASUD AHMED/
Primary Examiner, Art Unit 3657

Prosecution Timeline

Dec 17, 2024
Application Filed
Feb 05, 2026
Non-Final Rejection — §101, §102, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12596012
METHOD FOR DETERMINING POINT OF INTEREST FOR USER, ELECTRONIC DEVICE AND STORAGE MEDIUM
2y 5m to grant • Granted Apr 07, 2026
Patent 12589729
LOAD BALANCING APPROACH TO EXECUTE COST OPTIMIZATION IN MULTI-MODE AND MULTI-GEAR HYBRID ELECTRIC VEHICLES
2y 5m to grant • Granted Mar 31, 2026
Patent 12589777
VEHICLE
2y 5m to grant • Granted Mar 31, 2026
Patent 12578723
VEHICLE
2y 5m to grant • Granted Mar 17, 2026
Patent 12578739
Vehicle Control System
2y 5m to grant • Granted Mar 17, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

1-2
Expected OA Rounds
82%
Grant Probability
96%
With Interview (+13.2%)
2y 10m
Median Time to Grant
Low
PTA Risk
Based on 1178 resolved cases by this examiner. Grant probability derived from career allow rate.
