DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Continued Examination Under 37 CFR 1.114
A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on December 10, 2025 has been entered.
Response to Amendment
In response to the amendment filed December 10, 2025, claims 1 and 11-12 are amended. No claims are cancelled and no new claims are added. Claims 6-10 remain withdrawn from consideration as being directed to a non-elected invention, as specified in a previous Office action. Claims 1-5 and 11-12 are pending.
Response to Arguments
Applicant’s arguments, see Remarks, filed December 10, 2025, with respect to the rejection(s) of claim(s) 1-5 and 11-12 under 35 U.S.C. 103 have been fully considered and are persuasive in view of the amendments. Therefore, the rejection has been withdrawn. However, upon further consideration, a new ground(s) of rejection is made in view of Jovanov (US 20140330172 A1).
Claim Rejections - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claims 1-5 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.
Claim 1 recites “determine whether a posture of the user satisfies a reference condition that is predetermined and represents a state of proper posture based on the information on the movement of the user acquired from the posture determination sensor unit”. A lack of clarity arises because, as recited earlier in the claim, the information on the movement of the user is acquired from the posture acquisition unit, whereas the posture determination sensor unit is configured to sense the movement of the user. This inconsistency renders the scope of the claim unclear. For purposes of examination, the limitation will be interpreted as referring to the movement of the user acquired from the posture acquisition unit.
Claims 2-5 are rejected by virtue of dependence on claim 1 and because they inherit and do not remedy the deficiencies of claim 1.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
This application currently names joint inventors. In considering patentability of the claims, the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.
Claims 1-5 and 11-12 are rejected under 35 U.S.C. 103 as being unpatentable over Jovanov (US 20140330172 A1) in view of Koeppl (US 20180173943 A1; previously cited by applicant).
With respect to claim 1, Jovanov discloses a posture improvement assistance device (see paragraph 0017, system #100 for measuring posture change) comprising:
a posture determination sensor unit (see paragraph 0024 and Fig. 2, subject wears mobile computing device #125 that includes sensors such as gyroscope #222, accelerometer #232, magnetometer #242 and biometric sensor #252) comprising a motion detection sensor (see paragraphs 0043-0044 and paragraphs 0020-0024, #125 includes sensors capable of detecting movement), the motion detection sensor comprising an acceleration sensor and an angular velocity sensor configured to sense movement of the user (see paragraphs 0043-0044 and 0020-0024 and Fig. 2, the sensors include an accelerometer #232 and based on gyroscopic measures the test logic #255 can compute angular velocity);
a posture acquisition device configured to acquire information of a movement of the user who is looking at an image displayed on a display of an electronic device (see paragraph 0021 and Fig. 2, system #100 includes memory #250 that acquires and stores data #253 including mobility data; and see paragraph 0023, input interface #212 and output interface #214 are both implemented via a touch screen capable of displaying images to a user while receiving touch-based inputs from the user) or is listening to sound output from an audio output unit of an electronic device from the acceleration sensor and the angular velocity sensor, the audio output unit comprising at least one of a speaker, an earphone and a headphone; and
a processor (see paragraph 0022 and Fig. 2) configured to: […]
Although Jovanov discloses utilizing a reference condition (see paragraph 0040), Jovanov does not specifically disclose: determine whether a posture of a user satisfies a reference condition that is predetermined and represents a state of proper posture based on the information on the movement of the user acquired from the posture determination sensor unit; and cause the display to display a reference image on the display or the audio output unit to output a reference sound when the posture of a user is determined to satisfy the reference condition; and cause the display to display a difference image being changed from the reference image on the display or the audio output unit to output a difference sound being changed from the reference sound when the posture of the user is not determined to satisfy the reference condition.
Koeppl teaches determine whether a posture of a user satisfies a reference condition that is predetermined and represents a state of proper posture based on the information on the movement of the user acquired from the posture determination sensor unit (see the 112(b) rejection above; and see paragraphs 0032-0033, 0043-0045 and 0050, processing unit #12 determines posture of person from recorded image of person and compares it to a target position where the difference is compared to a threshold; and see paragraphs 0043-0045, the position comparer #25 compares determined position of the person to a target position and displays it); and cause the display to display a reference image on the display or the audio output unit to output a reference sound when the posture of a user is determined to satisfy the reference condition (see paragraphs 0032-0033, 0043-0045 and 0050, processing unit #12 determines posture of person from recorded image of person and compares it to a target position where the difference is compared to a threshold; and see paragraphs 0043-0045, the position comparer #25 compares determined position of the person to a target position and displays it), and cause the display to display a difference image being changed from the reference image on the display or the audio output unit to output a difference sound being changed from the reference sound when the posture of the user is not determined to satisfy the reference condition (see paragraphs 0032-0033, 0043-0045 and 0050, as long as the target position is not attained the positioning continues; further see MPEP 2111.04 II “Contingent Limitations”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Jovanov with the teachings of Koeppl to determine whether a posture satisfies a reference condition representing proper posture and to display a reference image when the posture is proper, because doing so would yield the predictable result of quickly positioning a user in a correct posture via visualization on a display without requiring the involvement of a second person (Koeppl: see [0006], [0033]).
With respect to claim 2, all limitations of claim 1 apply, and Koeppl further teaches wherein when the posture of the user is determined to satisfy the reference condition, the processor is further configured to control position of the reference image displayed on the display as a reference position (see paragraphs 0032-0033, 0043-0045 and 0050, processing unit #12 determines posture of person from recorded image of person and compares it to a target position where the difference is compared to a threshold; and see paragraphs 0043-0045, the position comparer #25 compares determined position of the person to a target position and displays it), and when the posture of the user is not determined to satisfy the reference condition, change the position of the image to a position different from the reference position (see paragraphs 0032-0033, 0043-0045 and 0050, as long as the target position is not attained the positioning continues; further see MPEP 2111.04 II “Contingent Limitations”).
With respect to claim 3, all limitations of claim 1 apply, and Koeppl further teaches wherein when the posture of the user is determined to satisfy the reference condition, the processor is further configured to control the size of the reference image displayed on the display as a reference size (see paragraphs 0032-0033, 0043-0045 and 0050, processing unit #12 determines posture of person from recorded image of person and compares it to a target position as a size seen in Fig. 1 where the difference is compared to a threshold; and see paragraphs 0043-0045, the position comparer #25 compares determined position of the person to a target position and displays it as seen in Fig. 1 as an image size), and when the posture of the user is not determined to satisfy the reference condition, change the size of the reference image to a size different from the reference size (see paragraphs 0032-0033, 0043-0045 and 0050, as long as the target position is not attained the positioning continues; further see MPEP 2111.04 II “Contingent Limitations”).
With respect to claim 4, all limitations of claim 1 apply, and Koeppl further teaches wherein when the posture of the user is determined to satisfy the reference condition, the processor is further configured to control posture of the reference image displayed on the display unit as a reference posture (see paragraphs 0032-0033, 0043-0045 and 0050, processing unit #12 determines posture of person from recorded image of person and compares it to a target position where the difference is compared to a threshold; and see paragraphs 0043-0045, the position comparer #25 compares determined position/posture of the person to a target position/posture and displays it), and when the posture of the user is not determined to satisfy the reference condition, change the posture of the reference image to a posture rotated from the reference posture (see paragraphs 0032-0033, 0043-0045 and 0050, as long as the target position is not attained the positioning continues; further see MPEP 2111.04 II “Contingent Limitations”).
With respect to claim 5, all limitations of claim 1 apply, and Koeppl further teaches wherein when the posture of the user is determined to satisfy the reference condition, the processor is further configured to control shape of the reference image displayed on the display unit as a reference shape (see paragraphs 0032-0033, 0043-0045 and 0050, processing unit #12 determines posture of person from recorded image of person and compares it to a target shape as a size seen in Fig. 1 where the difference is compared to a threshold; and see paragraphs 0043-0045, the position comparer #25 compares determined position of the person to a target position and displays it as seen in Fig. 1 as an image shape), and when the posture of the user is not determined to satisfy the reference condition, change the shape of the reference image to a shape different from the reference shape (see paragraphs 0032-0033, 0043-0045 and 0050, as long as the target position is not attained the positioning continues; further see MPEP 2111.04 II “Contingent Limitations”).
With respect to claim 11, Jovanov discloses a posture improvement assistance method (see paragraphs 0016-0017, system #100 and method for measuring posture change and quantifying mobility of a person performing a mobility test) comprising:
a posture acquisition step acquiring information on a movement of a user who is looking at an image displayed on a display of an electronic device (see paragraph 0024 and Fig. 2, subject wears mobile computing device #125 that includes sensors such as gyroscope #222, accelerometer #232, magnetometer #242 and biometric sensor #252; see paragraphs 0043-0044 and paragraphs 0020-0024, #125 includes sensors capable of detecting movement; and see paragraph 0021 and Fig. 2, system #100 includes memory #250 that acquires and stores data #253 including mobility data; and see paragraph 0023, input interface #212 and output interface #214 are both implemented via a touch screen capable of displaying images to a user while receiving touch-based inputs from the user) or is listening to sound output from an audio output unit of an electronic device, the audio output unit comprising at least one of a speaker, an earphone and a headphone by using an acceleration sensor and an angular velocity sensor attached to the user (see paragraphs 0043-0044 and 0020-0024 and Fig. 2, the sensors include an accelerometer #232 and based on gyroscopic measures the test logic #255 can compute angular velocity) […]; and […].
Although Jovanov discloses utilizing a reference condition (see paragraph 0040), Jovanov does not specifically disclose: to determine whether a posture of the user satisfies a reference condition that is predetermined and represents a state of proper posture; a control step, when the posture of the user satisfies the condition that is predetermined and represents the state of proper posture, controlling a display condition of the image displayed on the display or an output condition of the sound being output from the audio output unit to be in a reference state, and when the posture of the user does not satisfy the condition, controlling the display condition or the output condition to be in a state that is changed with respect to the reference state.
Koeppl teaches determining whether a posture of the user satisfies a reference condition that is predetermined and represents a state of proper posture (see paragraphs 0032-0033, 0043-0045 and 0050, processing unit #12 determines posture of person from recorded image of person and compares it to a target position where the difference is compared to a threshold; and see paragraph 0043-0045, the position comparer #25 compares determined position of the person to a target position and displays it); and a control step, when the posture of the user satisfies the condition that is predetermined and represents the state of proper posture, controlling a display condition of the image displayed on the display (see paragraphs 0032-0033, 0043-0045 and 0050, processing unit #12 determines posture of person from recorded image of person and compares it to a target position where the difference is compared to a threshold; and see paragraph 0043-0045, the position comparer #25 compares determined position of the person to a target position and displays it; and see paragraph 0050, determined position is compared to target posture/position where a difference is determined and displayed) or an output condition of the sound being output from the audio output unit to be in a reference state, and when the posture of the user does not satisfy the condition, controlling the display condition or the output condition to be in a state that is changed with respect to the reference state (see paragraphs 0032-0033, 0043-0045 and 0050, as long as the target position is not attained the positioning continues; further see MPEP 2111.04 II “Contingent Limitations”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Jovanov with the teachings of Koeppl to determine whether a posture satisfies a reference condition representing proper posture and to display a reference image when the posture is proper, because doing so would yield the predictable result of quickly positioning a user in a correct posture via visualization on a display without requiring the involvement of a second person (Koeppl: see [0006], [0033]).
With respect to claim 12, Jovanov discloses a non-transitory storage medium storing a posture improvement assistance program (see paragraphs 0016-0017, system #100 and method for measuring posture change and quantifying mobility of a person performing a mobility test; and see paragraphs 0021-0022, memory storing instructions which are executed by processing element), the program causing a computer to execute:
a posture acquisition step acquiring information on a movement of a user who is looking at an image displayed on a display of an electronic device (see paragraph 0024 and Fig. 2, subject wears mobile computing device #125 that includes sensors such as gyroscope #222, accelerometer #232, magnetometer #242 and biometric sensor #252; see paragraphs 0043-0044 and paragraphs 0020-0024, #125 includes sensors capable of detecting movement; and see paragraph 0021 and Fig. 2, system #100 includes memory #250 that acquires and stores data #253 including mobility data; and see paragraph 0023, input interface #212 and output interface #214 are both implemented via a touch screen capable of displaying images to a user while receiving touch-based inputs from the user) or is listening to sound output from an audio output unit of an electronic device, the audio output unit comprising at least one of a speaker, an earphone and a headphone by using an acceleration sensor and an angular velocity sensor attached to the user (see paragraphs 0043-0044 and 0020-0024 and Fig. 2, the sensors include an accelerometer #232 and based on gyroscopic measures the test logic #255 can compute angular velocity) […]; and […].
Although Jovanov discloses utilizing a reference condition (see paragraph 0040), Jovanov does not specifically disclose: to determine whether a posture of the user satisfies a reference condition that is predetermined and represents a state of proper posture; a control step, when the posture of the user satisfies the condition that is predetermined and represents the state of proper posture, controlling a display condition of the image displayed on the display or an output condition of the sound being output from the audio output unit to be in a reference state, and when the posture of the user does not satisfy the condition, controlling the display condition or the output condition to be in a state that is changed with respect to the reference state.
Koeppl teaches determining whether a posture of the user satisfies a reference condition that is predetermined and represents a state of proper posture (see paragraphs 0032-0033, 0043-0045 and 0050, processing unit #12 determines posture of person from recorded image of person and compares it to a target position where the difference is compared to a threshold; and see paragraph 0043-0045, the position comparer #25 compares determined position of the person to a target position and displays it); and a control step, when the posture of the user satisfies the condition that is predetermined and represents the state of proper posture, controlling a display condition of the image displayed on the display (see paragraphs 0032-0033, 0043-0045 and 0050, processing unit #12 determines posture of person from recorded image of person and compares it to a target position where the difference is compared to a threshold; and see paragraph 0043-0045, the position comparer #25 compares determined position of the person to a target position and displays it; and see paragraph 0050, determined position is compared to target posture/position where a difference is determined and displayed) or an output condition of the sound being output from the audio output unit to be in a reference state, and when the posture of the user does not satisfy the condition, controlling the display condition or the output condition to be in a state that is changed with respect to the reference state (see paragraphs 0032-0033, 0043-0045 and 0050, as long as the target position is not attained the positioning continues; further see MPEP 2111.04 II “Contingent Limitations”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Jovanov with the teachings of Koeppl to determine whether a posture satisfies a reference condition representing proper posture and to display a reference image when the posture is proper, because doing so would yield the predictable result of quickly positioning a user in a correct posture via visualization on a display without requiring the involvement of a second person (Koeppl: see [0006], [0033]).
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to NIDHI PATEL whose telephone number is (571)272-2379. The examiner can normally be reached Monday through Friday, 9 AM to 5 PM.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Jennifer Robertson can be reached at (571) 272-5001. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/N.N.P./Examiner, Art Unit 3791
/ERIC J MESSERSMITH/Primary Examiner, Art Unit 3791