DETAILED ACTION
Notice of Pre-AIA or AIA Status
1. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. It is responsive to the submission dated 12/22/2025. Claims 1-10 and 12-13 are presented for examination.
Response to Arguments
2. Applicant’s arguments, see page 1, filed 12/22/2025, with respect to the rejection(s) of claim(s) 1-3, 10, and 12-13 under 35 U.S.C. 102(a)(1) have been fully considered and are persuasive. Therefore, the rejection has been withdrawn. However, upon further consideration, a new ground(s) of rejection is made in view of Sumi et al. (JP 2025071357 A).
Claim Rejections - 35 USC § 103
3. The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
4. Claim(s) 1-7, 10, and 12-13 are rejected under 35 U.S.C. 103 as being unpatentable over Jorasch et al. (US 20240073322) in view of Sumi et al. (JP 2025071357 A).
Considering claim 1, Jorasch discloses a non-transitory computer readable medium (215, fig. 2 and/or item 110, fig. 1) storing a program (225, fig. 2) causing a computer (100/110 fig. 1) to execute a process for information processing for a room in which metadata is transmitted and received by a plurality of terminal apparatuses (e.g., the central controller 110 may comprise an electronic and/or computerized controller device, such as a computer server and/or server cluster communicatively coupled to interface with the resource devices 102a-n and/or the user devices 106a-n located at one or more various sites and/or locations; see para. 88. The central controller 110 may store and/or execute specially programmed instructions … and/or routines that facilitate the analysis of meetings …. (i) determine meeting configurations consistent with requirements for a meeting, ….. (vi) facilitate messaging to and between peripheral devices, …. (ix) provide an interface via which a resource and/or a customer (or other user) may view and/or manage meetings, see para. 89. The user device 106a manages the various peripheral devices associated with one or more users, facilitating communication between them and passing information back to the user device. See para. 97), the process comprising:
causing at least one of the plurality of terminal apparatuses (e.g., user devices 106a-n) to present states of a plurality of participants in the room in accordance with a shared rule determined in advance, the states being deduced from responses of the plurality of participants (e.g., The central controller 110 may include one or more servers located at the headquarters of a company, a set of distributed servers at multiple locations throughout the company, ….and may be a central point of processing, taking input from one or more of the devices herein, such as a user device or peripheral device. … Communications with the central controller could include user devices,… conference room control systems, video communication networks, remote learning communication networks, …, corporate data systems, etc. In various embodiments, the central controller may include hardware and software that interfaces with user devices and/or peripheral devices in order to facilitate communications. The central controller may collect analytics from devices… for the purpose of enhancing the experience of a user. See paras. 124-125. In various embodiments, the central controller may include software for providing notifications and/or status updates. The central controller may notify a user when one or more other users is present (e.g., at their respective office locations, e.g., at their respective home computers), when another user wishes to communicate with the user, when a collaborative project has been updated, when the user has been mentioned in a comment, when the user has been assigned work, when the user's productivity has fallen, when the user has been invited to play in a game, or in any other circumstance. Notifications or status updates may be sent to peripheral devices, user devices, smartphones, or to any other devices. See para. 127. In various embodiments, the central controller may include voting software. 
The voting software may facilitate voting, decision-making, or other joint or group action. Example votes may determine a plan of action at a company, or a strategy in a team video game. Voting software may permit users or other participants to receive notification of votes, receive background information about decisions or actions they are voting on, cast their votes, and see the results of votes. Voting software may be capable of instituting various protocols, such as multiple rounds of runoffs, win by the majority, win by the plurality, win by unanimous decision, anonymous voting, public voting, secure voting, differentially weighted votes, voting for slates of decisions, or any other voting protocol, or any other voting format. Voting results may be stored in data storage device 615, or sent to other devices for storage. See para. 128. In various embodiments, a game controller may be part of the central controller 110 … and can communicate with a user device and one or more computer peripherals. In various embodiments, a game controller may perform such functions as maintaining a game state, updating a game state based on user inputs and game rules, creating a rendering of a game state, facilitating chat or other communication between players of a game,… showing in-game advertisements, etc… see para. 129. Jorasch further teaches the peripheral configuration table 1100 may store one or more parameters controlling how a peripheral device outputs information. A parameter might include the color of an LED light, the brightness of an LED light, the volume at which audio is output, the brightness of a display screen, the color balance of a display screen, or any other parameter of an output. See paras. 151-152. Current setting field 1112 may store the current setting of a parameter for a peripheral device. In other words, if the user were to use the peripheral device at that moment, this would be the setting in effect. See para. 155.
Settings field 1308 may include one or more settings or guidelines or rules by which peripheral devices within a group may interact with one another and… may govern communication between the devices. For example, one setting may permit device-to-device messaging amongst any peripheral devices within the group…. or may permit all peripheral devices in a group to interact with a particular online video game. See para. 161. Referring to FIG. 14, a diagram of an example user connections table 1400 may store connections between users that may include “co-worker” connections as during a video conference call,…, “tagging” connections which represent users who often send or receive tags from each other, etc. In various embodiments, table 1400 may include connections that have been inferred or deduced and were not explicitly requested by the users. For example, the central controller may deduce that two users are members of the same company, because they are each members of the same company as is a third user. Connection ID field 1402 may include an identifier (e.g., a unique identifier) that identifies the connection between two users. User 1 ID field 1404 may identify a first user that is part of a connection. User 2 ID field 1406 may identify a second user that is part of a connection. See para. 162. Moreover, Jorasch discloses front view 4205 includes a forward facing camera 4222 at the front of presentation remote 4200 which may capture an image/video of one or more meeting attendees, capture an image of the setup of a room, capture facial expressions of attendees/users. A projector 4276 and laser pointer 4278 may also be positioned on presentation remote 4200 and is capable of displaying different colors, may flash in order to get the attention of the meeting participants, and may display a variety of icons or symbols. Presentation remote 4200 may include one or more physical buttons and/or one or more virtual buttons (e.g.
small displays that can register touch input from a user). Selection button 4232 may allow a user to select from various options (e.g. a list of presentation files, names of meeting participants, tag information) presented on display screen 4246. See paras. 253-254. Display 4246 may allow for messaging and displaying options to a user terminal. In various embodiments, display 4246 faces towards a prospective user. This may allow a user to view graphical information that is displayed by presentation remote 4200, such as messages (e.g. meeting participants want to take a break, one meeting participant has not returned from a break, tags received from meeting participants, aggregate tag information about a meeting). In some embodiments, display 4246 is touch enabled so that options (e.g. list of participants in the room, list of questions that participants have asked) on display 4246 may be selected by a user of the terminal touching them. A secondary display 4248 allows for additional information to be provided to the user, such as by displaying questions that have been received by an audience or meeting participants. See para. 259. Presentation remote 4200 may also include optical fibers 4272a and 4272b that are thin strands of diffusing optical fiber where a light source, such as a laser, LED light, or other source is applied at one end and emitted continuously along the length of the fiber. As a consequence, the entire fiber may appear to light up. Furthermore, light sources of different or time varying colors may be applied to the end of the optical fiber. As a result, optical fibers present an opportunity to display information such as a current state (e.g., red when a presentation is expected to exceed a meeting end time), or provide diverse and/or visually entertaining lighting configurations. See para. 261. 
Signaling lights 4294a, 4294b, and 4294c which may be directed by presentation remote 4200 to light up (in many colors) in order to communicate to meeting presenter and/or participants. In various embodiments, signaling lights 4294a-c may also be under the control of the user, allowing a user to provide visual feedback to a presenter or to other participants in a meeting. In some embodiments, colors indicated via signaling lights 4294a-c may indicate that two participants are in alignment, that a participant would like to speak, that a participant is not clear about something, that a participant has a candid observation that they would like to make, etc. In some embodiments, input buttons 4298a, 4289b, and 4298c allow users to provide information (e.g. voting, ratings, tags, selections from options, questions, identifications or other participants, to presentation remote 4200. See para. 265). See also paras. 309-315, 325-327.
Although Jorasch discloses substantial features of the claimed invention, Jorasch fails to particularly teach causing the at least one of the plurality of terminal apparatuses to not present images or video images representing the plurality of participants when the number of the plurality of participants exceeds a predetermined value, which is disclosed by Sumi (see abstract).
Particularly, Sumi discloses a communication device and a display method capable of enabling smooth communication with the people of a plurality of other terminals, even without displaying camera videos, when the number of people reaches a predetermined number or more. See abstract, paras. 5-6 and claim 1 of Sumi. In addition, paragraphs 25-28 and 64 of Sumi disclose using a video communication device to restrict video display to certain participants during a web conference when the total number of prescribed participants is determined to be equal to or greater than a predetermined number.
Accordingly, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have modified the teachings of Jorasch to include causing the at least one of the plurality of terminal apparatuses to not present images or video images representing the plurality of participants when the number of the plurality of participants exceeds a predetermined value, in the same conventional manner as taught by Sumi. The motivation to combine would have been to limit the number of attendees of the web conference to only a total prescribed number of interested participants, thereby smoothing the communications between the host of the conference and the interested participants at the plurality of terminals. See paras. 5-7 of Sumi.
As per claim 2, Jorasch, as modified by Sumi, discloses that the shared rule requires each of the states to be associated with a color, a symbol (e.g., a tag or identifier), or a sound effect (e.g., voice of the user, voice of the presenter, room sounds, participant sounds), and that the process comprises providing the at least one of the plurality of terminal apparatuses with the color, the symbol, or the sound effect that is associated with each of the states. See paras. 258-265, 309-314, 325-327, and 344-346 in view of paras. 180-184 and 251-253.
As per claim 3, Jorasch discloses that the shared rule requires each of the states to be associated with a color, and that the process comprises presenting a background of an image or a video image by using the color, the image or the video image representing one of the plurality of participants. See paras. 344-345.
As per claims 4-7, Jorasch fails to teach a shared state rule process, associated with a symbol, that requires the number of occurrences of each of the states to be counted, and presenting an image or a video image with a symbol superimposed onto the image or the video image, the image or the video image representing one of the plurality of participants; and also causing the at least one of the plurality of terminal apparatuses to present the states of the plurality of participants in accordance with the number of occurrences of each of the states, wherein the counted number of occurrences of the at least one of the states is largest or exceeds a predetermined value.
However, Sumi discloses, upon detecting and receiving information from other video communication devices indicating that a ratio of predetermined operations, or a number of actions, expressions, and/or reactions of persons at the plurality of other terminals, is equal to or greater than a prescribed number, causing a display control unit 20 to display various images or videos of the user of the video communication device 10 and of the users of other video communication devices participating in the WEB conference. Further, when the number of participants in the WEB conference and the number of persons at the plurality of other terminals showing actions or expressions exceed a threshold value, Sumi discloses replacing the video displays of these persons with an icon. See abstract, paras. 5-7 and 25-28 of Sumi. Sumi further teaches that the information indicating actions or facial expressions of the persons at the other video communication devices, and of each participant of the web conference, is presented according to modes set to "video on mode," "video restricted mode," and "video off mode." Other modes can be set based on other various information, such as names and participant names. And the icon generating device generates various icons, each reflecting the actions or expressions exceeding the predetermined threshold value. See abstract and paras. 28-34, including the detailed descriptions of figs. 2 to 4 of Sumi.
Accordingly, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have modified the teachings of Jorasch to include a rule process that requires presenting an image or a video image with a symbol superimposed onto the image or the video image, the image or the video image representing one of the plurality of participants, in accordance with the number of occurrences of each of the states and according to the counted number of occurrences of the at least one of the states being largest or exceeding a predetermined value, in the same conventional manner as taught by Sumi. The motivation to combine would have been to limit the number of attendees of the web conference to only a total prescribed number of participants, from among the people at the plurality of terminals, who show interest in the web conference. See paras. 5-7 of Sumi.
As per claim 10, Jorasch discloses that the process comprises causing the at least one of the plurality of terminal apparatuses to present identification information of one of the plurality of participants when a state of the one of the plurality of participants has changed to a predetermined specific state (e.g., headset 4000 may facilitate the ability to include a checklist with criteria that can be verified by eye gaze/head/body orientation. In some embodiments there may be situations where assembly line workers are needed to visually inspect items for quality control. The employee with a headset 4000 and forward facing cameras 4022a-b may inspect the automobiles coming off the assembly line. Accelerometers 4070a-b may be used to monitor eye gaze time and head movements to validate that a user is actually looking at the exterior of the automobile for defects and not in other locations. If the camera or accelerometer detects the user gazing in a direction other than the automobile, vibration from vibration generator 4080 may occur to alert the user to pay attention, …. or the display 4046 may show a message to the worker to look in the direction of the automobile. Boom lights 4044 may also blink in red to alert the worker to pay attention. See para. 234).
The subject matter of independent claims 12 and 13 corresponds, in terms of a system and a method, respectively, to that of independent computer readable medium claim 1. As such, the rationale raised above to reject the latter also applies, mutatis mutandis, to the former.
Allowable Subject Matter
5. Claims 8-9 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims, because the prior art of record fails to teach the non-transitory computer readable medium according to claim 5, wherein the shared rule requires each of the states to be associated with a color and requires the number of occurrences of each of the states to be counted, and the process comprises increasing or decreasing brightness or saturation of the color for each hue as the number of occurrences of the state associated with the color increases and providing the at least one of the plurality of terminal apparatuses with the color (as recited in claim 8); and
wherein the shared rule requires each of the states to be associated with a sound effect and requires the number of occurrences of each of the states to be counted, and
the process comprises increasing a volume level of the sound effect, increasing a length of the sound effect, or increasing the number of repetitions of the sound effect as the number of occurrences of the state associated with the sound effect increases, and providing the at least one of the plurality of terminal apparatuses with the sound effect (as recited in claim 9).
Conclusion
6. The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
Chikyu et al. (US 20220256117) discloses an information processing apparatus including a processor configured to: allow, in relation to a conference via a communication line in which plural participants participate, plural business card images to be displayable on a terminal of a participant among the plural participants, the plural business card images being images that are obtained by reading business cards of the plural participants including at least one other participant.
7. Any inquiry concerning this communication or earlier communications from the examiner should be directed to WESNER SAJOUS whose telephone number is (571) 272-7791. The examiner can normally be reached on M-F 10:00 to 7:30 (ET).
Examiner interviews are available via telephone and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice or email the Examiner directly at wesner.sajous@uspto.gov.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Said Broome, can be reached at 571-272-2931. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/WESNER SAJOUS/Primary Examiner, Art Unit 2612
WS
03/09/2026