
CN115604543B - Automatically generate enhanced activity and event summaries for game sessions


Info

Publication number: CN115604543B
Application number: CN202210724542.8A
Authority: CN (China)
Prior art keywords: game, event, events, metadata, data
Legal status: Active (granted)
Other languages: Chinese (zh)
Other versions: CN115604543A
Inventors: J. L. van Welzen, G. R. Cochran, J. Wang, P.-S. Wang, J. D. Weintraub, T. Sharma, S. Raikar
Current and original assignee: Nvidia Corp
Priority claimed from US 17/359,961 (US11617951B2)
Application filed by Nvidia Corp
Publication of CN115604543A (application); publication of CN115604543B (grant)

Abstract

The present disclosure relates to automatically generating enhanced activity and event summaries for game sessions. A game summary is generated using an event log of in-game events and corresponding game content, based on game data associated with a game play session. The event log may include metadata indicating the times of in-game events and associations between the in-game events and the game content items that capture them. A user may interact with in-game events in temporal context, enabling more informed selections and a better understanding of the game session. Using the event log, the game summary may provide features such as a timeline conveying the relative timing of in-game events, a list of in-game events, a map of the game's virtual environment annotated over time based on in-game events, game state information, and statistics and performance information. The game summary may show trends over time and/or across game sessions, and may convey information for a selected group of players, such as a team.

Description

Automatically generating enhanced activity and event summaries for gaming sessions
Background
Maintaining contact with others through social media, text messaging, or email has become an integral part of how today's society operates. In some popular social media applications, a user is presented with an interface that displays posts, videos, and other (often curated) information in the form of an activity feed. The activity feed may be used to keep users apprised of the activities in which people within their personal network are participating. Video games have become an increasingly popular source of social connection, allowing people to interact with their friends or to connect online with others who have similar gaming interests. Typically, users maintain a friends list to facilitate interaction with each other and to establish social connections with other users during game play sessions. Historically, however, video games have had a limited presence in activity feeds, often serving only as a way to privately or publicly encourage others to play.
Some gaming systems provide a basic activity feed for games. A user may manually capture a screenshot or clip of a game play session using a game operating system (e.g., the operating system of a console) and share it with other users within their gaming social circles. These posts may be accompanied by a brief description of the captured game play, but typically provide the viewer with little context for understanding what happened in the game play session. In some cases, an achievement a player unlocks during game play may trigger a corresponding post in the player's social feed. However, utilizing these functions requires game developers to write code in the game specifically for social feed interactions, and not all developers can devote the time and resources to do so. In addition, these functions are implemented using the Application Programming Interface (API) of the gaming system, which limits the posting functionality to what the game engine and game platform support. Further, in conventional gaming platforms that include an activity feed, social posts regarding a game play session cannot be republished once the game play session is completed. In addition, determining whether a particular video segment or clip contains interesting content (e.g., more action) is important for deciding which videos may be worth including in the activity feed, so that "interesting content" can be automatically captured for later review and sharing.
Disclosure of Invention
The present disclosure relates to automatically generating and updating enhanced summaries of the activities and events of a game play session. In an embodiment, a game summary may be generated from metadata extracted from game data of a game play session that includes in-game events. The present disclosure relates to systems and methods for generating a game summary using metadata that gathers and conveys the in-game events occurring within one or more game play sessions, so that a viewer can access associated screenshots, video clips, and/or other game content to better understand those game play sessions.
Unlike conventional systems, an event log of in-game events and corresponding screenshots, video clips, and/or other game content may be used to generate a game summary, each of which may be automatically generated based on analyzing video data, user input data, data transmitted by the game using an Application Programming Interface (API), and/or other data related to a game play session. The event log may include metadata indicating the times of in-game events within the game session and associations between the in-game events and the game content items that capture them. In some embodiments, using the disclosed approaches, a user may interact with in-game events in temporal context, allowing for more informed selections and a better understanding of a game play session. In various embodiments, enhanced game summaries may be generated using the event logs and may provide features such as a timeline conveying the relative timing of in-game events, a list of in-game events (e.g., thumbnails), a map of the game's virtual environment annotated over time based on in-game events, game state information, and statistics and performance information. One or more portions of the game summary may correspond to one or more game sessions and/or players, for example, displaying trends over time and/or across game sessions, and conveying information for a selected group of players (e.g., a team).
Drawings
The systems and methods of the present disclosure relating to automatically generating game summaries for game play sessions are described in detail below with reference to the accompanying drawings, wherein:
FIG. 1 illustrates an example system diagram of a game summary system according to some embodiments of the present disclosure;
FIG. 2 illustrates an example of a game summary of in-game events including a map of one or more virtual environments in which in-game events occur, according to some embodiments of the present disclosure;
FIG. 3 illustrates an example of a game summary of in-game events, including a library of game content capturing one or more in-game events, according to some embodiments of the present disclosure;
FIG. 4 illustrates an example of additional information that may be included in a game summary of an in-game event, according to some embodiments of the present disclosure;
FIG. 5 illustrates a flow chart showing a method for presenting a game summary including a timeline associated with in-game events, in accordance with some embodiments of the present disclosure;
FIG. 6 illustrates a flowchart of a method for presenting a game summary including a list associated with in-game events, according to some embodiments of the present disclosure;
FIG. 7 illustrates a flowchart of a method for presenting a game summary including a map of a virtual environment associated with an in-game event, in accordance with some embodiments of the present disclosure;
FIG. 8 illustrates an example system diagram of a game streaming system according to some embodiments of the present disclosure;
FIG. 9 is a block diagram of an example computing environment suitable for use in implementing some embodiments of the present disclosure; and
FIG. 10 is an example data center suitable for use in implementing some embodiments of the present disclosure.
Detailed Description
The present disclosure relates to automatically generating game summaries for game play sessions. In an embodiment, a game summary may be generated from metadata extracted from game data of a game play session that includes in-game events. The present disclosure relates to systems and methods for generating a game summary using metadata that conveys in-game events occurring in one or more game play sessions, so that a viewer can access, view, and curate (e.g., for posting to a social network) associated screenshots, video clips, and/or other game content to better understand those game play sessions.
The game summary may be generated using an event log of in-game events and corresponding screenshots, video clips, and/or other game content, each of which may be generated based on analyzing video data, user input data, data transmitted by the game using an Application Programming Interface (API), and/or other data related to a game session. The event log may include metadata indicating the times of in-game events within the game session and associations between the in-game events and the game content items that capture them. In some embodiments, using the disclosed approaches, a user may interact with in-game events in temporal context, allowing for more informed selections and a better understanding of a game play session. One or more portions of the game summary may be displayed in and/or accessed from the user's activity feed, thereby providing a more robust view of the player's activity. In one or more embodiments, one or more portions of the game summary may correspond to one or more game sessions and/or players, e.g., displaying trends over time and/or across game sessions, and conveying information for a selected set of players (e.g., a team).
In at least one embodiment, using the event log, the game summary can provide an interface in which interface elements (e.g., including thumbnails) corresponding to or otherwise indicating in-game events are displayed. In some embodiments, a timeline may also be displayed to convey the relative timing of the in-game events. In at least one embodiment, the interface elements may form a list of in-game events and may be displayed in a time-based order (e.g., chronologically) using the metadata. For example, each interface element may correspond to a thumbnail of a video clip or screenshot in which one or more in-game events occur, and the thumbnails may be displayed chronologically (e.g., in an image carousel) using the metadata. In at least one embodiment, a user may select an interface element, which may cause the associated game content item capturing the in-game event to be loaded into a user interface.
In embodiments that include a timeline, the timeline may also be updated to indicate the in-game event based on the selection. For example, the update may include visually emphasizing at least an icon or other indicator on the timeline representing the in-game event. Additionally or alternatively, the event log may organize in-game events into rounds or matches of a game play session. When the user selects the interface element, the timeline may be updated to correspond to the round that includes the in-game event (e.g., the round is displayed in the timeline and/or the timeline is scaled to the in-game events of the round). Game state information displayed in the user interface may also be updated to correspond to the in-game event. For example, one or more player scores may be updated using the event log to reflect their state at the time in the game associated with the in-game event.
In further aspects, the user interface may include a map of the game's virtual environment corresponding to the game play session. Based on the selection of the interface element, the map may be annotated using a location and time associated with the in-game event in the metadata. The location may correspond to the player and/or another in-game object at the time and/or while or after the in-game event occurs. In various embodiments, annotating can include displaying and/or updating a path of the in-game object on the map, where an endpoint of the path can be based on the location and time. For example, paths or other indicators of one or more locations of players, enemies, non-player characters (NPCs), in-game events, items, and/or other elements may be displayed on the map to reflect the game state before, during, and/or after the in-game event. In various embodiments, the elements may be displayed using corresponding symbols or icons, and different symbols or icons may be used for each element type.
In at least one embodiment, game data associated with a game play session may be analyzed to automatically detect in-game events, which may trigger the recording and/or saving of game play via screenshots, videos, and/or other game content items that capture the in-game events, along with metadata (e.g., timestamps, in-game object locations, scores, player status, player statistics, game participants by user name, event types, kill/death/assist ratios, etc.). The analysis may be performed during the game play session (e.g., by analyzing one or more aspects of the game play (e.g., video, user input, etc.), API messages from the game, and/or streams from viewers of a video streaming platform or application), and/or after the game play session (e.g., on video files and/or saved replays of the game play session or other game data).
In various embodiments, in-game events may be detected using algorithms that watch the screen for visual cues (e.g., using neural networks, computer vision, etc.). When these visual cues are detected, they may automatically trigger the recording and/or saving of the game play. In at least one embodiment, they may additionally or alternatively trigger further analysis of the corresponding video data to extract one or more elements of the metadata, or at least some of the metadata may be derived by detecting the corresponding visual cues. The metadata, screenshots, video clips, maps (e.g., extracted from and/or associated with the game), icons of players in the game play, and the like may be stored in a data entity for use in displaying the game summary.
Referring to fig. 1, fig. 1 is an example system diagram of a game summary system ("GSS system") 100 according to some embodiments of the present disclosure. It should be understood that this and other arrangements described herein are set forth only as examples. Other arrangements and elements (e.g., machines, interfaces, functions, orders, groupings of functions, etc.) can be used in addition to or instead of those shown, and some elements may be omitted entirely. Furthermore, many of the elements described herein are functional entities that may be implemented as discrete or distributed components, or in combination with other components, and may be implemented in any suitable combination and location. The various functions described herein as being performed by an entity may be performed by hardware, firmware, and/or software. For example, various functions may be performed by a processor executing instructions stored in a memory.
GSS system 100 may include, among other things, client devices 104(A), 104(B), and 104(C) (collectively referred to herein as "client devices 104"), game summary server 116, and/or game server 126. Although client devices 104(A), 104(B), and 104(C) are shown in fig. 1, this is not intended to be limiting. In any example, there may be any number of client devices 104. GSS system 100 (and its components and/or features) may be implemented using one or more computing devices.
The components of GSS system 100 may communicate over network 102. The network may include a wide area network (WAN) (e.g., the Internet, a public switched telephone network (PSTN), etc.), a local area network (LAN) (e.g., Wi-Fi, ZigBee, Z-Wave, Bluetooth, Bluetooth Low Energy (BLE), Ethernet, etc.), a low-power wide-area network (LPWAN) (e.g., LoRaWAN, Sigfox, etc.), a global navigation satellite system (GNSS) network (e.g., the Global Positioning System (GPS)), and/or other network types. In any example, each component of GSS system 100 may communicate with one or more other components via one or more networks 102.
Client device 104 may include a smart phone, a notebook computer, a tablet computer, a desktop computer, a wearable device, a game console, a virtual reality system (e.g., a headset, a computer, a game console, one or more remote controls, one or more controllers, and/or other components), a streaming media device, a smart home device that may include a smart personal assistant, and/or other types of devices capable of supporting game play.
Client device 104 may include one or more applications 106, a display 108, a communication interface 110, one or more input devices 112, a graphical interface manager 130, a game data capturer 138, and an interest determiner 140. The one or more game summary servers 116 may include a graphical interface manager 130, a communication interface 122, and one or more data stores 124. The one or more game summary servers 116 may include one or more portions of the graphical interface manager 130, the game data capturer 138, and/or the interest determiner 140, in addition to or instead of those components included in the one or more client devices 104. The one or more game servers 126 may include a game engine 128, a communication interface 132, and a data store 134.
Although only some components and/or features of client device 104, game summary server 116, and game server 126 are shown in fig. 1, this is not intended to limit the present disclosure. For example, the client device 104, the game summary server 116, and the game server 126 may include additional or alternative components. Furthermore, the configuration of components is provided only as an example and may be highly flexible depending on the implementation of GSS system 100. For example, as described herein, one or more of the game servers 126 may be implemented as, and/or include at least some of the functionality of, one or more game summary servers 116. Examples include one or more portions of the graphical interface manager 130, the game data capturer 138, or the interest determiner 140. Similarly, at least some of the functionality of the game summary server 116 may or may not be included in the application 106, and may be performed at least in part by the client device 104 or not at all. This is indicated in fig. 1 by illustrating the graphical interface manager 130, the game data capturer 138, and the interest determiner 140 in both the game summary server 116 and the client device 104.
As an overview, the application 106 may include any of a variety of potential types of software capable of presenting one or more game summaries (e.g., game summary 136) on the display 108 of the client device 104. In at least one embodiment, the applications 106 (and/or different applications on the one or more client devices 104) may include a gaming application that facilitates game play (e.g., cloud and/or local games) on the client devices 104 via the one or more input devices 112. The communication interface 110 may include one or more components and features for communicating across one or more networks (e.g., one or more networks 102), such as receiving and/or transmitting data corresponding to one or more game summaries (e.g., metadata, video clips, screenshots, etc.), user inputs to the input device 112, streaming media content, etc.
Input device 112 may include any type of device capable of providing user input to a game. The input devices may include a keyboard, a mouse, a microphone, a touch screen display, a controller, a remote control, a headset (e.g., sensors of a virtual reality headset), and/or other types of input devices.
The communication interface 110 may include one or more components and features for communicating across one or more networks (e.g., one or more networks 102). The communication interface 110 may be configured to communicate via any number of networks 102 described herein. For example, to communicate in GSS system 100 of fig. 1, client device 104 may access the internet through a router using an ethernet or Wi-Fi connection to communicate with one or more game summary servers 116, one or more game servers 126, and/or other client devices 104.
Graphical interface manager 130 may include any of a variety of computer-readable media. Computer-readable media can be any available media that can be accessed by client device 104(A). Computer-readable media can include both volatile and nonvolatile media, as well as removable and non-removable media. By way of example, and not limitation, computer-readable media may comprise computer storage media and communication media. The graphical interface manager 130 may include additional or alternative components, such as those described below with respect to the computing device 900 of fig. 9.
The graphical interface manager 130 may be configured to generate a game digest and/or manage the display of the game digest in a user interface (e.g., the user interface 150 of the application 106). The game data capturer 138 may be configured to capture game data, such as metadata, video clips, and screen shots, from gameplay of the game. The interest determiner 140 may be used by the game data capturer 138 and may be configured to select and/or identify one or more portions of the game data (e.g., game data that may be of interest to the user). In at least one embodiment, one or more portions of the game data capturer 138, the interest determiner 140, and/or the graphical interface manager 130 may be part of the application 106 (and/or the game engine 128).
As shown in fig. 1, in one or more embodiments, the one or more game summary servers 116 may include at least a portion of the graphical interface manager 130, the game data capturer 138, and/or the interest determiner 140, in addition to or instead of the client device 104 including those components. For example, one or more game summary servers 116 may be used in a client-server based implementation of GSS system 100, where the application 106 comprises a client application. The communication interface 122 may be configured to communicate over any number of the networks 102 described herein to receive and/or transmit data (e.g., video streams of game play sessions, recordings, video clips, screenshots, user inputs to one or more input devices 112, streaming content, etc.) for generating metadata and/or other data related to game play. One or more data stores 124 may be used to store any of a variety of information (e.g., video clips, screenshots, thumbnails, game summaries, metadata, etc.) associated with the game summary server 116.
The game engine 128 of the game server 126 may include game functionality that enables one or more users of the client devices 104 to play games over a network. The game engine 128 may be provided at least in part in the game server 126 for cloud-based implementation of the GSS system 100. In some embodiments, however, the game engine 128 may be at least partially included in the application 106. The communication interface 132 may include one or more components and features for communicating across one or more networks (e.g., one or more networks 102). For example, the communication interface 132 may be used to transmit and/or receive user input from one or more input devices 112, video data (e.g., recording and/or real-time streaming) of gameplay, and so forth.
As described herein, the application 106 may include any of a variety of potential types of software capable of presenting one or more game summaries (e.g., game summary 136 in the user interface 150) on the display 108 of the client device 104. Examples of the application 106 include mobile applications, computer applications, console applications, cloud-based game streaming applications, web browsers, gaming applications, native applications, client applications, social media applications, system applications, and/or other types of applications or software.
For a cloud-based game streaming application, the application 106 may include instructions that, when executed by one or more processors, may cause the processor(s) to, without limitation, receive input data representing user input to one or more input devices 112, send the input data to one or more game servers 126, retrieve game data from memory or local storage, receive game data from the game servers 126 using the communication interface 110, and cause the game to be displayed on the display 108. More generally, in some embodiments, the application 106 may operate as a facilitator enabling interaction between a user and an instance of a game on the client device 104.
The application 106 may additionally or alternatively include instructions that, when executed by one or more processors, cause the one or more processors to send data to and receive data from one or more game summary servers 116 (e.g., game data, game summaries, etc.). For example, the application 106 may send video recordings generated while playing a game on one or more client devices 104 to the one or more game summary servers 116, and receive video clips, metadata, screenshots, game summary data, and/or other data extracted from the game data from the one or more game summary servers 116. Game data herein may generally refer to data associated with one or more game play sessions of one or more games, such as video data, audio data, one or more API messages from one or more games (e.g., including or identifying one or more in-game events), and/or user input data (e.g., from user input device 112). Additionally or alternatively, one or more game summary servers 116 may receive at least some game data from the game server 126.
In one or more embodiments, the client device 104 may render the game using the application 106 (e.g., where the game runs on the client device 104), while in other examples, the client device 104 may receive display data (e.g., encoded display data, as described with respect to fig. 8) and display the game on the display 108 using that display data (e.g., where the game runs on one or more of the game servers 126). In some examples, a first client device (e.g., client device 104(A)) may render the game, and a second client device (e.g., client device 104(B)) may receive the display data and display the game using it. In examples where display data is received by a client device (e.g., where the client device 104 does not generate the rendering), GSS system 100 may be part of a game streaming system, such as game streaming system 800 described in more detail with respect to fig. 8.
The display 108 may include any type of display capable of displaying games and/or game summaries (e.g., a light-emitting diode (LED) display, an organic LED (OLED) display, a liquid crystal display (LCD), an active-matrix OLED (AMOLED) display, a quantum dot display (QDD), a plasma display, an LED/LCD display, and/or other types of displays). In some examples, the display 108 may include multiple displays (e.g., a dual-monitor display for computer gaming, with a first display used for configuring a game and a virtual reality display used for playing it, etc.). In some examples, the display is a touch screen display, such as the touch screen of a smart phone, tablet, notebook, etc., where the touch screen is at least one of the input devices 112 of the client device 104.
As described herein, the application 106 of the client device 104 may display one or more game summaries, such as the game summary 136 in the user interface 150, via the display 108. To this end, the graphical interface manager 130 may generate one or more game summaries and/or manage the display of the game summaries in the user interface 150 of the application 106. In one or more embodiments, one or more data structures representing the game digest and/or portions thereof may be generated at the client and/or server side. Rendering the data structure as a User Interface (UI) may be handled by an application (e.g., application 106) running on the client device.
In at least one embodiment, the graphical interface manager 130 may generate a game summary using metadata of an event log of in-game events and corresponding screenshots, video clips, and/or other game content, each of which may be automatically generated by the game data capturer 138 based on analyzing game data associated with a game play session. The metadata may indicate timing information for one or more in-game events within the game play session, as well as associations between the in-game events and the screenshots, video clips, and/or other game content capturing the in-game events. By way of example and not limitation, the metadata may include timestamps of an in-game event and/or of the game content corresponding to the in-game event (e.g., a start time, an end time, a time of occurrence for a non-persistent event, etc.). The timestamps may be used to display data corresponding to the in-game event with a temporal context (e.g., via a timeline, a thumbnail on a grid, an icon on a map, per-round statistics, etc.).
In one or more embodiments, the event log may organize in-game events into rounds or matches of a game session. For example, the metadata may describe a round and a time range or segment corresponding to the round. Other examples of metadata that may be associated with times, in-game events, and/or rounds include in-game object locations, scores, player status, player statistics, game participants by user name, event types, kill/death/assist ratios, and/or other information for configuring the game summary 136.
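By way of illustration only, an event log of this kind might be represented with a structure along the following lines. This is a minimal Python sketch under assumed field names (the disclosure does not prescribe a schema); it also shows the chronological ordering by timestamp used for the list and timeline described below.

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class InGameEvent:
    event_type: str                    # e.g., "elimination", "score", "round_win"
    start_time: float                  # seconds from session start
    end_time: Optional[float] = None   # None for an instantaneous event
    round_number: Optional[int] = None
    location: Optional[Tuple[float, float]] = None  # (x, y) in the virtual environment
    content_ids: List[str] = field(default_factory=list)  # associated clips/screenshots

@dataclass
class EventLog:
    session_id: str
    events: List[InGameEvent] = field(default_factory=list)

    def in_round(self, round_number: int) -> List[InGameEvent]:
        """Events organized by round, as described above."""
        return [e for e in self.events if e.round_number == round_number]

    def chronological(self) -> List[InGameEvent]:
        """Entries ordered by timestamp, e.g., for display in a list or timeline."""
        return sorted(self.events, key=lambda e: e.start_time)
```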
The game summary 136 displayed in the user interface 150 may take a variety of potential forms and may be displayed in a variety of potential contexts (e.g., in an activity feed, a social feed, a game application, a game streaming application, etc.). By way of example and not limitation, the game summary 136 in fig. 1 includes a list 155 of in-game events, visual indicators and/or interface elements, a game content display area 142, a timeline 190, a round display area 192, a game name 152 of the game corresponding to the game summary, and an amount of time 154 the game was played in the one or more game sessions corresponding to the game summary 136. As described herein, the game summary 136 may include fewer than all of these features and/or different features or combinations thereof.
List 155 may correspond to screenshots, video clips, and/or other game content captured from the game session (e.g., via the game data capturer 138). For example, list 155 includes entries 156, 158, 160, 162, and 164 displayed using visual indicators, which may include corresponding screenshots and/or images (e.g., thumbnails) of video clips. The list 155 may take various forms, and is shown as a strip or carousel. In various embodiments, list 155 may take the form of a carousel, gallery, banner rotator, banner slider, vertical list, horizontal list, grid, and/or scrolling slider.
The entries of the list 155 may correspond to one or more in-game events of the game play session, and the game content (e.g., video clips and/or screenshots) associated with the entries may capture or otherwise correspond to one or more portions of the respective in-game events. The graphical interface manager 130 may use the metadata to display entries in the list 155 in an order corresponding to in-game events that occur within the gaming session. For example, the time stamps may be used to chronologically display the entries according to the corresponding in-game events.
Video clips, screen shots, and/or other game content may capture highlights from the game play session that the user may view in the game summary 136. For example, the game summary 136 may include interface elements corresponding to in-game events, such as visual indicators and/or associated check boxes, selection areas, and/or buttons (e.g., in the list 155 or associated with the list 155). The user may select one or more interface elements corresponding to one or more in-game events to cause corresponding game content to be displayed, played, and/or loaded in the game content display area 142 of the game summary 136.
For example, upon selection of the entry 158, as shown in FIG. 1, the entry 158 may be emphasized (e.g., highlighted, visually distinguished, etc.) for indicating that the corresponding one or more in-game events and/or game content have been selected. Likewise, upon selection of entry 158, game content 166, such as video clips, screen shots, and/or audio, corresponding to the in-game event may be displayed, played, and/or loaded in game content display area 142 of game summary 136, as shown. Where game content 166 includes multiple items, the user may navigate the items using arrows or other means shown in game content display area 142. Additionally or alternatively, arrows and/or other interface elements in the game content display area 142 may be used to select another entry in the list 155, such as an adjacent entry. When another entry is selected (via list 155, arrow, or otherwise), then the current entry may be deselected. Deselection may result in the emphasis being removed from the visual indicator and the corresponding game content being removed and/or replaced with newly selected game content in the game content display area 142.
Other elements of the game summary 136 may also be updated based on the selection when they correspond to the one or more in-game events selected using entries in the list 155. For example, other elements may be similarly emphasized and/or de-emphasized when selected or deselected. Examples include the entry 168 in the timeline 190 and/or the entry 170 in the round display area 192.
In accordance with the metadata, the game summary 136 may include various supplemental information displayed in association with entries (e.g., in the list 155, timeline 190, round display area 192, etc.). The supplemental information may help the user determine which entries to select. For example, an entry may display an indication of a time and/or duration corresponding to the in-game event within the gaming session. For example, entry 164 indicates that it corresponds to a 20 second video clip. As a further example, the supplemental information may include an indication of the event type of the in-game event associated with the entry. For example, entry 164 indicates that the player who is the subject of the game summary 136 scored twice in succession. Other event types may correspond to player kills, assists, wins, losses, level-ups, and the like. As a further example, the timeline 190 includes symbols such as check marks 178, "x" marks 180, and multipliers 144 that represent the occurrence of one or more particular event types and/or multiples or variants thereof. Other examples of supplemental information include game state information such as score information, player information, team information, and/or other information. For example, the entry 170 in the round display area 192 includes score information corresponding to the round (and its one or more in-game events). Game state information may similarly be presented in association with entries in the list 155 and/or timeline 190.
A timeline 190 may be displayed to convey the relative timing of one or more in-game events captured in the event log. Similar to list 155, the timeline 190 may indicate the time of an in-game event in association with the in-game event. Entries in the timeline 190 (e.g., the entry 168) may be positioned according to their associated timestamps. Entries corresponding to in-game events and/or game content automatically extracted by the game data capturer 138 may be visually distinguished from entries manually captured by the player and/or user (e.g., diamonds for manual captures and circles for automatic captures). In at least one embodiment, each entry in the timeline 190 can correspond (e.g., respectively) to one or more entries in the list 155. For example, entry 168 may correspond to the same in-game event as entry 158 in list 155. In one or more embodiments, entries in the timeline 190 can have a one-to-one correspondence with entries in the list 155.
In one or more embodiments, similar to the entries in the list 155, each entry in the timeline may correspond to one or more game content items. Further, one or more entries in the timeline 190 may be selectable. For example, upon selection of the entry 168, as shown in fig. 1, the entry 168 may be emphasized (e.g., highlighted, visually distinguished, etc.) to indicate that the corresponding in-game event and/or game content has been selected. Also, as shown, upon selection of the entry 168, game content 166 (e.g., without limitation, video clips, screenshots, and/or audio) corresponding to the in-game event may be displayed, played, and/or loaded in the game content display area 142 of the game summary 136. If another entry is selected, the current entry may be deselected. Deselection may result in the emphasis being removed from the entry's visual indicator and the corresponding game content being removed and/or replaced with the newly selected game content in the game content display area 142.
The round display area 192 may represent one or more in-game events organized by round. For example, the round display area 192 may include entries 170, 172, 174, and 176 that correspond to respective rounds of the one or more game play sessions of the game summary 136. In one or more embodiments, the round display area 192 may be used to select a subset of the in-game events and/or corresponding entries displayed in one or more other portions of the game summary 136. For example, user selection of the entry 170 may limit the list 155 and/or timeline 190 to events corresponding to the entry 170 (e.g., events occurring within the round) and/or otherwise cause one or more entries to be displayed in those portions of the game summary 136, which may include replacing existing entries that do not correspond to the entry 170. In one or more embodiments, the times on the timeline 190 can be scaled according to the time span of the currently selected round. The start time on the timeline 190 may correspond to a start time of the round in a game play session, and the end time on the timeline 190 may correspond to an end time of the round in the game play session.
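As a sketch of the scaling just described (assuming the round's start and end times are available from the metadata), an entry's position on the timeline 190 could be computed as a normalized fraction of the selected round's time span:

```python
def timeline_position(event_time: float, round_start: float, round_end: float) -> float:
    """Map an event timestamp to a 0..1 position on a timeline scaled to the
    currently selected round, per the description above (illustrative only)."""
    span = round_end - round_start
    if span <= 0:
        return 0.0
    # Clamp so events logged slightly outside the round still render on the bar.
    return min(max((event_time - round_start) / span, 0.0), 1.0)
```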
In one or more embodiments, more than one round may be selected at a time. Also, in one or more embodiments, deselection of an entry corresponding to a round may remove the round's entries from other portions of the game summary 136. In some examples, selection of an entry may automatically result in deselecting one or more currently selected entries. As indicated in fig. 1, similar to entries selected in other portions of the game summary 136, the selected entries may be emphasized in the game summary 136. Deselecting an entry may likewise result in the entry being de-emphasized.
Other examples of features that may be included in the game summary 136, in addition to or instead of the features described with respect to fig. 1, are described with respect to figs. 2-4. Referring now to fig. 2, fig. 2 illustrates an example of a game summary 136 of in-game events including a map 204 of a virtual environment in which one or more in-game events occur, according to some embodiments of the present disclosure. By way of example and not limitation, the game summary 136 in fig. 2 includes a list 155, a game content display area 142, and a timeline 190. The game content display area 142 is shown including playback control elements, such as a play button and a timeline, with which a user may interact to view activity occurring during one or more selected in-game events.
Map 204 may be dynamically updated to reflect the movement of one or more objects and/or entities over the course of the game play session, as indicated by the metadata. For example, the locations of one or more players, NPCs, and/or objects may be overlaid on map 204, or otherwise used to annotate map 204, to indicate where those objects or entities were located in the virtual environment at a particular time of the game play session. The time may be based at least on other selections made in the game summary 136, such as selected in-game events, game content, and/or entries in the user interface 150. For example, where the game content display area 142 corresponds to game content 202 that includes a video clip, as the video clip plays, in-game events, objects, and/or entities displayed in map 204 may be added, deleted, or have their locations or other attributes (e.g., health, item counts, scores, event types, etc.) modified to reflect the game state at the corresponding times in the video clip. The information for updating map 204 may be captured in the metadata, and one or more updates may occur periodically and/or continuously. Interpolation between metadata values may be used to provide intermediate values for the updates. Map 204 may similarly be updated to reflect the game state for other game content in the game content display area 142, such as a selected screenshot or audio clip.
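The interpolation mentioned above might look like the following sketch, where `samples` is a hypothetical list of (timestamp, (x, y)) pairs recorded in the metadata; the disclosure does not mandate linear interpolation specifically.

```python
def interpolate_position(samples, t):
    """Estimate an object's map position at time t by linearly interpolating
    between the two nearest metadata samples (a sketch, not the disclosed method)."""
    samples = sorted(samples)  # (timestamp, (x, y)) pairs, ordered by timestamp
    if t <= samples[0][0]:
        return samples[0][1]
    if t >= samples[-1][0]:
        return samples[-1][1]
    for (t0, p0), (t1, p1) in zip(samples, samples[1:]):
        if t0 <= t <= t1:
            w = (t - t0) / (t1 - t0)
            return (p0[0] + w * (p1[0] - p0[0]),
                    p0[1] + w * (p1[1] - p0[1]))
```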
In one or more embodiments, annotating map 204 can include adding, removing, and/or updating one or more paths of one or more entities or objects through the virtual environment. For example, the path 208 of a player (e.g., the subject of the game summary 136) through the game is shown with a dashed line. An endpoint of the path may correspond to the location of the player associated with point 220 in the video clip being viewed. In other examples, the endpoint may correspond to the timestamp of a screenshot being viewed. The view indicator 206 on the map 204 may indicate the player's position in the virtual environment at the current time associated with the game content being viewed, and/or may correspond to an area of the virtual environment that is viewable in the game or otherwise associated with the player at that time. In one or more embodiments, map 204 may correspond to an in-game map that is displayed to the player during the game play session. The annotations of map 204 may be limited to information that is known to the player account and/or viewable by the player on the in-game map (e.g., at the corresponding time of the gaming session), or may include information that is unknown to the player account and/or not viewable by the player (e.g., aggregated from other players or spectators).
Referring now to fig. 3, fig. 3 illustrates an example of a game summary 136 of in-game events, including a game content library 300 capturing one or more in-game events, according to some embodiments of the present disclosure. In one or more embodiments, the library 300 may be an example of the list 155 described with respect to figs. 1 and 2. As described above, the game summary 136 may correspond to in-game events and/or game content from multiple games, across multiple game platforms, and/or from more than one game play session. For example, a set of game content 302 is indicated as being from Game 2 and a set of game content 310 is indicated as being from Game 3. By way of example and not limitation, image 304 is indicated as displaying a map associated with a game session. In one or more embodiments, user selection of the image 304 may display the associated game content in the game content display area 142, which may be included in the game summary 136, added to the game summary 136 based on the selection, and/or presented in a pop-up window based on the user selection. In one or more embodiments, user selection of an entry may result in display of a corresponding map 204, as described with respect to fig. 2.
Referring now to fig. 4, fig. 4 illustrates an example of additional information that may be included in a game summary 136 of in-game events, according to some embodiments of the present disclosure. The game summary 136 of fig. 4 includes an indicator 402 of the game title, the number of in-game events captured from the corresponding game play session, and an associated time 406 (e.g., "1:00 pm"). Below that, five images of captured events are shown, such as image 404, which may correspond to the list 155 described herein. As with entries in other examples of the list 155, markers may be included in an image or other visual indicator (e.g., a double elimination, a score, etc.). These images may serve as visual previews of the in-game events and may include thumbnails of the underlying game content.
In addition, the game summary 136 includes game state information such as a level 414 of the game played (e.g., level 1, 2, 3, etc.), a mode 416 of the game (e.g., team, capture-the-flag, free-for-all, etc.), and the time the game was played. The game summary 136 may also include a performance or statistics summary 420, which may include performance and/or statistics such as the number of eliminations, scores, points, or any other notable events in one or more particular game play sessions or an aggregation thereof. Also included are game statistics 422, which may describe important statistics such as the score for each round, how many rounds were played, the number of eliminations, or any other game statistics that may help the user track their progress during a game play session. In aspects, the activity feed can also include a list of achievements 424. The achievements 424 will depend on the type of game being played. For example, in a medieval-themed game, the achievements 424 may include a number of completed objectives (e.g., towns conquered, enemies annihilated, etc.). In addition to these features, the activity feed may also include a map 428 showing the game's route and the path the user followed throughout the captured events shown in the activity feed summary 400.
Returning to fig. 1, further examples are provided of how in-game events and/or game content may be captured from game data. As described herein, game data may refer to data generated by and/or associated with one or more game play sessions of one or more games, such as video data, audio data, one or more API messages from the one or more games (e.g., including or identifying one or more in-game events), and/or user input data (e.g., from user input device 112).
In one or more embodiments, the application 106 and/or the game data capturer 138 may include instructions that, when executed, will record game data in a game play session and store the recorded game data locally on the client device 104 or send the recorded game data to the game summary server 116 or the game server 126 for storage in the data stores 124 and 134, respectively. In examples where the client device 104 does not generate a rendering, the game server 126 may record and store the game data or send the game data to the game summary server 116 for storage in the data store 124.
As described herein, it may be desirable to identify clips (e.g., video and/or audio) and/or other game content from game data that includes interesting content, such as clips in which a greater amount of user input is provided and/or clips in which certain important events occur in a game play session. Accordingly, a game data capturer 138 may be provided for this purpose.
The game data capturer 138 may be part of the application 106, or may be part of a separate application (e.g., one or more system services, programs, etc.), as described herein. The game data capturer 138 may include instructions that, when executed by a processor, cause the processor to, without limitation, record game data (e.g., input device usage data, video data, and/or other data associated with a game play session). Examples of input device usage data include data describing or representing keyboard, mouse, or other input device usage as it relates to one or more gaming sessions. Examples of recordable information include keyboard strokes, mouse clicks, mouse movements, microphone inputs, camera inputs, and/or other inputs to the client device 104 during a game play session. In addition, the game data capturer 138 may store timestamp information and relate such inputs to the timestamp information of the game session video data.
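For illustration, recorded input device usage data might be kept as entries timestamped relative to the session video, along these lines (all names hypothetical):

```python
import time

class InputRecorder:
    """Minimal sketch: records input events with timestamps relative to the
    start of the game session video, as described above."""

    def __init__(self):
        self.session_start = time.monotonic()
        self.events = []  # (timestamp_seconds, device, detail) tuples

    def record(self, device: str, detail: str):
        self.events.append((time.monotonic() - self.session_start, device, detail))

# Usage, e.g.:
#   recorder = InputRecorder()
#   recorder.record("keyboard", "W down")
#   recorder.record("mouse", "left click")
```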
The game data capturer 138 may use an interest determiner 140, which may generally be configured to indicate to the game data capturer 138 game content that may be of interest to a user. It is contemplated that the interest determiner 140 may identify game content automatically, or may be manually instructed by a user and/or player to do so (e.g., via one or more user inputs or commands creating tags, bookmarks, capture events, etc.). In one or more embodiments, the interest determiner 140 can analyze game data of the game play session to detect the game content. The analysis may be performed on recorded game data, such as after the game session is completed, and/or in real time as the game session occurs. The interest determiner 140 may be part of the application 106 or may be part of a separate application (e.g., one or more system services, programs, etc.). In some examples, the interest determiner 140 is part of the same application as the game data capturer 138.
The interest determiner 140 may detect game content and/or corresponding in-game events in the game data using a variety of potential methods (e.g., automatically), examples of which are described herein. To this end, the interest determiner 140 may analyze any form of game data, such as game streams, recorded game data, and/or identified game content, to determine and/or detect the occurrence of in-game events. For example, the analysis may be used to trigger capture events and direct the capture of at least a portion of the game content from the game stream (or its saving from a buffer), and/or to categorize detected in-game events and/or one or more attributes thereof (e.g., for inclusion in the metadata of one or more game summaries 136). The game data capturer 138 may determine that an in-game event has occurred based on artificial intelligence, object detection, computer vision, text recognition, and/or other analysis methods. For example, the interest determiner 140 may utilize any type of machine learning model to detect in-game events and corresponding game content, such as machine learning models using linear regression, logistic regression, decision trees, support vector machines (SVMs), Naïve Bayes, k-nearest neighbors (KNN), K-means clustering, random forests, dimensionality reduction algorithms, gradient boosting algorithms, neural networks (e.g., auto-encoders, convolutional, recurrent, perceptrons, long/short-term memory (LSTM), Hopfield, Boltzmann, deep belief, deconvolutional, generative adversarial, liquid state machine, etc.), and/or other types of machine learning models. In some embodiments, the machine learning model includes a deep convolutional neural network.
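As one hedged illustration of the deep convolutional variant (not the disclosed model; PyTorch is used here purely as an example framework), a small classifier mapping a game frame to event-type scores could look like:

```python
import torch
import torch.nn as nn

NUM_EVENT_TYPES = 8  # hypothetical: elimination, score, round win, etc.

# A minimal convolutional classifier over RGB game frames.
event_classifier = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, stride=2, padding=1),
    nn.ReLU(),
    nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1),
    nn.ReLU(),
    nn.AdaptiveAvgPool2d(1),
    nn.Flatten(),
    nn.Linear(32, NUM_EVENT_TYPES),
)

frame = torch.rand(1, 3, 224, 224)        # stand-in for a captured video frame
scores = event_classifier(frame)          # unnormalized per-event-type scores
probabilities = scores.softmax(dim=-1)    # confidence per event type
```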
Examples of in-game events that may be detected include eliminating another character in the game, collecting an item, scoring, hitting a home run, climbing a building or mountain, performing or achieving a user-specified task or goal, leveling up, winning a round, losing a round, and/or other event types. For example, in some embodiments, the interest determiner 140 may identify a change in a reticle or other interface element of the game that indicates an in-game event, such as an in-game elimination of a character, and trigger a capture event. As a further example, the game data capturer 138 may identify text in the game instance (e.g., using optical character recognition (OCR)) representing "player 1 eliminated player 4" or "player 1 scored a touchdown," or other in-game events, to trigger one or more capture events. Methods that rely on object detection may analyze the game's visual data to identify one or more capture events and/or categorize game content.
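As a sketch of the text recognition path (pytesseract is named here only as an example OCR engine; the disclosure does not specify one), OCR output from a frame could be matched against known event phrases:

```python
import re
import pytesseract  # example OCR engine; any OCR library could be substituted

EVENT_PATTERNS = {
    "elimination": re.compile(r"(\w+) eliminated (\w+)", re.IGNORECASE),
    "score": re.compile(r"(\w+) scored(?: a touchdown)?", re.IGNORECASE),
}

def detect_text_events(frame_image):
    """Run OCR on a frame and match the recognized text against event phrases."""
    text = pytesseract.image_to_string(frame_image)
    detected = []
    for event_type, pattern in EVENT_PATTERNS.items():
        match = pattern.search(text)
        if match:
            detected.append((event_type, match.groups()))
    return detected
```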
In one or more embodiments, to identify one or more in-game events, the interest determiner 140 may determine durations of a higher level (e.g., frequency or concentration) of action in the game play session based on the game data. For example, the game data may be used to identify time periods during a game play session that include higher levels of action, and these durations or time periods may be referred to as "durations of predicted interest." When a time period is detected, the disclosed methods may be used to identify a particular time (e.g., for a screenshot). For example, the interest determiner 140 may identify a time period within the game play session as high action based at least in part on a high number of key presses per minute (KPM), a high percentage of action key selections, and/or other input device metrics over the time period. In addition, further analysis of these input device metrics may be performed to reduce potential noise when identifying higher-action periods. For example, an interest level algorithm may be applied to the input device metrics to convert the metrics into a measurement of action activity over time.
Based on the values of the interest level algorithm over time, time periods associated with potentially high in-game activity may be determined. For example, a (e.g., continuous) period of time with data points above a threshold (e.g., the average) may be identified as corresponding to a clip or screenshot that may be worth highlighting. The beginning of the time period may be based on a time when the activity measurement exceeds the threshold, and the end of the time period may be based on a time when the activity measurement falls below the threshold. A video clip or screenshot corresponding to the time period may then be identified. For example, the video clip may be stored as a discrete file capturing the frames spanning the time period, and/or may be stored as metadata identifying a region of the game session video data corresponding to the time period (e.g., using a start timestamp, an end timestamp, and/or a duration).
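A minimal sketch of the thresholding just described, assuming a per-second series of activity measurements (e.g., derived from key presses per minute) has already been computed:

```python
def durations_of_predicted_interest(activity, threshold=None):
    """Given a per-second activity series, return (start, end) second ranges
    where activity stays above a threshold -- an illustrative version of the
    'duration of predicted interest' logic described above."""
    if threshold is None:
        threshold = sum(activity) / len(activity)  # e.g., the series average
    periods, start = [], None
    for t, value in enumerate(activity):
        if value > threshold and start is None:
            start = t                      # activity rises above the threshold
        elif value <= threshold and start is not None:
            periods.append((start, t))     # activity falls back below it
            start = None
    if start is not None:
        periods.append((start, len(activity)))
    return periods
```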
While user input (e.g., input device metrics) may be used to quantify activity in a game as a level of interest, other types of data may be used in addition to or instead of user input. For example, game session visual and/or audio metrics may be used that correspond to camera and/or field of view movements, color changes, audio volume, audio dynamics, and/or audio changes in the game, and the like. Further, while in some examples the interest determiner 140 may identify a duration of predicted interest and then use the duration to generate a video clip or screenshot, in other examples the interest determiner 140 may analyze the video clip (generated using any suitable method) to determine whether it is sufficiently interesting, such as by using an interest level algorithm (e.g., based on determining that the average interest level of the clip is greater than a threshold).
Once a time period is identified as a duration of predicted interest using any of the methods described herein, a corresponding video segment may be identified from the video data of the game play session. For example, the interest determiner 140 may send timestamps corresponding to the duration of predicted interest to the application 106, the game server 126, and/or the game summary server 116. Any combination of these components may then generate and/or identify discrete video clips from the game session video data described above (e.g., stored in data store 124 or 134) using the timestamps, and timestamp the video clips or other game content in the metadata of one or more game summaries 136.
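Under the metadata-based storage option described above, a clip need not be re-encoded at all; it can be represented purely as timestamps into the stored session recording, e.g. (field names hypothetical):

```python
from dataclasses import dataclass

@dataclass
class ClipReference:
    """A video clip identified by timestamps into the stored session video,
    as described above."""
    session_video_id: str
    start_timestamp: float  # seconds into the session video
    end_timestamp: float

    @property
    def duration(self) -> float:
        return self.end_timestamp - self.start_timestamp
```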
The game summary server 116 may include one or more servers for storing, sorting, clustering, and/or categorizing game content and/or game data from the game session. Although only some components and/or features of the game summary server 116 are shown in FIG. 1, this is not intended to be limiting. For example, the game summary server 116 may include additional or alternative components, such as those described below with respect to the computing device 900 of FIG. 9.
As further shown in FIG. 1, the game summary server 116 may be separate or distinct from the game server 126; however, this is not intended to be limiting. In some examples, the game summary server 116 may be the same or a similar server as the game server 126 (e.g., running as a task on the game server 126). In some examples, the game summary server 116 may be operated or hosted by a first entity (e.g., a first company), while the game server 126 may be operated or hosted by a second entity (e.g., a second, different company). In such examples, the second entity may be a game developer, and the first entity and the second entity may share data such that the first entity may use the data received from the second entity to identify game content of interest. In other examples, the game summary server 116 and the game server 126 may be operated or hosted by the same entity. In further examples, the GSS system 100 may be implemented entirely on the client device 104, and/or one or more components shown as being included in a server (and/or functionality thereof) may be implemented at least partially on the client device 104.
The game summary server 116 may include one or more Application Programming Interfaces (APIs) to enable communication of information (e.g., game data, timestamps, game content selection data, etc.) with the game server 126 or the client device 104. For example, the game summary server 116 may include one or more game APIs that interface with the client device 104 or the game server 126 to receive game data and/or game summary data. As yet another example, the game summary server 116 may include one or more APIs that interface with the client device 104 for transmitting categorized game content and/or game summary data. Although different APIs are described herein, the APIs may be part of a single API, two or more of the APIs may be combined, different APIs may be included other than those described herein as examples, or a combination thereof.
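By way of non-limiting illustration only, since the present disclosure does not specify endpoint names, transports, or payloads, such APIs might resemble the following Flask-based sketch; every route, parameter, and payload field shown is an assumption.

```python
# Hypothetical REST-style API surface for a game summary server; the use of
# Flask, the routes, and the payload fields are illustrative assumptions.
from flask import Flask, jsonify, request

app = Flask(__name__)
EVENT_LOGS = {}  # session_id -> list of event metadata dicts

@app.post("/v1/sessions/<session_id>/events")
def ingest_events(session_id: str):
    # A game server or client device posts event metadata (times, clip
    # associations) for a game play session.
    EVENT_LOGS.setdefault(session_id, []).extend(request.get_json()["events"])
    return jsonify({"stored": len(EVENT_LOGS[session_id])})

@app.get("/v1/sessions/<session_id>/summary")
def get_summary(session_id: str):
    # A client device fetches categorized game content / summary data.
    return jsonify({"session_id": session_id,
                    "events": EVENT_LOGS.get(session_id, [])})
```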
The game server 126 may include one or more servers (e.g., dedicated game servers) for storing, hosting, managing, and in some examples rendering games. In some examples, the first game server 126 may be used to create, update, and modify games (e.g., program code for games), and the second game server 126 may be used to host games (e.g., as a dedicated game server). Although only some components and/or features of game server 126 are shown in fig. 1, this is not intended to be limiting. For example, the game server 126 may include additional or alternative components, such as those described below with respect to the computing device 900 of fig. 9.
The game server 126 may include one or more APIs to enable the client device 104 to play games and/or to communicate information (e.g., user profiles, game session data, game summaries, etc.) with the game summary server 116 and/or the client device 104. For example, the game server 126 may include one or more game APIs that interface with the application 106 of the client device 104 to enable the client device 104 to play games. As yet another example, the game server 126 may include one or more APIs that receive categorized game content and/or other game summary data for transmission to the client device 104. Although different APIs are described herein, the APIs may be part of a single API, two or more of the APIs may be combined, different APIs may be included other than those described herein as examples, or a combination thereof.
The game server 126 may include a game engine 128. The game engine 128 may include functionality to enable one or more users to play games over a network. The game engine 128 may include a rendering engine, an audio engine, a physics engine, an animation engine, an artificial intelligence engine, a network engine, a streaming media engine, a memory management engine, and/or other components or features. The game engine 128 may be used to generate some or all of the game session data during the game session.
Referring now to FIG. 5, each block of method 500, and of other methods described herein, comprises a computing process that may be performed using any combination of hardware, firmware, and/or software. For instance, various functions may be carried out by a processor executing instructions stored in memory. The methods may also be embodied as computer-usable instructions stored on computer storage media. The methods may be provided by a standalone application, a service or hosted service (standalone or in combination with another hosted service), or a plug-in to another product, to name a few. In addition, the methods are described, by way of example, with respect to the system of FIG. 1. However, these methods may additionally or alternatively be executed by any one system, or any combination of systems, including, but not limited to, those described herein.
FIG. 5 is a flowchart showing a method 500 for presenting a game summary including a timeline associated with in-game events, in accordance with some embodiments of the present disclosure. At block B502, the method 500 includes receiving metadata indicating timing information of in-game events. For example, the graphical interface manager 130 and/or the application 106 may receive metadata indicating a timing of one or more in-game events in one or more game play sessions and an association between the one or more in-game events and one or more video clips capturing the one or more in-game events. The in-game events may be determined based at least on analyzing video data representative of the one or more game play sessions and/or other game data.
At block B504, the method 500 includes presenting, using the metadata, interface elements corresponding to the one or more in-game events and a timeline indicating the timing information in association with the one or more in-game events. For example, the graphical interface manager 130 and/or the application 106 may use the metadata to present, in the user interface 150, one or more interface elements corresponding to the one or more in-game events, as well as a timeline 190 indicating the timing information in association with the one or more in-game events.
At block B506, the method 500 includes loading game content that captures the in-game event and updating the timeline to indicate the in-game event based at least on a selection of an interface element. For example, the graphical interface manager 130 and/or the application 106 may load, in the user interface 150, image data of a video clip of the one or more video clips that captures the one or more in-game events, and update the timeline to indicate the one or more in-game events, based at least on a selection of the one or more interface elements corresponding to the one or more in-game events and based at least on the association between the one or more in-game events and the one or more video clips.
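By way of non-limiting illustration, the metadata consumed at blocks B502 through B506 might be encoded as in the following sketch; the present disclosure defines what the metadata indicates, not its encoding, so all field names here are assumptions.

```python
# Illustrative event-log metadata for one game play session; every field
# name is hypothetical.
event_log = {
    "session_id": "session-42",
    "rounds": [{"round": 1, "start_s": 0.0, "end_s": 310.5}],
    "events": [
        {
            "event_id": "evt-001",
            "type": "elimination",      # event type shown in the event list
            "time_s": 128.4,            # position on the timeline
            "round": 1,
            "clip": {"start_s": 120.0, "duration_s": 15.0},  # association
        },
    ],
}

# Selecting the interface element for "evt-001" would load the clip spanning
# 120.0 to 135.0 seconds and mark t = 128.4 s on the timeline.
selected = event_log["events"][0]
print(selected["clip"]["start_s"], selected["time_s"])
```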
Referring now to fig. 6, fig. 6 is a flow chart illustrating a method 600 for presenting a game summary including a list associated with in-game events, according to some embodiments of the present disclosure. At block B602, the method 600 includes receiving metadata indicating a time of an in-game event. For example, the graphical interface manager 130 and/or the application 106 may access metadata that indicates one or more times of one or more in-game events in one or more game play sessions and includes one or more associations between the one or more in-game events and one or more game content items (e.g., images, videos, etc.) that captured the one or more in-game events.
At block B604, the method 600 includes presenting, using the metadata, an interface element corresponding to the in-game event in a list of in-game events at a location corresponding to timing information of the in-game event in the game play session. For example, the graphical interface manager 130 and/or the application 106 may use the metadata to present, in the user interface 150, one or more interface elements corresponding to the one or more in-game events in the in-game event list 155 at one or more locations corresponding to the timing information of the one or more in-game events in the one or more game play sessions.
At block B606, the method 600 includes presenting game content corresponding to the in-game event based at least on a selection of the interface element. For example, the graphical interface manager 130 and/or the application 106 may present, in the user interface, one or more game content items corresponding to the one or more in-game events based at least on one or more selections of the one or more interface elements in the in-game event list and the one or more associations between the one or more in-game events and the one or more game content items.
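By way of non-limiting illustration, ordering the in-game event list by the timing metadata at block B604 could be as simple as the following sketch, which reuses the hypothetical "time_s" field from the sketch above.

```python
# Sort interface entries for the in-game event list by event time; the
# "time_s" field name is an illustrative assumption.
def ordered_event_list(events):
    return sorted(events, key=lambda event: event["time_s"])
```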
Referring now to FIG. 7, FIG. 7 illustrates a flowchart of a method 700 for presenting a game summary including a map of a virtual environment associated with in-game events, according to some embodiments of the present disclosure. At block B702, the method 700 includes receiving metadata indicating timing information and locations associated with in-game events. For example, the graphical interface manager 130 and/or the application 106 may receive metadata indicating one or more times and one or more locations associated with in-game events of a game play session of a game. The in-game events may be determined based at least on analyzing game data associated with the game play session.
At block B704, the method 700 includes presenting, using metadata, interface elements corresponding to the in-game event, and a map of a virtual environment of the game. For example, the graphical interface manager 130 and/or the application 106 may use the metadata to present one or more interface elements corresponding to one or more in-game events of the game play session, and one or more maps of one or more virtual environments of the game.
At block B706, the method 700 includes annotating the map with the locations and the timing information based at least on a selection of an interface element. For example, the graphical interface manager 130 and/or the application 106 may annotate at least one of the one or more maps with one or more locations and timing information associated with the one or more in-game events based at least on one or more selections of the one or more interface elements corresponding to the one or more in-game events.
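By way of non-limiting illustration, assuming a two-dimensional top-down map image and the matplotlib library (neither of which is required by the present disclosure), block B706-style annotation might be sketched as follows; the event dictionary keys are assumptions.

```python
# Minimal sketch: annotate a top-down map image with event locations and
# times. matplotlib and the (x, y) pixel-coordinate convention are assumed.
import matplotlib.pyplot as plt

def annotate_map(map_image, events):
    """map_image: an image array (e.g., from plt.imread).
    events: iterable of dicts with 'x', 'y', 'time_s', and 'type' keys."""
    fig, ax = plt.subplots()
    ax.imshow(map_image)
    for event in events:
        ax.plot(event["x"], event["y"], marker="x", color="red")
        ax.annotate(f'{event["type"]} @ {event["time_s"]:.0f}s',
                    (event["x"], event["y"]),
                    textcoords="offset points", xytext=(5, 5))
    return fig
```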
Referring now to FIG. 8, FIG. 8 is an example system diagram of a game streaming system 800, in accordance with some embodiments of the present disclosure. FIG. 8 includes a game server 802 (which may include similar components, features, and/or functionality to the game server 126 of FIG. 1 and/or the computing device 900 of FIG. 9), a client device 804 (which may include similar components, features, and/or functionality to the client device 104 of FIG. 1 and/or the computing device 900 of FIG. 9), and a network 806 (which may be similar to the network 102 of FIG. 1). The game streaming system 800 may be implemented in some embodiments of the present disclosure. An application session may correspond to a game streaming application (e.g., NVIDIA GEFORCE NOW), a remote desktop application, a simulation application (e.g., autonomous or semi-autonomous vehicle simulation), a Computer Aided Design (CAD) application, a Virtual Reality (VR) and/or Augmented Reality (AR) streaming application, a deep learning application, and/or another application type.
In the game streaming system 800, for a game session, the client device 804 may receive only input data responsive to input from the input device, send the input data to the game server 802, receive encoded display data from the game server 802, and display the display data on the display 824. Thus, the more computationally intensive computations and processing are offloaded to the game server 802 (e.g., rendering of the game session is performed by the GPU of the game server 802). In other words, the game session is streamed from the game server 802 to the client device 804, thereby reducing the requirements of the client device 804 for graphics processing and rendering.
For example, with respect to an instantiation of a game session, the client device 804 may display a frame of the game session on the display 824 based on receiving the display data from the game server 802. The client device 804 may receive an input to one of the input devices and generate input data in response. The client device 804 may send the input data to the game server 802 via the communication interface 820 and over the network 806 (e.g., the Internet), and the game server 802 may receive the input data via the communication interface 818. The CPU may receive the input data, process the input data, and send data to the GPU that causes the GPU to generate a rendering of the game session. For example, the input data may represent a movement of a character of the user in a game, firing a weapon, reloading, passing a ball, turning a vehicle, etc. The rendering component 812 may render the game session (e.g., representative of the result of the input data), and the rendering capture component 814 may capture the rendering of the game session as display data (e.g., as image data capturing the rendered frame of the game session). The encoder 816 may then encode the display data to generate encoded display data, and the encoded display data may be transmitted to the client device 804 over the network 806 via the communication interface 818. The client device 804 may receive the encoded display data via the communication interface 820, and the decoder 822 may decode the encoded display data to generate the display data. The client device 804 may then display the display data via the display 824.
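By way of non-limiting illustration, the client-side loop just described might be sketched as follows; every helper shown is a hypothetical placeholder, and no particular transport or codec is implied.

```python
# Pseudocode-style sketch of the client loop of FIG. 8; read_input, send,
# receive, decode, and present are hypothetical placeholders supplied by
# the caller, not APIs defined by the present disclosure.
def client_session_loop(server_conn, input_device, decoder, display):
    while True:
        input_data = input_device.read_input()  # e.g., movement, firing a weapon
        server_conn.send(input_data)            # input data -> game server
        encoded_frame = server_conn.receive()   # encoded display data <- server
        frame = decoder.decode(encoded_frame)   # decode to display data
        display.present(frame)                  # show the rendered frame
```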
Fig. 9 is a block diagram of an example computing device 900 suitable for implementing some embodiments of the disclosure. Computing device 900 can include an interconnection system or bus 902 that directly or indirectly couples memory 904, one or more Central Processing Units (CPUs) 906, one or more Graphics Processing Units (GPUs) 908, a communication interface 910, input/output (I/O) ports 912, input/output components 914, a power supply 916, one or more presentation components 918 (e.g., a display), and one or more logic units 920.
In at least one embodiment, the computing device 900 may include one or more virtual machines (VMs), and/or any of its components may include virtual components (e.g., virtual hardware components). For non-limiting examples, one or more of the GPUs 908 may include one or more vGPUs, one or more of the CPUs 906 may include one or more vCPUs, and/or one or more of the logic units 920 may include one or more virtual logic units. As such, the computing device 900 may include discrete components (e.g., a full GPU dedicated to the computing device 900), virtual components (e.g., a portion of a GPU dedicated to the computing device 900), or a combination thereof.
The interconnect system 902 may represent one or more links or buses, such as an address bus, a data bus, a control bus, or a combination thereof. The interconnect system 902 may include one or more bus or link types, such as an Industry Standard Architecture (ISA) bus, an Extended ISA (EISA) bus, a Video Electronics Standards Association (VESA) bus, a Peripheral Component Interconnect (PCI) bus, a Peripheral Component Interconnect Express (PCIe) bus, and/or another type of bus or link. In some embodiments, there are direct connections between components. As an example, the CPU 906 may be directly connected to the memory 904. Further, the CPU 906 may be directly connected to the GPU 908. Where there is a direct connection or a point-to-point connection between components, the interconnect system 902 may include a PCIe link to carry out the connection. In these examples, a PCI bus need not be included in the computing device 900.
Although the various blocks of FIG. 9 are shown as connected via the interconnect system 902 with lines, this is not intended to be limiting and is for clarity only. For example, in some embodiments, a presentation component 918, such as a display device, may be considered an I/O component 914 (e.g., if the display is a touch screen). As another example, the CPU 906 and/or the GPU 908 may include memory (e.g., the memory 904 may be representative of a storage device in addition to the memory of the GPU 908, the CPU 906, and/or other components). In other words, the computing device of FIG. 9 is merely illustrative. Distinction is not made between such categories as "workstation," "server," "laptop," "desktop," "tablet," "client device," "mobile device," "handheld device," "game console," "electronic control unit (ECU)," "virtual reality system," and/or other device or system types, as all are contemplated within the scope of the computing device of FIG. 9.
Memory 904 may include any of a variety of computer-readable media. Computer readable media can be any available media that can be accessed by computing device 900. Computer readable media can include both volatile and nonvolatile media and removable and non-removable media. By way of example, and not limitation, computer readable media may comprise computer storage media and communication media.
Computer storage media may include volatile and nonvolatile media and/or removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules, and/or other data types. For example, the memory 904 may store computer-readable instructions (e.g., that represent programs and/or program elements, such as an operating system). Computer storage media may include, but is not limited to, RAM, ROM, EEPROM, flash memory or other storage technologies, CD-ROM, digital Versatile Disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by computing device 900. As used herein, a computer storage medium does not include a signal itself.
Communication media may embody computer readable instructions, data structures, program modules, and/or other data types in a modulated data signal such as a carrier wave or other transport mechanism, and includes any information delivery media. The term "modulated data signal" may refer to a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media may include wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared, and other wireless media. Combinations of any of the above should also be included within the scope of computer readable media.
The CPU 906 may be configured to execute computer-readable instructions to control one or more components of the computing device 900 to perform one or more of the methods and/or processes described herein. Each of the CPUs 906 may include one or more cores (e.g., one, two, four, eight, twenty-eight, seventy-two, etc.) capable of processing a large number of software threads simultaneously. The CPU 906 may include any type of processor and may include different types of processors depending on the type of computing device 900 implemented (e.g., processors with fewer cores for mobile devices and processors with more cores for servers). For example, depending on the type of computing device 900, the processor may be an ARM processor implemented using Reduced Instruction Set Computing (RISC) or an x86 processor implemented using Complex Instruction Set Computing (CISC). In addition to one or more microprocessors or supplemental coprocessors such as math coprocessors, computing device 900 may also include one or more CPUs 906.
The GPU 908 may also be configured to execute at least some computer readable instructions to control one or more components of the computing device 900 to perform one or more of the methods and/or processes described herein, in addition to or in lieu of the CPU 906. One or more of the GPUs 908 may be an integrated GPU (e.g., with one or more of the CPUs 906) and/or one or more of the GPUs 908 may be a discrete GPU. In embodiments, one or more of the GPUs 908 may be a coprocessor of one or more of the CPUs 906. The computing device 900 may use the GPU 908 to render graphics (e.g., 3D graphics) or perform general-purpose computations. For example, the GPU 908 may be used for general-purpose computing on GPUs (GPGPU). The GPU 908 may include hundreds or thousands of cores capable of processing hundreds or thousands of software threads simultaneously. The GPU 908 may generate pixel data for output images in response to rendering commands (e.g., rendering commands from the CPU 906 received via a host interface). The GPU 908 may include graphics memory, such as display memory, for storing pixel data or any other suitable data (e.g., GPGPU data). The display memory may be included as part of the memory 904. The GPU 908 may include two or more GPUs operating in parallel (e.g., via a link). The link may connect the GPUs directly (e.g., using NVLINK) or through a switch (e.g., using NVSwitch). When combined together, each GPU 908 may generate pixel data or GPGPU data for different portions of an output or for different outputs (e.g., a first GPU for a first image and a second GPU for a second image). Each GPU may include its own memory, or may share memory with other GPUs. In examples where the computing device 900 does not include the GPU 908, the CPU 906 may be used to render graphics.
The logic units 920 may be configured to execute at least some computer-readable instructions to control one or more components of the computing device 900 to perform one or more of the methods and/or processes described herein, in addition to or in lieu of the CPU 906 and/or the GPU 908. In embodiments, the CPU 906, the GPU 908, and/or the logic units 920 may discretely or jointly perform any combination of the methods, processes, and/or portions thereof. One or more of the logic units 920 may be part of and/or integrated in one or more of the CPUs 906 and/or the GPUs 908, and/or one or more of the logic units 920 may be discrete components of or otherwise external to the CPU 906 and/or the GPU 908. In embodiments, one or more of the logic units 920 may be a processor of one or more of the CPUs 906 and/or one or more of the GPUs 908.
Examples of the logic units 920 include one or more processing cores and/or components thereof, such as Data Processing Units (DPUs), Tensor Cores (TCs), Tensor Processing Units (TPUs), Pixel Visual Cores (PVCs), Vision Processing Units (VPUs), Graphics Processing Clusters (GPCs), Texture Processing Clusters (TPCs), Streaming Multiprocessors (SMs), Tree Traversal Units (TTUs), Artificial Intelligence Accelerators (AIAs), Deep Learning Accelerators (DLAs), Arithmetic Logic Units (ALUs), Application-Specific Integrated Circuits (ASICs), Floating Point Units (FPUs), input/output (I/O) elements, Peripheral Component Interconnect (PCI) or Peripheral Component Interconnect Express (PCIe) elements, and/or the like.
The communication interface 910 may include one or more receivers, transmitters, and/or transceivers that enable the computing device 900 to communicate with other computing devices via an electronic communication network, including wired and/or wireless communications. The communication interface 910 may include components and functionality to enable communication over any of a number of different networks, such as wireless networks (e.g., Wi-Fi, Z-Wave, Bluetooth LE, ZigBee, etc.), wired networks (e.g., communicating over Ethernet or InfiniBand), low-power wide-area networks (e.g., LoRaWAN, SigFox, etc.), and/or the Internet. In one or more embodiments, the logic units 920 and/or the communication interface 910 may include one or more Data Processing Units (DPUs) to transmit data received over a network and/or through the interconnect system 902 directly to (e.g., a memory of) one or more GPUs 908.
The I/O ports 912 can enable the computing device 900 to be logically coupled to other devices including the I/O components 914, the presentation components 918, and/or other components, some of which can be built into (e.g., integrated in) the computing device 900. Illustrative I/O components 914 include a microphone, mouse, keyboard, joystick, game pad, game controller, satellite dish, scanner, printer, wireless device, or the like. The I/O components 914 can provide a natural user interface (NUI) that processes user-generated air gestures, voice, or other physiological inputs. In some examples, the inputs may be transmitted to an appropriate network element for further processing. An NUI may enable any combination of speech recognition, handwriting recognition, facial recognition, biometric recognition, on-screen and near-screen gesture recognition, air gestures, head and eye tracking, and touch recognition associated with a display of the computing device 900 (as described in more detail below). The computing device 900 may include depth cameras, such as stereoscopic camera systems, infrared camera systems, RGB camera systems, touch screen technology, and combinations of these, for gesture detection and recognition. Additionally, the computing device 900 may include an accelerometer or gyroscope (e.g., as part of an inertial measurement unit (IMU)) that enables motion detection. In some examples, the output of the accelerometer or gyroscope may be used by the computing device 900 to render immersive augmented reality or virtual reality.
The power source 916 may include a hard-wired power source, a battery power source, or a combination thereof. The power source 916 may provide power to the computing device 900 to enable components of the computing device 900 to operate.
The presentation components 918 can include a display (e.g., a monitor, touch screen, television screen, heads-up display (HUD), other display types, or a combination thereof), speakers, and/or other presentation components. The presentation components 918 can receive data from other components (e.g., the GPU 908, the CPU 906, DPUs, etc.) and output the data (e.g., as an image, video, sound, etc.).
FIG. 10 illustrates an example data center 1000 that can be used in at least one embodiment of the present disclosure. The data center 1000 may include a data center infrastructure layer 1010, a framework layer 1020, a software layer 1030, and an application layer 1040.
As shown in FIG. 10, the data center infrastructure layer 1010 may include a resource coordinator 1012, grouped computing resources 1014, and node computing resources ("node C.R.s") 1016(1)-1016(N), where "N" represents any whole, positive integer. In at least one embodiment, node C.R.s 1016(1)-1016(N) may include, but are not limited to, any number of central processing units (CPUs) or other processors (including DPUs, accelerators, field programmable gate arrays (FPGAs), graphics processors or graphics processing units (GPUs), etc.), memory devices (e.g., dynamic read-only memory), storage devices (e.g., solid state drives or disk drives), network input/output (NW I/O) devices, network switches, virtual machines (VMs), power modules, cooling modules, and the like. In some embodiments, one or more of the node C.R.s 1016(1)-1016(N) may correspond to a server having one or more of the above-mentioned computing resources. In addition, in some embodiments, the node C.R.s 1016(1)-1016(N) may include one or more virtual components, such as vGPUs, vCPUs, and/or the like, and/or one or more of the node C.R.s 1016(1)-1016(N) may correspond to a virtual machine (VM).
In at least one embodiment, the grouped computing resources 1014 may include separate groupings of node C.R.s 1016 housed within one or more racks (not shown), or many racks housed in data centers at various geographical locations (also not shown). Separate groupings of node C.R.s 1016 within the grouped computing resources 1014 may include grouped compute, network, memory, or storage resources that may be configured or allocated to support one or more workloads. In at least one embodiment, several node C.R.s 1016 including CPUs, GPUs, DPUs, and/or other processors may be grouped within one or more racks to provide compute resources to support one or more workloads. The one or more racks may also include any number of power modules, cooling modules, and/or network switches, in any combination.
The resource coordinator 1012 may configure or otherwise control one or more nodes c.r.1016 (1) -1016 (N) and/or grouped computing resources 1014. In at least one embodiment, the resource coordinator 1012 may include a Software Design Infrastructure (SDI) management entity for the data center 1000. The resource coordinator 1012 may include hardware, software, or some combination thereof.
In at least one embodiment, as shown in FIG. 10, the framework layer 1020 may include a job scheduler 1032, a configuration manager 1034, a resource manager 1036, and a distributed file system 1038. The framework layer 1020 may include a framework to support the software 1050 of the software layer 1030 and/or one or more applications 1042 of the application layer 1040. The software 1050 or the applications 1042 may respectively include web-based service software or applications, such as those provided by Amazon Web Services, Google Cloud, and Microsoft Azure. The framework layer 1020 may be, but is not limited to, a type of free and open-source software web application framework such as APACHE SPARK™ (hereinafter "Spark") that may utilize the distributed file system 1038 for large-scale data processing (e.g., "big data"). In at least one embodiment, the job scheduler 1032 may include a Spark driver to facilitate scheduling of workloads supported by the various layers of the data center 1000. In at least one embodiment, the configuration manager 1034 may be capable of configuring different layers, such as the software layer 1030 and the framework layer 1020 including Spark and the distributed file system 1038 for supporting large-scale data processing. The resource manager 1036 may be capable of managing clustered or grouped computing resources mapped to or allocated for support of the distributed file system 1038 and the job scheduler 1032. In at least one embodiment, the clustered or grouped computing resources may include the grouped computing resources 1014 at the data center infrastructure layer 1010. The resource manager 1036 may coordinate with the resource coordinator 1012 to manage these mapped or allocated computing resources.
In at least one embodiment, the software 1050 included in the software layer 1030 may include software used by at least portions of the node C.R.s 1016(1)-1016(N), the grouped computing resources 1014, and/or the distributed file system 1038 of the framework layer 1020. One or more types of software may include, but are not limited to, Internet web page search software, e-mail virus scanning software, database software, and streaming video content software.
In at least one embodiment, the one or more applications 1042 included in the application layer 1040 may include one or more types of applications used by at least portions of the node C.R.s 1016(1)-1016(N), the grouped computing resources 1014, and/or the distributed file system 1038 of the framework layer 1020. The one or more types of applications may include, but are not limited to, any number of genomics applications, cognitive computing applications, and machine learning applications, including training or inferencing software, machine learning framework software (e.g., PyTorch, TensorFlow, Caffe, etc.), and/or other machine learning applications used in conjunction with one or more embodiments.
In at least one embodiment, any of the configuration manager 1034, the resource manager 1036, and the resource coordinator 1012 may implement any number and type of self-modifying actions based on any amount and type of data acquired in any technically feasible fashion. Self-modifying actions may relieve a data center operator of the data center 1000 from making possibly bad configuration decisions and may help avoid underutilized and/or poorly performing portions of the data center.
The data center 1000 may include tools, services, software, or other resources for training one or more machine learning models or predicting or reasoning about information using one or more machine learning models in accordance with one or more embodiments described herein. For example, the machine learning model may be trained by computing weight parameters from the neural network architecture using the software and computing resources described above with respect to the data center 1000. In at least one embodiment, by using the weight parameters calculated by one or more training techniques, information, such as, but not limited to, those described herein, can be inferred or predicted using the resources described above and with respect to the data center 1000 using a trained machine learning model corresponding to one or more neural networks.
In at least one embodiment, the data center 1000 may use CPUs, application-specific integrated circuits (ASICs), GPUs, FPGAs, and/or other hardware (or virtual compute resources corresponding thereto) to perform training and/or inferencing using the above-described resources. Moreover, one or more of the software and/or hardware resources described above may be configured as a service to allow users to train or perform inferencing of information, such as image recognition, speech recognition, or other artificial intelligence services.
Example network Environment
A network environment suitable for use in implementing embodiments of the disclosure may include one or more client devices, servers, network attached storage (NAS), other backend devices, and/or other device types. The client devices, servers, and/or other device types (e.g., each device) may be implemented on one or more instances of the computing device 900 of FIG. 9, e.g., each device may include similar components, features, and/or functionality of the computing device 900. In addition, where backend devices (e.g., servers, NAS, etc.) are implemented, the backend devices may be included as part of a data center 1000, an example of which is described in more detail herein with respect to FIG. 10.
The components of the network environment may communicate with each other over a network, which may be wired, wireless, or both. The network may include a plurality of networks, or a network of a plurality of networks. For example, the network may include one or more Wide Area Networks (WANs), one or more Local Area Networks (LANs), one or more public networks (e.g., the internet and/or Public Switched Telephone Network (PSTN)), and/or one or more private networks. Where the network comprises a wireless telecommunications network, components such as base stations, communication towers, or even access points (among other components) may provide wireless connectivity.
Compatible network environments may include one or more peer-to-peer network environments (in which case the server may not be included in the network environment) and one or more client-server network environments (in which case the one or more servers may be included in the network environment). In a peer-to-peer network environment, the functionality described herein with respect to a server may be implemented on any number of client devices.
In at least one embodiment, the network environment may include one or more cloud-based network environments, distributed computing environments, combinations thereof, and the like. The cloud-based network environment may include a framework layer, a job scheduler, a resource manager, and a distributed file system implemented on one or more servers, which may include one or more core network servers and/or edge servers. The framework layer may include a framework for supporting one or more applications of the software and/or application layers of the software layer. The software or application may include web-based service software or application, respectively. In embodiments, one or more client devices may use network-based service software or applications (e.g., by accessing the service software and/or applications via one or more Application Programming Interfaces (APIs)). The framework layer may be, but is not limited to, a type of free and open source software web application framework, such as may be used for large scale data processing (e.g., "big data") using a distributed file system.
The cloud-based network environment may provide cloud computing and/or cloud storage that performs any combination of the computing and/or data storage functions described herein (or one or more portions thereof). Any of these various functions may be distributed across multiple locations from a central or core server (e.g., of one or more data centers that may be distributed across states, regions, countries, the world, etc.). If the connection to the user (e.g., client device) is relatively close to the edge server, the core server may assign at least a portion of the functionality to the edge server. The cloud-based network environment may be private (e.g., limited to only a single organization), public (e.g., available to many organizations), and/or a combination thereof (e.g., a hybrid cloud environment).
The client device may include at least some of the components, features, and functionality of the example computing device 900 described herein with respect to fig. 9. By way of example, and not limitation, a client device may be embodied as a Personal Computer (PC), laptop computer, mobile device, smart phone, tablet computer, smart watch, wearable computer, personal Digital Assistant (PDA), MP3 player, virtual reality head mounted display, global Positioning System (GPS) or device, video player, camera, surveillance device or system, vehicle, watercraft, aircraft, virtual machine, drone, robot, handheld communication device, hospital device, gaming device or system, entertainment system, in-vehicle computer system, embedded system controller, remote control, appliance, consumer electronics device, workstation, edge device, any combination of these described devices, or any other suitable device.
The disclosure may be described in the general context of computer code or machine-useable instructions, including computer-executable instructions such as program modules, being executed by a computer or other machine, such as a personal digital assistant or other handheld device. Generally, program modules, including routines, programs, objects, components, data structures, and the like, refer to code that performs particular tasks or implements particular abstract data types. The disclosure may be practiced in a variety of system configurations, including handheld devices, consumer electronics, general-purpose computers, more specialty computing devices, and the like. The disclosure may also be practiced in distributed computing environments where tasks are performed by remote-processing devices that are linked through a communications network.
As used herein, the recitation of "and/or" with respect to two or more elements should be interpreted to refer to only one element or combination of elements. For example, "element a, element B, and/or element C" may include element a alone, element B alone, element C alone, element a and element B, element a and element C, element B and element C, or elements A, B and C. Further, "at least one of element a or element B" may include at least one of element a, at least one of element B, or at least one of element a and at least one of element B. Further, "at least one of element a and element B" may include at least one of element a, at least one of element B, or at least one of element a and at least one of element B.
The subject matter of the present disclosure is described with specificity herein to meet statutory requirements. However, the description itself is not intended to limit the scope of this disclosure. Rather, the inventors have contemplated that the claimed subject matter might also be embodied in other ways, to include different steps or combinations of similar steps than the ones described in conjunction with other present or future technologies. Moreover, although the terms "step" and/or "block" may be used herein to connote different elements of methods employed, the terms should not be interpreted as implying any particular order among or between various steps herein disclosed unless and except when the order of individual steps is explicitly described.

Claims (20)

1. A method of generating a game summary using metadata, comprising:
receiving metadata indicative of timing information of in-game events within a game play session, a first association between the in-game events and video clips capturing the in-game events, and a second association between the in-game events and one round of multiple rounds of the game play session, wherein the in-game events are determined based at least on analyzing video data representing the game play session;
presenting, in a graphical user interface and using the metadata, interface elements corresponding to the in-game events, and a timeline indicating the timing information in association with the in-game events;
displaying, in the graphical user interface, image data of a video clip of the video clips capturing an in-game event of the in-game events based at least on a selection of an interface element of the interface elements corresponding to the in-game event and the first association; and
updating, based at least on the selection, the timeline of the in-game events displayed in the graphical user interface to correspond to the round.
2. The method of claim 1, wherein the selection comprises a selection of at least an image corresponding to the video clip and the in-game event.
3. The method of claim 1, wherein the selection is from an in-game event list that uses the metadata to display a plurality of the in-game events in an order corresponding to the times of the in-game events.
4. The method of claim 1, wherein the selecting further results in updating a round display area of the graphical user interface.
5. The method of claim 1, wherein the updating includes visually emphasizing an indicator of the in-game event in the timeline.
6. The method of claim 1, wherein the metadata further associates the in-game event with an event type, the selection is from a list of in-game events, and the metadata is for displaying the event type with the corresponding in-game event in the list of in-game events.
7. The method of claim 1, wherein the updating further comprises annotating a map of a virtual environment in which the in-game event occurred with the metadata based at least on one or more instances of the timing information associated with the in-game event.
8. A processor, comprising:
one or more circuits for accessing metadata indicative of timing information corresponding to in-game events within a game play session, the metadata including an association between the in-game events and one or more images capturing the in-game events, and associating the in-game events with one round of a plurality of rounds of the game play session;
presenting, using the metadata, interface elements corresponding to the in-game events in a list of in-game events in a user interface at locations corresponding to the timing information within the game play session;
presenting, in the user interface, the one or more images corresponding to an in-game event of the in-game events based at least on a selection of an interface element of the interface elements from the in-game event list and the association between the in-game events and the one or more images; and
updating, based at least on the selection, a timeline of the in-game events displayed in the user interface to correspond to the round.
9. The processor of claim 8, wherein the one or more images comprise a video clip of the game play session.
10. The processor of claim 8, wherein the one or more circuits are further for visually emphasizing the interface element of the in-game event in the in-game event list based at least on the selection.
11. The processor of claim 8, wherein the metadata further associates the in-game event with an event type, the metadata for displaying the interface element of the in-game event with an indicator of the event type.
12. The processor of claim 8, wherein the selection further results in an update to a round display area of the user interface.
13. The processor of claim 8, wherein the one or more circuits are further for updating game state information displayed in the user interface to correspond to the in-game event using the metadata based at least on the selection.
14. The processor of claim 8, wherein the one or more circuits are further for annotating a map of a virtual environment in which the in-game event occurs with the metadata based at least on the timing information corresponding to the in-game event.
15. A system for generating a game summary using metadata, comprising:
one or more processing units;
one or more memory devices storing instructions that, when executed using the one or more processing units, cause the one or more processing units to perform a method comprising:
receiving metadata indicative of timing information and locations associated with in-game events of a game play session of a game, wherein the in-game events are determined based at least on analyzing game data associated with the game play session;
presenting, using the metadata, interface elements corresponding to the in-game events of the game play session with a map of a virtual environment of the game; and
annotating the map using the locations and the timing information associated with the in-game events based at least on a selection of one or more of the interface elements corresponding to the in-game events.
16. The system of claim 15, wherein the annotation comprises a path of an in-game object in the game being presented on the map, wherein endpoints of the path are based at least on the timing information.
17. The system of claim 15, wherein the annotation comprises a location of an in-game object of the game presented on the map based at least on the location and the timing information.
18. The system of claim 15, wherein the annotation comprises a location of one or more in-game events of the game presented on the map based at least on the location and the timing information.
19. The system of claim 15, wherein the annotating includes presenting symbols of the in-game events based at least on event types specified for the in-game events in the metadata.
20. The system of claim 15, wherein the interface element is displayed in a timeline of a user interface at a location corresponding to timing information corresponding to the in-game event in the game play session.