WO2018161790A1 - Video transmission method and device - Google Patents
Video transmission method and device
- Publication number
- WO2018161790A1 (PCT/CN2018/076526; CN2018076526W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- fragment
- mainstream
- request
- terminal
- auxiliary stream
- Prior art date
Images
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/80—Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
- H04N21/83—Generation or processing of protective or descriptive data associated with content; Content structuring
- H04N21/845—Structuring of content, e.g. decomposing content into time segments
- H04N21/8456—Structuring of content, e.g. decomposing content into time segments by decomposing the content in the time domain, e.g. in time segments
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/23—Processing of content or additional data; Elementary server operations; Server middleware
- H04N21/234—Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs
- H04N21/2343—Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/23—Processing of content or additional data; Elementary server operations; Server middleware
- H04N21/234—Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs
- H04N21/2343—Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements
- H04N21/234363—Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements by altering the spatial resolution, e.g. for clients with a lower screen resolution
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/44—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
- H04N21/4402—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/44—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
- H04N21/4402—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display
- H04N21/440263—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display by altering the spatial resolution, e.g. for displaying on a connected PDA
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/80—Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
- H04N21/83—Generation or processing of protective or descriptive data associated with content; Content structuring
- H04N21/845—Structuring of content, e.g. decomposing content into time segments
Definitions
- the present application relates to the field of transmission technologies, and in particular, to a video transmission method and apparatus.
- HLS: HTTP live streaming
- HTTP: hypertext transfer protocol
- the server encodes the video content to obtain a plurality of video files of different code rates, and then divides each of the plurality of video files into a plurality of slices according to time.
- when the terminal needs to play the video, it requests the required fragments according to the playing time and the current network conditions, and buffers, decodes, and plays the requested fragments.
- the embodiment of the present application provides a video transmission method and apparatus, which are used to save transmission resources and cache resources.
- a video transmission method where the video includes a primary encoded stream and a secondary encoded stream, where the secondary encoded stream is used to increase the code rate of the primary encoded stream, the primary encoded stream includes at least two mainstream fragments, and the secondary encoded stream includes at least two auxiliary stream fragments, where each mainstream fragment corresponds to one auxiliary stream fragment.
- the method includes: the terminal sends a first mainstream fragment request to the server, where the first mainstream fragment request is used to request the first mainstream fragment, and the first mainstream fragment is one mainstream fragment of the at least two mainstream fragments.
- the terminal requests the mainstream fragments, and requests the auxiliary stream fragments as needed, where the auxiliary stream fragments are used to improve the code rate of the mainstream fragments.
- the primary encoded stream can be played separately, or the primary encoded stream and the secondary encoded stream can be superimposed and reconstructed, so buffering the secondary encoded stream does not render the buffered primary encoded stream useless. Compared with the prior art, this makes it possible to save transmission resources and cache resources.
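As a rough illustration (not part of the application itself), the terminal-side request decision described above can be sketched as follows; the bandwidth threshold and the `("main", i)` / `("aux", i)` request labels are hypothetical placeholders for the preset condition and the fragment requests:

```python
# Illustrative sketch of the terminal-side request flow: always request the
# mainstream fragment; request the corresponding auxiliary stream fragment
# only when the current network condition meets the preset condition.
BANDWIDTH_THRESHOLD_KBPS = 5000  # assumed preset condition, purely illustrative

def plan_requests(fragment_index, measured_bandwidth_kbps):
    """Return the fragment requests the terminal would issue for one time slot."""
    requests = [("main", fragment_index)]  # mainstream fragment is always requested
    if measured_bandwidth_kbps >= BANDWIDTH_THRESHOLD_KBPS:
        # Preset condition met: also request the auxiliary stream fragment
        # that corresponds to this mainstream fragment.
        requests.append(("aux", fragment_index))
    return requests
```

In this sketch the auxiliary request is strictly additive, mirroring the point above: skipping it never invalidates the already-buffered mainstream fragment.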
- the terminal and the server communicate through the HTTP/2 protocol.
- the server can be any server that supports the HTTP/2 protocol, and can be, for example, a WEB server.
- the method may further include: the terminal superimposing and reconstructing the first mainstream fragment and the first auxiliary stream fragment, and playing the reconstructed code stream. In this way, the quality of video playback can be improved.
- the terminal receiving the first mainstream fragment sent by the server may include: receiving, over a link, the first mainstream fragment sent by the server, where the link may be any one of the links between the terminal and the server.
- the receiving of the first auxiliary stream fragment sent by the server includes: receiving, over the link, the first auxiliary stream fragment sent by the server. In this way, both the terminal and the server need only maintain one link, so the implementation is simple.
- the method may further include: the terminal sending a second mainstream fragment request to the server, where the second mainstream fragment request is used to request the second mainstream fragment, and the second mainstream fragment is a mainstream fragment other than the first mainstream fragment among the at least two mainstream fragments; receiving the second mainstream fragment sent by the server; if the current real-time network condition meets the preset condition, sending a second auxiliary stream fragment request to the server, where the second auxiliary stream fragment request is used to request the second auxiliary stream fragment corresponding to the second mainstream fragment; and if, after sending the second auxiliary stream fragment request to the server and before receiving the second auxiliary stream fragment, it is detected that the current real-time network condition no longer meets the preset condition, sending an interrupt request to the server, where the interrupt request is used to request interruption of the transmission process of the second auxiliary stream fragment.
- in this way, the time for the terminal to obtain the mainstream fragment following the second mainstream fragment can be reduced to a certain extent, thereby effectively reducing stalling while the terminal plays the video. It can be understood that this possible implementation can be performed independently of any of the technical solutions provided above, in which case the second mainstream fragment is arbitrary.
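The interrupt decision in the two bullets above can be condensed into a small sketch (illustrative only; the state flags and action names are hypothetical, and the actual interrupt would be an HTTP/2 frame on the auxiliary fragment's stream):

```python
# Sketch of the interrupt behaviour: after requesting an auxiliary stream
# fragment, the terminal keeps checking the network; if the preset condition
# stops holding before the fragment finishes arriving, it interrupts the
# transfer so bandwidth goes to the next mainstream fragment instead.
def next_action(aux_requested, aux_received, condition_still_met):
    if aux_requested and not aux_received and not condition_still_met:
        return "send_interrupt"   # e.g. cancel the auxiliary fragment's stream
    return "keep_receiving"
```

Once the auxiliary fragment has fully arrived there is nothing left to interrupt, so the function only fires while the transfer is in flight.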
- a second aspect provides a video transmission method, where the video includes a primary encoded stream and a secondary encoded stream, the primary encoded stream includes at least two mainstream fragments, the secondary encoded stream includes at least two auxiliary stream fragments, and each mainstream fragment corresponds to one auxiliary stream fragment.
- the method may include: the server receiving the first mainstream fragment request sent by the terminal, where the first mainstream fragment request is used to request the first mainstream fragment, and the first mainstream fragment is one of the at least two mainstream fragments; sending the first mainstream fragment to the terminal; and receiving the first auxiliary stream fragment request sent by the terminal, where the first auxiliary stream fragment request is used to request the first auxiliary stream fragment corresponding to the first mainstream fragment.
- the first auxiliary stream fragmentation request is sent by the terminal if the current network condition meets the preset condition; and the first auxiliary stream fragment is sent to the terminal.
- sending the first mainstream fragment to the terminal may include: sending the first mainstream fragment to the terminal on a link.
- sending the first auxiliary stream fragment to the terminal may include: sending the first auxiliary stream fragment to the terminal on the link.
- the priority of the mainstream fragment is higher than the priority of the auxiliary stream fragment corresponding to the mainstream fragment.
- sending the first auxiliary stream fragment to the terminal may include: if it is determined that the first mainstream fragment is sent to the terminal, sending the first auxiliary stream fragment to the terminal. In this way, the mainstream fragments can be preferentially transmitted, thereby ensuring the continuity of playback of the mainstream fragments.
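The priority rule above (mainstream before its corresponding auxiliary fragment) can be sketched as a simple scheduling order; the `(kind, index)` tuples are an illustrative representation, not a format defined by the application:

```python
# Sketch: order pending fragment transmissions so that a mainstream fragment
# is always sent before the auxiliary stream fragment of the same index,
# ensuring continuity of mainstream playback.
def schedule(pending):
    """pending: list of (kind, index) tuples, where kind is 'main' or 'aux'."""
    # Sort by fragment index first, then mainstream (0) before auxiliary (1).
    return sorted(pending, key=lambda f: (f[1], 0 if f[0] == "main" else 1))
```

With this ordering, an auxiliary fragment is only dispatched once the server has dealt with the mainstream fragment it enhances, matching "if it is determined that the first mainstream fragment is sent".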
- the method may further include: the server receiving a second mainstream fragment request sent by the terminal, where the second mainstream fragment request is used to request the second mainstream fragment, and the second mainstream fragment is a mainstream fragment other than the first mainstream fragment among the at least two mainstream fragments; sending the second mainstream fragment to the terminal; receiving a second auxiliary stream fragment request sent by the terminal, where the second auxiliary stream fragment request is used to request the second auxiliary stream fragment corresponding to the second mainstream fragment; and if an interrupt request sent by the terminal is received before the second auxiliary stream fragment is completely sent to the terminal, interrupting the transmission process of the second auxiliary stream fragment. It can be understood that this possible implementation may be performed independently of any of the technical solutions provided above, in which case the second mainstream fragment is arbitrary.
- a third aspect provides a terminal, where the video includes a primary encoded stream and a secondary encoded stream, the secondary encoded stream is used to increase the code rate of the primary encoded stream, the primary encoded stream includes at least two mainstream fragments, and the secondary encoded stream includes at least two auxiliary stream fragments, where each mainstream fragment corresponds to one auxiliary stream fragment.
- the terminal may include: a transmitting unit and a receiving unit.
- the sending unit is configured to send a first mainstream fragment request to the server, where the first mainstream fragment request is used to request the first mainstream fragment, and the first mainstream fragment is one of the at least two mainstream fragments. Fragmentation.
- the receiving unit is configured to receive the first mainstream fragment sent by the server.
- the sending unit is further configured to: if the current real-time network condition meets the preset condition, send a first auxiliary stream fragment request to the server, where the first auxiliary stream fragment request is used to request the first auxiliary corresponding to the first mainstream fragment Streaming slices.
- the receiving unit is further configured to receive the first auxiliary stream fragment sent by the server.
- the terminal may further include: a reconstruction unit and a playback unit.
- the reconstruction unit is configured to superimpose and reconstruct the first mainstream segment and the first auxiliary stream segment.
- the playing unit is configured to play the code stream generated after the reconstruction.
- the receiving unit is configured to: receive, over a link, the first mainstream fragment sent by the server; and receive, over the link, the first auxiliary stream fragment sent by the server.
- the sending unit may be further configured to send a second mainstream fragment request to the server, where the second mainstream fragment request is used to request the second mainstream fragment, and the second mainstream fragment is a mainstream fragment other than the first mainstream fragment among the at least two mainstream fragments.
- the receiving unit is further configured to receive the second mainstream fragment sent by the server.
- the sending unit may be further configured to: if the current real-time network condition meets the preset condition, send a second auxiliary stream fragmentation request to the server, where the second auxiliary stream fragmentation request is used to request the second corresponding to the second mainstream fragment Auxiliary flow segmentation.
- the sending unit may be further configured to: if, after sending the second auxiliary stream fragment request to the server and before receiving the second auxiliary stream fragment, it is detected that the current real-time network condition does not meet the preset condition, send an interrupt request to the server, where the interrupt request is used to request interruption of the transmission process of the second auxiliary stream fragment.
- a fourth aspect provides a server, where the video includes a primary encoded stream and a secondary encoded stream, the primary encoded stream includes at least two mainstream fragments, and the secondary encoded stream includes at least two auxiliary stream fragments, where each mainstream fragment corresponds to one auxiliary stream fragment.
- the server may include: a receiving unit and a transmitting unit.
- the receiving unit is configured to receive a first mainstream fragment request sent by the terminal, where the first mainstream fragment request is used to request the first mainstream fragment, and the first mainstream fragment is one of the at least two mainstream fragments.
- a sending unit configured to send the first mainstream fragment to the terminal.
- the receiving unit is further configured to receive a first auxiliary stream fragment request sent by the terminal, where the first auxiliary stream fragment request is used to request the first auxiliary stream fragment corresponding to the first mainstream fragment, and the first auxiliary stream fragment request is sent by the terminal if the current network condition meets the preset condition.
- the sending unit is further configured to send the first auxiliary stream fragment to the terminal.
- the sending unit is specifically configured to: send a first mainstream fragment to the terminal on a link; and send the first auxiliary stream fragment to the terminal on the link.
- the priority of the mainstream fragment is higher than the priority of the auxiliary stream fragment corresponding to the mainstream fragment.
- the sending unit is specifically configured to: if it is determined that the first mainstream fragment has been sent to the terminal, send the first auxiliary stream fragment to the terminal.
- the receiving unit is further configured to: receive a second mainstream fragment request sent by the terminal, where the second mainstream fragment request is used to request the second mainstream fragment, and the second mainstream fragment is a mainstream fragment other than the first mainstream fragment among the at least two mainstream fragments.
- the sending unit is further configured to send the second mainstream fragment to the terminal.
- the receiving unit is further configured to: receive a second auxiliary stream fragmentation request sent by the terminal, where the second auxiliary stream fragmentation request is used to request the second auxiliary stream fragment corresponding to the second mainstream fragment.
- the receiving unit is further configured to receive an interrupt request sent by the terminal.
- the server may further include: an interrupting unit, configured to interrupt the transmission process of the second auxiliary stream fragment if the receiving unit receives the interrupt request sent by the terminal before the sending unit finishes sending the second auxiliary stream fragment to the terminal.
- the interrupt request in the above first to fourth aspects may include: an RST_STREAM frame in HTTP/2.
- the interrupt request may be a newly designed frame, or an existing frame provided in the prior art may be reused; this is not limited in this embodiment of the present application.
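For concreteness, an HTTP/2 RST_STREAM frame as standardized in RFC 7540 §6.4 has a 9-byte frame header (24-bit length, type 0x03, flags, 31-bit stream identifier) followed by a 32-bit error code; error code 0x8 is CANCEL. A minimal sketch of building the raw bytes (illustrative, not the application's own implementation):

```python
import struct

def rst_stream_frame(stream_id, error_code=0x8):  # 0x8 = CANCEL in HTTP/2
    """Build the raw bytes of an HTTP/2 RST_STREAM frame (RFC 7540, section 6.4)."""
    length, frame_type, flags = 4, 0x03, 0       # payload is the 4-byte error code
    header = struct.pack(">I", length)[1:]        # 24-bit frame length
    header += bytes([frame_type, flags])          # type and flags octets
    header += struct.pack(">I", stream_id & 0x7FFFFFFF)  # reserved bit cleared
    return header + struct.pack(">I", error_code)        # 32-bit error code
```

Sending such a frame on the stream that carries the second auxiliary stream fragment cancels that transfer without tearing down the whole connection, which is what makes RST_STREAM a natural fit for the interrupt request.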
- a terminal in a fifth aspect, has a function of implementing terminal behavior in the foregoing method embodiment.
- This function can be implemented in hardware or in hardware by executing the corresponding software.
- the hardware or software includes one or more modules corresponding to the functions described above.
- the terminal may include: a processor, a memory, a communication bus, and a communication interface, where the memory is configured to store computer-executable instructions, and the processor and the memory are connected through the communication bus; when the terminal runs, the processor executes the computer-executable instructions stored in the memory, causing the terminal to perform the video transmission method provided by the first aspect or any possible implementation of the first aspect.
- a computer-readable storage medium is provided for storing computer software instructions for the above terminal; when the instructions are run on a computer, they cause the computer to perform the video transmission method of any of the above first aspects.
- a computer program product comprising instructions is provided which, when run on a computer, causes the computer to perform the video transmission method of any of the above first aspects.
- a server having a function of implementing server behavior in the above method embodiment.
- This function can be implemented in hardware or in hardware by executing the corresponding software.
- the hardware or software includes one or more modules corresponding to the functions described above.
- the server may include: a processor, a memory, a communication bus, and a communication interface.
- the memory is used to store computer-executable instructions, and the processor and the memory are connected through the communication bus; when the server runs, the processor executes the computer-executable instructions stored in the memory, so that the server performs the video transmission method provided by the second aspect or any possible implementation of the second aspect.
- a computer readable storage medium for storing computer software instructions for use by the server, when executed on a computer, causes the computer to perform the video transmission method of any of the above second aspects.
- a computer program product comprising instructions which, when run on a computer, cause the computer to perform the video transmission method of any of the above second aspects.
- FIG. 1 is a schematic diagram of a system architecture applicable to a technical solution provided by an embodiment of the present disclosure
- FIG. 2 is a schematic structural diagram of an encoding server according to an embodiment of the present application.
- FIG. 3 is a schematic structural diagram of a transmission server according to an embodiment of the present disclosure.
- FIG. 4 is a schematic structural diagram of a terminal according to an embodiment of the present application.
- FIG. 5 is a schematic diagram of interaction of a video transmission method according to an embodiment of the present application.
- FIG. 6 is a schematic diagram of interaction of another video transmission method according to an embodiment of the present disclosure.
- FIG. 7 is a schematic flowchart of a video transmission method according to an embodiment of the present application.
- FIG. 8 is a schematic flowchart of playing a video according to an embodiment of the present application.
- FIG. 9 is a schematic diagram of interaction of another video transmission method according to an embodiment of the present disclosure.
- FIG. 10 is a schematic structural diagram of another terminal according to an embodiment of the present disclosure.
- FIG. 11 is a schematic structural diagram of another server according to an embodiment of the present application.
- the cached but unplayed low-bit-rate fragments become useless, resulting in a waste of transmission resources and cache resources.
- for example, the terminal has requested and cached low-code-rate fragments 1 to 10 and is currently playing fragment 5; if the terminal detects that the current network condition is good while playing fragment 5, it can request fragment 6 at a high code rate from the server, so that the cached but unplayed low-code-rate fragment 6 becomes useless, thereby causing a waste of transmission resources and cache resources.
- the embodiment of the present application provides a video transmission method and device, the basic principle being: the terminal requests the mainstream fragments, and requests the auxiliary stream fragments as needed, where the auxiliary stream fragments are used to improve the code rate of the mainstream fragments. In the prior art, fragments of the same content at different code rates exist independently, are transmitted independently, and are played selectively.
- the role of the secondary encoded stream is to improve the code rate of the primary encoded stream; that is, the primary encoded stream can be played separately, or the primary encoded stream and the secondary encoded stream can be superimposed and reconstructed, so the cached primary encoded stream does not lose its usefulness when the secondary encoded stream is cached. Compared with the prior art, this makes it possible to save transmission resources and cache resources.
- the primary encoded stream and the secondary encoded stream involved in the embodiment of the present application may be code streams obtained by the server by encoding according to a reconstructive video coding (RVC) technology or the like.
- RVC: reconstructive video coding
- the encoder can generate two streams at the same time, one of which is the primary encoded stream and the other is the secondary encoded stream.
- the main encoded stream is a video stream with a lower resolution, and the player can directly decode and play.
- the role of the secondary encoded stream is to assist the player in improving the resolution of the primary encoded stream.
- for example, the resolution of the main encoded stream is 720p; after the secondary encoded stream is superimposed, the resolution can be increased to 1080p, thereby improving the video playback quality.
- the embodiment of the present application is not limited to the RVC coding technology; other coding technologies may also be used.
- the technical solution provided by the embodiment of the present application can be applied to the system architecture shown in FIG. 1; the system shown in FIG. 1 includes: the encoding server 11, the transmission server 12, and the terminal 13.
- the encoding server 11 is for encoding the video content according to a certain encoding technique and transmitting the encoded result to the transmission server 12.
- the transmission server 12 is configured to save the encoded result, and when receiving the fragmentation request sent by the terminal 13, transmits the encoded fragment to the terminal 13.
- the terminal 13 is configured to send a fragment request to the transmission server 12 according to a user indication or a self-trigger, and to decode and play the encoded fragments received from the transmission server 12.
- FIG. 1 is only an example diagram; the number of encoding servers 11, transmission servers 12, and terminals 13 is not limited in the embodiment of the present application. In practical applications, the network can be deployed with different numbers of these devices than shown in FIG. 1 as needed.
- one or more links may be established between the transport server 12 and the terminal 13, and the fragmentation request and the fragment are transmitted on the links.
- the link here may be an HTTP/2 link, in which case the transport server 12 may be a server supporting HTTP/2.
- the specific implementation is not limited to this.
- the encoding server 11 in FIG. 1 may include: a downsampling module 1101, a mainstream encoder 1102, an auxiliary stream generating module 1103, a fragmentation control module 1104, a mainstream encapsulation fragmentation module 1105, and an auxiliary stream encapsulation fragmentation module 1106.
- the downsampling module 1101 is configured to receive the original video input stream, downsample it, and output the downsampled video stream to the mainstream encoder 1102. For example, it receives a 4K YUV-format video input stream, downsamples it to generate a 2K YUV-format video stream, and then outputs the 2K YUV-format video stream to the mainstream encoder 1102.
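A minimal sketch of what such a downsampling step does on one plane of a frame (illustrative only; real 4K-to-2K conversion operates on full YUV planes and typically uses better filters than block averaging):

```python
# Toy 2x spatial downsampling: average each 2x2 pixel block, halving width
# and height, as the downsampling module does to reduce e.g. 4K to 2K.
def downsample_2x(frame):
    """frame: 2D list of pixel values with even height and width."""
    h, w = len(frame), len(frame[0])
    return [[(frame[y][x] + frame[y][x + 1]
              + frame[y + 1][x] + frame[y + 1][x + 1]) // 4
             for x in range(0, w, 2)]
            for y in range(0, h, 2)]
```

The downsampled result is what the mainstream encoder 1102 compresses into the primary encoded stream.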
- YUV is an encoding format of color data.
- the mainstream encoder 1102 is configured to receive the video stream input by the downsampling module 1101, encode it (for example, with H.264 encoding) to generate the main encoded stream, and then output the main encoded stream to the mainstream encapsulation fragmentation module 1105.
- the auxiliary stream generating module 1103 is configured to compare the video stream output by the downsampling module 1101 with the main encoded stream output by the mainstream encoder 1102 to generate the secondary encoded stream.
- the secondary encoded stream includes content reconstructed for improving video quality, such as information for correcting partial pixel points in the primary encoded stream.
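A toy sketch of "information for correcting partial pixel points" (purely illustrative; a real secondary stream carries encoded residual data, not plain `(position, delta)` pairs):

```python
# Sketch of reconstruction by superimposition: the secondary stream supplies
# (position, delta) corrections that are applied to the decoded primary-stream
# pixels to improve video quality.
def apply_corrections(primary_pixels, corrections):
    """primary_pixels: flat list of decoded pixel values."""
    pixels = list(primary_pixels)
    for pos, delta in corrections:
        pixels[pos] += delta   # correct only the pixels the secondary stream names
    return pixels
```

Pixels without a correction pass through unchanged, which is why the primary stream remains independently playable when the secondary stream is absent.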
- the mainstream encapsulation fragmentation module 1105 is configured to receive the main encoded stream output by the mainstream encoder 1102, and, under the control of the fragmentation control module 1104, segment and encapsulate the main encoded stream according to time points to obtain at least two mainstream fragments, for example, TS (transport stream) fragments or mp4 fragments, and then output the at least two mainstream fragments.
- the auxiliary stream encapsulation fragmentation module 1106 is configured to receive the auxiliary encoded stream output by the auxiliary stream generating module 1103 and, under the control of the fragmentation control module 1104, segment and encapsulate the auxiliary encoded stream according to time points to obtain at least two auxiliary stream fragments, and then output the at least two auxiliary stream fragments.
- the format of the auxiliary stream fragment may be a custom format. Each auxiliary stream fragment corresponds to one mainstream fragment, the time point of each auxiliary stream fragment is the same as the time point of the corresponding mainstream fragment, and each auxiliary stream fragment is used to increase the code rate of the corresponding mainstream fragment.
- the fragmentation control module 1104 is configured to control the execution process of the mainstream encapsulation fragmentation module 1105 and of the auxiliary stream encapsulation fragmentation module 1106, generate a warehouse order file that describes the fragments (including the mainstream fragments and the auxiliary stream fragments), and then output the warehouse order file.
- the warehouse order file may include the time point of each fragment, the correspondence between each mainstream fragment and each auxiliary stream fragment, and the like.
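- As an illustrative sketch only, a warehouse order file could be modeled as a list of entries mapping each time point to a mainstream fragment and its corresponding auxiliary stream fragment. All field names and file names below are hypothetical assumptions, not a format defined by this disclosure:

```python
# Hypothetical model of a warehouse order file; the field names
# ("time_point", "mainstream", "auxiliary") are illustrative assumptions.
warehouse_order = {
    "fragment_duration_s": 2,  # each fragment covers 2 seconds (assumed)
    "fragments": [
        # one entry per time point: a mainstream fragment and the
        # auxiliary stream fragment that corresponds to it
        {"time_point": 0, "mainstream": "main_0.ts", "auxiliary": "aux_0.bin"},
        {"time_point": 2, "mainstream": "main_1.ts", "auxiliary": "aux_1.bin"},
        {"time_point": 4, "mainstream": "main_2.ts", "auxiliary": "aux_2.bin"},
    ],
}

def auxiliary_for(mainstream_name: str) -> str:
    """Look up the auxiliary stream fragment corresponding to a mainstream fragment."""
    for entry in warehouse_order["fragments"]:
        if entry["mainstream"] == mainstream_name:
            return entry["auxiliary"]
    raise KeyError(mainstream_name)

print(auxiliary_for("main_1.ts"))  # -> aux_1.bin
```

This keeps the correspondence between mainstream and auxiliary fragments explicit, which is all the later steps rely on.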
- the encoding server 11 may further include a communication interface 1107 for communicating with an external device such as the transmission server 12.
- the communication interface 1107 is configured to receive the mainstream fragments output by the mainstream encapsulation fragmentation module 1105, the auxiliary stream fragments output by the auxiliary stream encapsulation fragmentation module 1106, and the warehouse order file output by the fragmentation control module 1104, and to transmit this information to the WEB server, where the WEB server may be a WEB server supporting the HTTP/2 protocol and may be the transport server 12 above.
- the communication interface 1107 can be a transceiver. The downsampling module 1101, the mainstream encoder 1102, the auxiliary stream generating module 1103, the fragmentation control module 1104, the mainstream encapsulation fragmentation module 1105, and the auxiliary stream encapsulation fragmentation module 1106 may each be embedded in the hardware of the encoding server 11, or may be stored in the memory of the encoding server 11 in the form of software, so that the processor invokes the operations corresponding to the above modules.
- the encoding server 11 may include, in addition to the transceiver, the processor, and the memory, a communication bus for interconnecting the transceiver, the processor, and the memory.
- the transmission server 12 in FIG. 1 may include: a communication interface 1201, a processor 1202, a memory 1203, and a communication bus 1204.
- the communication interface 1201, the processor 1202, and the memory 1203 are connected to each other through a communication bus 1204.
- the communication interface 1201 is a unit for performing service data flow interaction between the transmission server 12 and external network elements such as the encoding server 11 and the terminal 13.
- the communication interface 1201 can be configured to receive each mainstream fragment, each auxiliary stream fragment, and a warehouse order file sent by the encoding server 11.
- the communication interface 1201 may also receive a mainstream fragment request, auxiliary stream fragment request, or interrupt request sent by the terminal 13, or return the requested mainstream fragment, auxiliary stream fragment, or warehouse order file to the terminal 13.
- the memory 1203 can be used to store data and/or code, and the processor 1202 can implement various functions of the transport server 12 by running or executing program code stored in the memory 1203, as well as by invoking data stored in the memory 1203.
- the transmission server 12 shown in FIG. 3 does not constitute a limitation on the transmission server 12.
- the transmission server 12 may further include more components than illustrated, or combine some components, or adopt a different arrangement of components.
- the transmission server 12 may also include a power source, a fan, a clock (CLK), and the like.
- the terminal 13 in FIG. 1 may include: a communication interface 1301, a processor 1302, a memory 1303, a communication bus 1304, a player 1305, and a display screen 1306.
- the communication interface 1301, the processor 1302, the memory 1303, the player 1305, and the display 1306 are connected to each other through a communication bus 1304.
- the communication interface 1301 is a unit for performing service data flow interaction between the terminal 13 and an external network element (such as the transmission server 12). Specifically, the communication interface 1301 may be configured to send a mainstream fragment request, an auxiliary stream fragment request, or an interrupt request to the transmission server 12, or to receive a mainstream fragment, an auxiliary stream fragment, or a warehouse order file sent by the transmission server.
- the memory 1303 can be used to store data and/or code, and the processor 1302 can implement various functions of the terminal 13 by running or executing program code stored in the memory 1303, and calling data stored in the memory 1303.
- the terminal 13 shown in FIG. 4 does not constitute a limitation on the terminal 13.
- the terminal 13 may further include more components than the illustrated ones, or combine some components, or different component arrangements.
- the terminal 13 may also include a power source, a fan, a clock, and the like.
- the communication interface 1201 and the communication interface 1301 may each include a receiving unit and a sending unit, and may specifically be a transceiver or a transmission interface.
- the processor 1202 and the processor 1302 may each include one or more central processing units (CPUs) or one or more network processors (NPs); each may be an application-specific integrated circuit (ASIC), or one or more integrated circuits configured to implement embodiments of the present application, such as one or more digital signal processors (DSPs) or one or more field programmable gate arrays (FPGAs), and may also be a multi-core system-on-chip (SoC).
- the memory 1203 and the memory 1303 may each be a volatile memory, such as a random-access memory (RAM), or a non-volatile memory, such as a read-only memory (ROM), a flash memory, a hard disk drive (HDD), or a solid-state drive (SSD).
- the communication bus 1204 and the communication bus 1304 can each be divided into an address bus, a data bus, a control bus, and the like, and can be an Ethernet bus, an industry standard architecture (ISA) bus, a peripheral component interconnect (PCI) bus, or an extended industry standard architecture (EISA) bus.
- each of the communication buses is represented by only one thick line in FIGS. 3 and 4, but this does not mean that there is only one bus or only one type of bus.
- the encoding server 11 and the transmission server 12 are described as independent devices.
- the two devices may also be integrated; in addition, either of the two devices may be integrated with other network-side devices, which is not limited in this embodiment of the present application.
- the following embodiments show and describe in detail the video transmission method provided by the present invention, wherein the steps shown may also be executed in any computer other than the transmission server 12 and the terminal 13 shown in FIG. 1.
- although the logical sequence of the video transmission method provided by the embodiment of the present application is shown in the method flowchart, in some cases, the steps shown or described may be performed in an order different from that herein.
- the "server” described in the embodiments provided below may specifically be the transmission server 12 described above.
- "and/or" in this article only describes an association relationship between associated objects, indicating that three kinds of relationships may exist; for example, A and/or B may indicate three situations: A exists alone, A and B exist simultaneously, and B exists alone.
- the character "/" in this article generally indicates that the objects before and after it are in an "or" relationship.
- "multiple" refers to two or more.
- FIG. 5 is a schematic diagram of interaction of a video transmission method according to an embodiment of the present application.
- the method shown in FIG. 5 may include the following steps S101 to S104:
- S101 The terminal acquires a warehouse order file of the video to be played from the server.
- the video to be played may be any video requested by the terminal to the server.
- the video may include a primary encoded stream and a secondary encoded stream, where the secondary encoded stream is used to increase the code rate of the primary encoded stream. The primary encoded stream includes at least two mainstream fragments, the secondary encoded stream includes at least two auxiliary stream fragments, and each mainstream fragment corresponds to one auxiliary stream fragment.
- the generation process of the primary coded stream and the secondary coded stream and the fragmentation process thereof may be referred to the above, and are not described herein again.
- the warehouse order file of the to-be-played video may include, but is not limited to, information of each of the at least two mainstream fragments and information of each of the at least two auxiliary stream fragments.
- the information of each fragment (including the mainstream fragments and the auxiliary stream fragments) may include, but is not limited to, the time point of the fragment.
- the time point of each mainstream fragment is the same as the time point of the corresponding auxiliary stream fragment.
- the user may send indication information to the terminal, for example, by a touch screen operation, text input, voice input, etc. After detecting the indication information, if the terminal determines that a connection has already been established with the server, it requests the warehouse order file of the video to be played from the server; if the terminal determines that no connection has been established with the server, it first establishes a connection with the server and then requests the warehouse order file of the video to be played from the server.
- the process of establishing a connection between the terminal and the server and the specific implementation process of the terminal requesting the warehouse order file from the server, refer to the prior art, and details are not described herein again.
- S102 The terminal parses the warehouse order file, obtains a time point of each fragment of the video to be played, a correspondence relationship between each mainstream fragment and the auxiliary stream fragment, and the like.
- S103 The terminal obtains the mainstream fragments of the to-be-played video from the server, obtains the auxiliary stream fragments corresponding to the mainstream fragments when the current real-time network status meets the preset condition, and caches each fragment obtained.
- the terminal may determine, according to the detected indication information sent by the user, from which mainstream fragment of the video to be played the requests should start, and then request the mainstream fragments from the server one by one, issuing the next request after receiving the mainstream fragment replied by the server.
- for example, if playback starts from the 5th mainstream fragment of a video containing 10 mainstream fragments, the terminal may request the 5th mainstream fragment from the server, and after receiving the 5th mainstream fragment of the server reply, request the 6th mainstream fragment from the server, and so on, until the terminal requests the 10th mainstream fragment.
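- A minimal sketch of this sequential request pattern follows; the `fetch_mainstream` function is a hypothetical stand-in for the actual HTTP interaction with the server:

```python
def fetch_mainstream(index: int) -> bytes:
    # Hypothetical stand-in for an HTTP request/reply with the server.
    return f"mainstream-{index}".encode()

def request_range(start: int, last: int) -> list:
    """Request mainstream fragments one at a time: the next request is
    issued only after the previous fragment has been received."""
    cache = []
    for i in range(start, last + 1):
        cache.append(fetch_mainstream(i))  # receive, then continue
    return cache

# Playback starts at the 5th fragment of a 10-fragment video.
fragments = request_range(5, 10)
print(len(fragments))  # 6 fragments: the 5th through the 10th
```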
- the current network status is not static, and it can change in real time as factors such as environment and load change.
- the terminal can periodically detect the current network status. When the detection period is small, it can be considered as detecting the current network status in real time.
- detecting the current network status may include, but is not limited to, detecting at least one of the following parameters: network delay, network packet loss rate, and network fluctuation coefficient. For the specific detection process, reference may be made to the prior art, and details are not described herein again.
- the current real-time network status refers to the network status detected when the terminal performs the action of detecting the network status. That is to say, the meaning of the "current real-time network status" involved in different steps of the embodiment of the present application needs to be determined according to the actual implementation or actual context; the same phrase in different steps cannot be assumed to indicate the network status measured at the same time.
- current play time refers to the moment the terminal is playing.
- the current real-time network condition satisfying the preset condition can be understood as the current network condition being good, that is, the communication quality is good.
- in this case, the terminal may request the auxiliary stream fragment from the server so as to superimpose and reconstruct the auxiliary stream fragment with the mainstream fragment, thereby improving the code rate of the mainstream fragment corresponding to the auxiliary stream fragment and improving the user's video viewing experience.
- the current network condition meeting the preset condition may include at least one of the following: the current network delay is less than or equal to a first threshold, the current network packet loss rate is less than or equal to a second threshold, and the current network fluctuation coefficient is less than or equal to a third threshold.
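- The check can be sketched as follows. The threshold values are illustrative assumptions (the disclosure does not fix them), and the sketch combines all three parameters even though the text allows any subset:

```python
# Illustrative threshold values only; the embodiment does not fix them.
FIRST_THRESHOLD_MS = 50   # maximum acceptable network delay
SECOND_THRESHOLD = 0.01   # maximum acceptable packet loss rate
THIRD_THRESHOLD = 0.2     # maximum acceptable network fluctuation coefficient

def meets_preset_condition(delay_ms: float, loss_rate: float,
                           fluctuation: float) -> bool:
    """True when the current real-time network status is good enough for
    the terminal to also request auxiliary stream fragments."""
    return (delay_ms <= FIRST_THRESHOLD_MS
            and loss_rate <= SECOND_THRESHOLD
            and fluctuation <= THIRD_THRESHOLD)

print(meets_preset_condition(30, 0.001, 0.1))   # True: request auxiliary fragments
print(meets_preset_condition(120, 0.001, 0.1))  # False: mainstream only
```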
- the specific values of the first threshold, the second threshold, and the third threshold, and how to obtain values, are not limited in this embodiment.
- the terminal may cache the mainstream fragments and the auxiliary stream fragments separately, for example, by opening two specific storage spaces in the terminal, where one part of the storage space is used for storing the mainstream fragments and the other part is used for storing the auxiliary stream fragments.
- alternatively, the terminal may cache the mainstream fragments and the auxiliary stream fragments together, for example, by opening a specific storage space in the terminal and storing both the mainstream fragments and the auxiliary stream fragments in that storage space.
- the method includes the following steps S11 to S16:
- S11 The terminal sends a first mainstream fragment request to the server, where the first mainstream fragment request is used to request the first mainstream fragment.
- the first mainstream fragment request may include an identifier of the first mainstream fragment, such as a time point of the first mainstream fragment, or an index of the first mainstream fragment.
- the first mainstream fragment may be any one of the at least two mainstream fragments, which may be, from the mainstream fragment corresponding to the playback start time indicated by the indication information to the last included in the video to be played. Any of the mainstream shards in a mainstream shard.
- S12 The server receives the first mainstream fragment request sent by the terminal, and sends the first mainstream fragment to the terminal according to the first mainstream fragment request.
- S13 The terminal receives the first mainstream fragment sent by the server, and caches the first mainstream fragment.
- S14 The terminal sends a first auxiliary stream fragmentation request to the server, where the first auxiliary stream fragmentation request is used to request the first auxiliary stream fragment corresponding to the first mainstream fragment.
- the first auxiliary stream fragmentation request may include an identifier of the first auxiliary stream fragment, such as a time point of the first auxiliary stream fragment, or an index of the first auxiliary stream fragment.
- S15 The server receives the first auxiliary stream fragmentation request sent by the terminal, and sends the first auxiliary stream fragment to the terminal.
- S16 The terminal receives the first auxiliary stream fragment sent by the server, and buffers the first auxiliary stream fragment.
- S11 to S13 are processes for acquiring the first mainstream fragment
- S14 to S16 are processes for acquiring the first auxiliary stream fragment.
- the execution order of S11 to S13 and S14 to S16 is not limited in the embodiment of the present application; for example, S11 to S13 may be executed first, and then S14 to S16.
- S14 to S16 may be executed first, and S11 to S13 may be executed.
- S14 to S16 may be executed during the execution of S11 to S13. That is to say, the process of obtaining a mainstream fragment by the terminal and the process of acquiring the auxiliary stream fragment corresponding to the mainstream fragment may be in no particular order.
- the following describes the process in which the terminal acquires multiple mainstream fragments and auxiliary stream fragments corresponding to each mainstream fragment.
- for the terminal, the mainstream fragments are generally requested in order of their time points; the auxiliary stream fragments may be requested in order of their time points, or not in that order.
- the request order between a mainstream fragment and the auxiliary stream fragment at the same time point (that is, a mainstream fragment and the auxiliary stream fragment corresponding to it) may be unrestricted.
- for the server, it generally sends each mainstream fragment to the terminal in order of time points, and sends each auxiliary stream fragment to the terminal in order of time points.
- the sending order of the mainstream fragments and the auxiliary stream fragments may be determined by setting priorities.
- the priorities may be pre-agreed between the server and the terminal, or the priority between a mainstream fragment and its corresponding auxiliary stream fragment may be negotiated by the server and the terminal, for example, by using the HTTP/2 PRIORITY frame to set the priority.
- alternatively, when the terminal sends a mainstream fragment request and/or an auxiliary stream fragment request to the server, the request carries the priority of the requested mainstream fragment and/or auxiliary stream fragment.
- the mainstream shards can be played separately, and the auxiliary stream shards need to be played together with the mainstream shards.
- therefore, the priority of a mainstream fragment is higher than the priority of its corresponding auxiliary stream fragment, so that the mainstream fragments can be transmitted preferentially and the continuity of the mainstream fragments is ensured; that is, the server sends a mainstream fragment to the terminal before sending the auxiliary stream fragment corresponding to that mainstream fragment.
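- The ordering effect of such priorities can be sketched as follows. The weight values are arbitrary illustrations (HTTP/2 PRIORITY weights range from 1 to 256); the actual negotiation mechanism is not modeled here:

```python
# Sketch: give every mainstream fragment a higher priority weight than any
# auxiliary stream fragment, so mainstream fragments are transmitted first.
MAIN_WEIGHT = 256  # illustrative; HTTP/2 weights lie between 1 and 256
AUX_WEIGHT = 16

def send_order(pending):
    """pending: list of (name, kind) tuples with kind 'main' or 'aux'.
    Returns the transmission order: higher-weight fragments first, and the
    original relative order preserved within each kind (stable sort)."""
    weight = {"main": MAIN_WEIGHT, "aux": AUX_WEIGHT}
    return sorted(pending, key=lambda item: -weight[item[1]])

queue = [("aux-1", "aux"), ("main-1", "main"), ("aux-2", "aux"), ("main-2", "main")]
print(send_order(queue))
# mainstream fragments come out ahead of the auxiliary ones
```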
- the process of acquiring the multiple mainstream fragments and the auxiliary stream fragments corresponding to each of the mainstream fragments may specifically include the following steps S21 to S24:
- S21 The terminal acquires the i-th main stream fragment from the server, and caches the i-th main stream fragment.
- the i-th mainstream fragment refers to the i-th mainstream fragment of the video to be played, where i is an integer greater than or equal to 1, the initial value of i is 1, and the maximum value of i is the number of mainstream fragments included in the video to be played.
- S22 The terminal determines whether to request the auxiliary stream fragment corresponding to the i-th mainstream fragment.
- if yes, step S23 is performed; if no, step S24 is performed.
- specifically, the terminal determines that the i-th auxiliary stream fragment needs to be requested when the current real-time network status meets the preset condition and the i-th auxiliary stream fragment is not cached; when the current real-time network status does not meet the preset condition, or the i-th auxiliary stream fragment has already been cached, the terminal determines that the i-th auxiliary stream fragment does not need to be requested.
- S23 The terminal acquires the auxiliary stream fragment corresponding to the i-th main stream fragment, and caches the auxiliary stream fragment.
- step S24 is performed.
- S24 The terminal determines whether the i-th mainstream fragment is the last mainstream fragment to be played.
- since the current real-time network status does not always meet the preset condition, after the above process the terminal may have acquired each mainstream fragment to be played, but auxiliary stream fragments corresponding to only some (or all) of the mainstream fragments.
- the method may further include: the terminal acquiring the uncached auxiliary stream fragment corresponding to the unplayed mainstream fragment.
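- The S21 to S24 loop, including the later backfill of uncached auxiliary stream fragments, can be sketched as follows; `network_ok(i)` is a hypothetical stand-in for the real-time check "the current network status meets the preset condition" at the moment fragment i is handled:

```python
def acquire(num_fragments, network_ok):
    """Sketch of steps S21-S24: every mainstream fragment is fetched, while
    an auxiliary fragment is fetched only when the network is good."""
    main_cached, aux_cached = [], set()
    for i in range(1, num_fragments + 1):
        main_cached.append(i)            # S21: fetch and cache mainstream fragment i
        if network_ok(i) and i not in aux_cached:   # S22: decide whether to request
            aux_cached.add(i)            # S23: fetch and cache auxiliary fragment i
    # S24 loops until the last mainstream fragment; afterwards, the uncached
    # auxiliary fragments of unplayed mainstream fragments may be backfilled.
    return main_cached, aux_cached

# Network dips while fragments 2 and 3 are being handled:
mains, auxes = acquire(5, lambda i: i not in (2, 3))
print(mains)  # [1, 2, 3, 4, 5]
print(auxes)  # {1, 4, 5} — auxiliary fragments 2 and 3 await backfill
```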
- for example, if the mainstream fragments to be played, sorted by time point, are mainstream fragments 1 to 5, and mainstream fragments 1 to 5 correspond to auxiliary stream fragments 1 to 5 respectively, then the sequence of fragments obtained by the terminal may be: mainstream fragment 1, auxiliary stream fragment 1, mainstream fragment 2, mainstream fragment 3, mainstream fragment 4, auxiliary stream fragment 4, mainstream fragment 5, auxiliary stream fragment 5.
- afterwards, the terminal may further acquire the following auxiliary stream fragments: auxiliary stream fragment 2 and auxiliary stream fragment 3.
- after the terminal obtains all mainstream fragments, if the current real-time network condition meets the preset condition, the terminal acquires the uncached auxiliary stream fragments corresponding to the unplayed mainstream fragments; that is, the terminal first acquires fragments until the last mainstream fragment is obtained, and then obtains the uncached auxiliary stream fragments corresponding to the unplayed mainstream fragments.
- the terminal may obtain one or more uncached auxiliary stream fragments corresponding to the unplayed mainstream fragments after acquiring the multiple mainstream fragments, which is not limited in this embodiment of the present application.
- S104 The terminal plays the video according to the cached slice.
- after requesting each fragment (including the mainstream fragments and the auxiliary stream fragments), the terminal first caches the fragment, and after buffering a certain number of mainstream fragments, starts playing the video to be played from the cached mainstream fragments in order of their time points, so as to ensure the continuity of playback. It should be noted that, since the mainstream fragments and the auxiliary stream fragments obtained by the terminal are encoded by the server, the terminal needs to decode the mainstream fragments and the auxiliary stream fragments before playing them.
- the specific value of this "certain number" and how to determine it are not limited in the embodiment of the present application.
- the terminal can play the cached video while acquiring the video, but for the same mainstream fragment, the terminal first acquires the mainstream fragment and then plays it.
- here, the cached video refers to the mainstream fragments that have already been cached.
- the process of playing the video by the terminal may specifically include the following steps S31 to S35:
- S31 The terminal determines the j-th mainstream fragment according to the time points of the mainstream fragments.
- the j-th mainstream fragment indicates the mainstream fragment that the terminal is about to play next. For example, if the terminal is currently playing mainstream fragment 1, the mainstream fragment to be played next by the terminal is mainstream fragment 2.
- the j-th mainstream fragment refers to the j-th mainstream fragment of the video to be played, where j is an integer greater than or equal to 1, the initial value of j is 1, and the maximum value of j is the number of mainstream fragments included in the video to be played.
- S32 The terminal determines whether the auxiliary stream fragment corresponding to the jth mainstream fragment is buffered.
- if no, step S33 is performed; if yes, step S34 is performed.
- S33 The terminal plays the j-th mainstream fragment, that is, plays the video content corresponding to the j-th mainstream fragment.
- step S35 is performed.
- S34 The terminal superimposes and reconstructs the j-th mainstream fragment and its corresponding auxiliary stream fragment to obtain a composite fragment, and plays the composite fragment.
- step S35 is performed.
- the video content played in step S33 is the same as the video content of the composite fragment played in step S34; the difference is that the code rates of the two are different, and thus the playback quality is different.
- superimposing and reconstructing the mainstream fragment with the auxiliary stream fragment is a key step in reconstructing the encoded video. Since the encoding server performs down-rate processing when encoding the video content and generating the mainstream fragments (for example, down-converting a 4K video stream to generate a 2K video stream), the resolution of the image frames obtained by the terminal after decoding a mainstream fragment is low. Generally, after decoding a mainstream fragment, the terminal performs super-resolution processing on the decoded image to raise the rate back to 4K; however, during this promotion, the video quality drops compared with the original 4K video stream, so it can be corrected with the auxiliary encoded stream, for example, by replenishing missing details and correcting errors introduced during the encoding process.
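- The correction idea can be sketched numerically. The pixel values and the simple additive residual model below are illustrative assumptions, not the actual codec-specific reconstruction:

```python
# Sketch: a decoded, upscaled mainstream frame row is corrected by adding a
# residual carried in the auxiliary stream. The additive model is an
# assumed simplification of the real superposition and reconstruction.
def upscale_2x(row):
    """Nearest-neighbour upscale of one pixel row (stand-in for super-resolution)."""
    out = []
    for p in row:
        out += [p, p]
    return out

original_4k_row = [10, 12, 20, 22, 30, 31, 40, 44]  # source pixels
decoded_2k_row  = [10, 20, 30, 40]                  # after down-rate + encode/decode

upscaled = upscale_2x(decoded_2k_row)               # terminal's super-resolved row
residual = [o - u for o, u in zip(original_4k_row, upscaled)]  # auxiliary info

reconstructed = [u + r for u, r in zip(upscaled, residual)]
print(reconstructed == original_4k_row)  # True: the residual restores lost detail
```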
- S35 The terminal determines whether the jth mainstream fragment is the last mainstream fragment to be played.
- for example, suppose the video to be played includes 10 mainstream fragments, each mainstream fragment corresponds to one auxiliary stream fragment, the mainstream fragment indicated by the current playback time is the 5th mainstream fragment, and the 6th to 10th mainstream fragments and the 6th to 7th auxiliary stream fragments are cached. A possible case is: playing composite fragment 6, composite fragment 7, and the 8th to 10th mainstream fragments in sequence.
- composite fragment 6 is the composite fragment obtained by superimposing and reconstructing the 6th mainstream fragment and the 6th auxiliary stream fragment, and composite fragment 7 is the composite fragment obtained by superimposing and reconstructing the 7th mainstream fragment and the 7th auxiliary stream fragment.
- in summary, the terminal requests the mainstream fragments, and requests the auxiliary stream fragments as needed, where the auxiliary stream fragments are used to improve the code rate of the mainstream fragments.
- the primary encoded stream can be played on its own, or the primary encoded stream and the secondary encoded stream can be superimposed and reconstructed, so the buffered primary encoded stream does not lose its usefulness when the secondary encoded stream is also buffered. Compared with the prior art, this makes it possible to save transmission resources and cache resources.
- the mainstream fragments and the auxiliary stream fragments can be transmitted on two independent links or on the same link. If they are transmitted on the same link, both the terminal and the server need only maintain that one link, so the implementation is simple.
- the link here can be an HTTP/2 link. That is, when the terminal sends the mainstream fragment request and the auxiliary stream fragment request, the same HTTP/2 link is used; the mainstream fragments and the auxiliary stream fragments use different HTTP/2 frames, and the two streams are multiplexed onto the same HTTP/2 link.
- the foregoing S11 to S16 provide a process of acquiring the first mainstream fragment and the first auxiliary stream fragment.
- an interrupt request can also be defined to interrupt the transmission process of an auxiliary stream fragment.
- this may include the following steps S41 to S47.
- the second mainstream fragment is used as an example for description. It can be understood that the second mainstream fragment is arbitrary; that is, the terminal can interrupt the transmission process of any auxiliary stream fragment of the video to be played.
- S41 The terminal sends a second mainstream fragment request to the server, where the second mainstream fragment request is used to request the second mainstream fragment, and the second mainstream fragment is any one of the at least two mainstream fragments.
- S42 The server receives the second mainstream fragment request sent by the terminal, and sends the second mainstream fragment to the terminal according to the second mainstream fragment request.
- S43 The terminal receives the second mainstream fragment sent by the server, and caches the second mainstream fragment.
- S44 The terminal sends a second auxiliary stream fragmentation request to the server, where the second auxiliary stream fragmentation request is used to request the second auxiliary stream fragment corresponding to the second mainstream fragment.
- S45 The server receives the second auxiliary stream fragmentation request sent by the terminal.
- the terminal can detect the current network status in real time. Therefore, if, after sending the second auxiliary stream fragmentation request to the server and before receiving the second auxiliary stream fragment, the terminal detects that the current real-time network status does not meet the preset condition, the network is in a bad condition; in this case, if the server sends the second auxiliary stream fragment to the terminal, the transmission may take a long time, which may cause the terminal's video playback to stall. The terminal may therefore send an interrupt request to the server.
- the interrupt request may be an RST_STREAM frame in HTTP/2, a newly defined frame, or another frame, which is not limited in this embodiment of the present application.
- if the terminal detects that the current real-time network condition still meets the preset condition after sending the second auxiliary stream fragmentation request to the server and before receiving the second auxiliary stream fragment, the terminal does not send an interrupt request to the server; that is, the server will not receive an interrupt request.
- in that case, the server sends the second auxiliary stream fragment to the terminal, and the terminal receives the second auxiliary stream fragment and caches it.
- The server receives the interrupt request sent by the terminal and, if it determines that the second auxiliary stream fragment has not been sent to the terminal, interrupts the transmission of the second auxiliary stream fragment according to the interrupt request.
- Specifically, the server may simply not respond to the second auxiliary stream fragment request, that is, not send the second auxiliary stream fragment to the terminal. If the server has already sent the second auxiliary stream fragment when it receives the interrupt request, the server may discard the interrupt request.
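The server-side decision just described can be sketched as follows; `pending_sends`, `handle_interrupt`, and the return labels are hypothetical names for illustration, not terms from the patent:

```python
def handle_interrupt(pending_sends: set, frag_id: str) -> str:
    """Handle an interrupt request for an auxiliary stream fragment.

    pending_sends holds fragments that were requested but not yet
    transmitted. If the fragment is still pending, drop it (i.e. never
    respond to its request); if it was already sent, discard the
    interrupt request instead.
    """
    if frag_id in pending_sends:
        pending_sends.discard(frag_id)  # transmission interrupted
        return "interrupted"
    return "discarded"                  # fragment already sent; ignore

queue = {"aux/2"}
first = handle_interrupt(queue, "aux/2")   # still pending when interrupt arrives
second = handle_interrupt(queue, "aux/1")  # already sent earlier
```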
- Although the terminal may not receive the second auxiliary stream fragment sent by the server, the terminal has already acquired the second mainstream fragment; therefore, the terminal may continue to request the next mainstream fragment from the server.
- Requesting the next mainstream fragment may be understood as repeating S41-S47 for the fragment that follows. In other words, the terminal may trigger an interruption of the transmission of a requested auxiliary stream fragment so that it can, to a certain extent, obtain the next mainstream fragment (the one following the mainstream fragment corresponding to that auxiliary stream fragment) sooner, thereby effectively reducing the occurrence of stalling while the terminal plays the video.
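Putting S41-S47 together, the terminal-side loop might look like the following sketch. The fragment naming, the `network_ok` probe, and the in-memory `FRAGMENTS` table are illustrative stand-ins for the real HTTP requests and real-time network detection, and `send_interrupt` stands in for the RST_STREAM-style interrupt request:

```python
from concurrent.futures import ThreadPoolExecutor

# Toy stand-in for the server: fragment names mapped to payloads.
FRAGMENTS = {"main/1": b"M1", "aux/1": b"A1", "main/2": b"M2", "aux/2": b"A2"}

def fetch(name: str) -> bytes:        # stands in for one HTTP request
    return FRAGMENTS[name]

def download(fragment_ids, network_ok, send_interrupt):
    cache = {}
    with ThreadPoolExecutor() as pool:
        for fid in fragment_ids:
            # S41-S43: always fetch and cache the mainstream fragment first.
            cache[f"main/{fid}"] = fetch(f"main/{fid}")
            if not network_ok():
                continue              # poor network: skip the auxiliary fragment
            # S44-S45: request the corresponding auxiliary stream fragment.
            pending = pool.submit(fetch, f"aux/{fid}")
            # S46-S47: if the network degrades before the fragment arrives,
            # interrupt the transfer and move on to the next mainstream fragment.
            if not network_ok():
                send_interrupt(f"aux/{fid}")
                pending.cancel()
            else:
                cache[f"aux/{fid}"] = pending.result()
    return cache

# Network is good for fragment 1, then degrades while aux/2 is in flight.
states = iter([True, True, True, False])
interrupts = []
cache = download([1, 2], lambda: next(states), interrupts.append)
```

With this simulated network trace, the terminal caches both fragments of the first pair, caches only the mainstream fragment of the second pair, and records one interrupt for the abandoned auxiliary fragment.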
- It can be understood that, to implement the foregoing functions, each network element, such as the terminal or the server, includes corresponding hardware structures and/or software modules for performing each function.
- The present application can be implemented by hardware or by a combination of hardware and computer software, in combination with the units and algorithm steps of the examples described in the embodiments disclosed herein. Whether a function is performed by hardware or by computer software driving hardware depends on the specific application and the design constraints of the solution. A person skilled in the art may use different methods to implement the described functions for each particular application, but such implementation should not be considered beyond the scope of the present application.
- In the embodiments of the present application, the terminal or the server may be divided into function modules according to the foregoing method.
- For example, each function module may be obtained through division in correspondence with a function, or two or more functions may be integrated into one processing module.
- The above integrated modules can be implemented in the form of hardware or in the form of software functional modules. It should be noted that the division of modules in the embodiments of the present application is schematic and is merely a logical function division; another division manner may be used in actual implementation.
- FIG. 10 shows a possible structural diagram of the terminal 100 involved in the above embodiment.
- The terminal 100 includes a transmitting unit 1401 and a receiving unit 1402.
- The terminal may further include a reconstruction unit 1403 and a playback unit 1404.
- The transmitting unit 1401 may be configured to perform the actions performed by the terminal in S11 and S14 in FIG. 6, the actions performed by the terminal in S41 and S46 in FIG. 9, and/or other processes for supporting the techniques described herein.
- The receiving unit 1402 may be used to perform the actions performed by the terminal in S12 and S15 in FIG. 6, the actions performed by the terminal in S42 in FIG. 9, and/or other processes for supporting the techniques described herein.
- The reconstruction unit 1403 can be used to perform the reconstruction action in S34 of FIG. 8, and/or other processes for supporting the techniques described herein.
- The playback unit 1404 can be used to perform the playback action in S34 of FIG. 8, and/or other processes for supporting the techniques described herein.
- The terminal is presented here in a form in which each functional module is divided in correspondence with each function, or in a form in which the functional modules are divided in an integrated manner.
- A “module” herein may refer to an Application-Specific Integrated Circuit (ASIC), a processor and memory that execute one or more software or firmware programs, an integrated logic circuit, and/or another device that provides the above functionality.
- The terminal can take the form shown in FIG. 4.
- the sending unit 1401 provided above may be a transmitter
- the receiving unit 1402 may be a receiver
- The transmitter may form a transceiver together with the receiver, which may specifically be the communication interface 1301 in FIG. 4.
- The reconstruction unit 1403 may be embedded in, or independent of, a processor of the terminal in hardware form (such as the processor 1302 in FIG. 4), or may be stored in the memory of the terminal (such as the memory 1303 in FIG. 4) in the form of software.
- The playback unit 1404 can be the player 1305 and/or the display 1306 in FIG. 4.
- The terminal provided by the embodiments of the present application can be used to perform the above video transmission method; therefore, for the technical effects that can be obtained, refer to the foregoing method embodiments.
- FIG. 11 shows a possible structural diagram of the server 110 involved in the above embodiment.
- the server 110 may include a receiving unit 1501 and a transmitting unit 1502.
- The server may further include an interrupting unit 1503.
- the receiving unit 1501 may be configured to perform the actions performed by the server in S11 and S14 in FIG. 6, the actions performed by the server in S41 and S44 in FIG. 9, and/or other processes for supporting the techniques described herein.
- the transmitting unit 1502 may be configured to perform the actions performed by the server in S12 and S15 in FIG. 6, the actions performed by the server in S45 and S46 in FIG. 9, and/or other processes for supporting the techniques described herein.
- The interrupting unit 1503 can be used to perform the action of S47 in FIG. 9, and/or other processes for supporting the techniques described herein.
- The server is presented here in a form in which each functional module is divided in correspondence with each function, or in a form in which the functional modules are divided in an integrated manner.
- A “module” herein may refer to an ASIC, a processor and memory that execute one or more software or firmware programs, an integrated logic circuit, and/or another device that can provide the functionality described above.
- The server can take the form shown in FIG. 3.
- the sending unit 1502 provided above may be a transmitter
- the receiving unit 1501 may be a receiver
- The transmitter may form a transceiver together with the receiver, which may specifically be the communication interface 1201 in FIG. 3.
- The interrupting unit 1503 may be embedded in, or independent of, a processor of the server in hardware form (such as the processor 1202 in FIG. 3), or may be stored in the memory of the server (such as the memory 1203 in FIG. 3) in the form of software.
- The server provided by the embodiments of the present application can be used to perform the above video transmission method; therefore, for the technical effects that can be obtained, refer to the foregoing method embodiments.
- All or part of the above embodiments may be implemented by software, hardware, firmware, or any combination thereof.
- When implemented by a software program, they may be implemented in whole or in part in the form of a computer program product.
- the computer program product includes one or more computer instructions.
- When the computer program instructions are loaded and executed on a computer, the processes or functions described in the embodiments of the present application are produced in whole or in part.
- the computer can be a general purpose computer, a special purpose computer, a computer network, or other programmable device.
- The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another. For example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center in a wired (for example, coaxial cable, optical fiber, or digital subscriber line (DSL)) or wireless (for example, infrared, radio, or microwave) manner.
- The computer-readable storage medium may be any usable medium accessible by a computer, or a data storage device, such as a server or a data center, integrating one or more usable media.
- The usable medium may be a magnetic medium (for example, a floppy disk, a hard disk, or a magnetic tape), an optical medium (for example, a DVD), a semiconductor medium (for example, a solid state disk (SSD)), or the like.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
Abstract
Embodiments of the present invention relate to a video transmission method and device in the field of transmission technologies, used to save transmission and cache resources. The video includes a main coded stream and an auxiliary coded stream, where the auxiliary coded stream is used to increase the bit rate of the main coded stream. The main coded stream includes at least two mainstream fragments, the auxiliary coded stream includes at least two auxiliary stream fragments, and each mainstream fragment corresponds to one auxiliary stream fragment. The method includes: sending a first mainstream fragment request to a server, where the first mainstream fragment request is used to request a first mainstream fragment, and the first mainstream fragment is one of the at least two mainstream fragments; receiving the first mainstream fragment sent by the server; if the current real-time network status meets a preset condition, sending a first auxiliary stream fragment request to the server, where the first auxiliary stream fragment request is used to request a first auxiliary stream fragment corresponding to the first mainstream fragment; and receiving the first auxiliary stream fragment sent by the server.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710140922.6A CN108574882B (zh) | 2017-03-08 | 2017-03-08 | Video transmission method and apparatus |
CN201710140922.6 | 2017-03-08 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2018161790A1 true WO2018161790A1 (fr) | 2018-09-13 |
Family
ID=63448047
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2018/076526 WO2018161790A1 (fr) | 2018-02-12 | Video transmission method and device |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN108574882B (fr) |
WO (1) | WO2018161790A1 (fr) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110072130B (zh) * | 2019-04-11 | 2020-06-02 | Xi'an Jiaotong University | HTTP/2-based HAS video slice pushing method |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6496980B1 (en) * | 1998-12-07 | 2002-12-17 | Intel Corporation | Method of providing replay on demand for streaming digital multimedia |
CN102123304A (zh) * | 2009-12-28 | 2011-07-13 | Thomson Licensing | Method and device for receiving video content and services broadcast using existing data transmission |
CN102638704A (zh) * | 2006-06-27 | 2012-08-15 | Thomson Licensing | Performance-aware peer-to-peer content-on-demand |
- 2017
  - 2017-03-08 CN CN201710140922.6A patent/CN108574882B/zh active Active
- 2018
  - 2018-02-12 WO PCT/CN2018/076526 patent/WO2018161790A1/fr active Application Filing
Also Published As
Publication number | Publication date |
---|---|
CN108574882B (zh) | 2019-11-12 |
CN108574882A (zh) | 2018-09-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
- US11622134B2 (en) | System and method for low-latency content streaming | |
- CN110784740A (zh) | Video processing method and apparatus, server, and readable storage medium | |
- WO2016131223A1 (fr) | Frame dropping method for video frames and video sending apparatus | |
- WO2018133601A1 (fr) | Streaming media content transmission method and apparatus, server, and terminal | |
- WO2017095885A1 (fr) | Video data transmission method and apparatus | |
- CN110830460B (zh) | Connection establishment method and apparatus, electronic device, and storage medium | |
- US10476928B2 (en) | Network video playback method and apparatus | |
- CN111447455A (zh) | Live video stream playback processing method and apparatus, and computing device | |
- TWI637631B (zh) | Image processing apparatus, video subsystem, and video processing circuit | |
- US10863179B1 (en) | Overlapped rate control for high-quality segmented video encoding | |
- CN102006501A (zh) | Streaming media playback control method and apparatus, and streaming media player | |
- US20140226711A1 (en) | System and method for self-adaptive streaming of multimedia content | |
- CN107592551A (zh) | Method and device for cloud streaming service | |
- JP2018509060A (ja) | Method and apparatus for converting an MMTP stream into an MPEG-2 TS | |
- WO2021143360A1 (fr) | Resource transmission method and computing device | |
- CN110855645B (zh) | Streaming media data playback method and apparatus | |
- WO2023226915A1 (fr) | Video transmission method and system, and storage medium | |
- CN114900698A (zh) | Forward-error-correction-based video transmission method and device, and computer storage medium | |
- CN105898625B (zh) | Playback processing method and terminal device | |
- US10893303B1 (en) | Streaming chunked media segments | |
- CN115767149A (zh) | Video data transmission method and apparatus | |
- WO2018161790A1 (fr) | Video transmission method and device | |
- CN113079386B (zh) | Online video playback method and apparatus, electronic device, and storage medium | |
- CN108702542A (zh) | Client operating method for streaming service | |
- CN105519121B (zh) | Key frame routing method and media server |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 18763041 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 18763041 Country of ref document: EP Kind code of ref document: A1 |