CN117528092A - Image transmission method, image transmission system, electronic device, and storage medium - Google Patents
- Publication number
- CN117528092A (application CN202311467321.8A)
- Authority
- CN
- China
- Prior art keywords
- image
- information
- integrated
- integration
- area
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/10—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
- H04N19/134—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or criterion affecting or controlling the adaptive coding
- H04N19/167—Position within a video image, e.g. region of interest [ROI]
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L67/00—Network arrangements or protocols for supporting network services or applications
- H04L67/01—Protocols
- H04L67/10—Protocols in which an application is distributed across nodes in the network
- H04L67/1095—Replication or mirroring of data, e.g. scheduling or transport for data synchronisation between network nodes
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/23—Processing of content or additional data; Elementary server operations; Server middleware
- H04N21/234—Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs
Landscapes
- Engineering & Computer Science (AREA)
- Signal Processing (AREA)
- Multimedia (AREA)
- Computer Networks & Wireless Communication (AREA)
- Compression Or Coding Systems Of Tv Signals (AREA)
Abstract
Embodiments of the present application provide an image transmission method, an image transmission system, an electronic device, and a storage medium. In some embodiments, an image transmission method includes: integrating images received from at least one transmitting end into an integration area to form an integrated image; determining coding information according to the position information of the images in the integration area; and determining and outputting integrated data according to the integrated image and the coding information.
Description
Technical Field
Embodiments of the present application relate to the field of data processing, and more particularly, to an image transmission method, an image transmission system, an electronic device, and a storage medium.
Background
In daily use, users often rely on a screen-casting function to transfer the display content of their own device to the display of another device, so that the content can be viewed there. For example, a user can cast the display of a mobile terminal onto a vehicle-mounted display, and then conveniently view the mobile terminal's interface on the vehicle-mounted screen while in the vehicle.
However, in some scenarios multiple casting ends have screen-casting requirements at the same time, while the display end has only a limited number of casting channels, so the requirements of all the casting ends cannot be met.
Disclosure of Invention
Embodiments of the present application provide an image transmission method, an image transmission system, an electronic device, and a storage medium that can at least partially solve the above-mentioned problems or other problems existing in the prior art.
The embodiment of the application provides an image transmission method, which comprises the following steps: integrating images received from at least one transmitting end into an integration area to form an integrated image; determining coding information according to the position information of the images in the integration area; and determining and outputting integrated data according to the integrated image and the coding information.
The embodiment of the application also provides an image transmission method of the transmitting end, which comprises the following steps: acquiring size information of the remaining area in an integration area of a server; comparing the size information of the remaining area with the size information of the image to be transmitted; and sending an integration request containing the image to the server in response to the comparison result indicating that the size of the remaining area is greater than or equal to the size of the image.
The embodiment of the application also provides an image transmission method of the receiving end, which comprises the following steps: acquiring coding information from the acquired integrated data, wherein the coding information comprises position information of each original image in the integrated image; acquiring an original image from the integrated image in the integrated data according to the coding information; and processing each original image.
According to some embodiments of the present application, an execution subject integrates the images of the transmitting ends into an integration area and encodes the position information of each image in the integration area to form integrated data, so that even if a plurality of transmitting ends transmit images simultaneously, the images can be output through a limited number of transmission channels.
In some embodiments, the integrated image contains a dedicated coding region carrying the coding information, so that the coding information stays synchronized with the images, which facilitates subsequent processing.
In some embodiments, the coding is performed with a color-block coding technique, which reduces the risk that an error in a single pixel affects functionality and thus improves system stability.
In some embodiments, the coding region is dynamically adjusted to enable the integrated image to accommodate as much information as possible.
In some embodiments, the image is scaled to meet the transmission requirements if it is larger than the remaining area of the integrated image.
Drawings
Other features, objects and advantages of the present application will become more apparent upon reading the detailed description of non-limiting embodiments, made with reference to the following drawings. Wherein:
FIG. 1 is a schematic illustration of a vehicle-mounted screen-casting scenario according to some embodiments of the present application;
Fig. 2 is a schematic structural view of an image transmission system according to some embodiments of the present application;
fig. 3 is a flow chart of an image transmission method suitable for a server according to some embodiments of the present application;
FIGS. 4a and 4b are schematic diagrams of integrated images, respectively, according to some embodiments of the present application;
fig. 5 is a flow chart of an image transmission method suitable for a transmitting end according to some embodiments of the present application;
fig. 6 is a flow chart of an image transmission method suitable for a receiving end according to some embodiments of the present application; and
fig. 7 is a schematic block diagram of an electronic device according to some embodiments of the present application.
Detailed Description
For a better understanding of the present application, various aspects of the present application will be described in more detail with reference to the accompanying drawings. It should be understood that this detailed description is merely illustrative of exemplary embodiments of the application and is not intended to limit the scope of the application in any way. Like reference numerals refer to like elements throughout the specification. The expression "and/or" includes any and all combinations of one or more of the associated listed items.
Unless otherwise defined, all terms (including engineering and technical terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
It should be noted that, in the case of no conflict, the embodiments and features in the embodiments may be combined with each other. In addition, unless explicitly defined or contradicted by context, the particular steps included in the methods described herein are not necessarily limited to the order described, but may be performed in any order or in parallel. The present application will be described in detail below with reference to the accompanying drawings in conjunction with embodiments.
In some scenarios, a user uses screen casting or the like to transfer an image from one device to another. For example, fig. 1 is a schematic diagram of a vehicle-mounted screen-casting scenario according to some embodiments of the present application. As shown in fig. 1, a user initiates a screen-casting request through a terminal 110 and can cast the screen onto an on-board display 121 of a vehicle 120. Optionally, the screen-casting process may transmit image data based on the Mobile Industry Processor Interface (MIPI) protocol. However, although the MIPI protocol supports a virtual channel function, the number of channels it supports is limited. Meanwhile, MIPI-based casting chips generally impose a minimum-resolution requirement, which is unfavorable for transmitting small-resolution images. Some other casting chips do not support the MIPI virtual channel function at all, so channel multiplexing is impossible; that is, one casting channel can only carry the content of one terminal 110. Using multiple casting chips to transmit multiple data paths is too costly, and the limited hardware interfaces of the chips make such a design hard to implement.
For example, a plurality of SoC chips may be disposed within the vehicle 120 and configured to transmit image data through a casting chip, so that multiple streams of image data need to pass between the SoC chips through that chip. In this case a single casting chip cannot meet the requirement, while multiple casting chips are costly and difficult to implement.
To address at least one of the above problems, some embodiments of the present application provide an image transmission system. Fig. 2 is a schematic structural diagram of an image transmission system according to some embodiments of the present application. As shown in fig. 2, the image transmission system 200 includes at least a transmitting end 210, a server 220, and a receiving end 230. The at least one transmitting end 210 may be configured to transmit images to be processed. The server 220 may be configured to integrate the images transmitted from the transmitting end into an integration area to form an integrated image, determine coding information according to the position information of the images in the integration area, and determine and output integrated data according to the integrated image and the coding information. The receiving end 230 may be configured to receive the integrated data, obtain the coding information from it, obtain images from the integrated image according to the coding information, and process the images.
As one example, the image transmission system 200 may be used in a screen-casting scenario. The transmitting end 210 may be, for example, a casting client, such as a mobile terminal (a mobile phone, a notebook computer, a smart bracelet, etc.). The server 220 may be, for example, a casting server. The receiving end 230 may be, for example, a casting display end, such as the on-board host of a vehicle.
It should be appreciated that the image transmission system 200 may also be applied to other image transmission scenarios, not specifically recited herein, without departing from the teachings of the present application.
It should be appreciated that, without departing from the teachings of the present application, the image transmission system 200 may be implemented as a single device, with the transmitting end 210, the server 220, and the receiving end 230 as components of that device. The image transmission system 200 may also be built from a plurality of devices, i.e., the transmitting end 210, the server 220, and the receiving end 230 may be separate devices. The present application does not limit how the image transmission system 200 is implemented.
The processing procedure of the server 220 is described first.
In some embodiments of the present application, fig. 3 is a schematic flow chart of an image transmission method suitable for a server according to some embodiments of the present application. As shown in fig. 3, the image transmission method 300 suitable for the server 220 may include the following steps:
S31, integrating the images received from at least one transmitting end into an integration area to form an integrated image;
S32, determining coding information according to the position information of the image in the integration area; and
S33, determining and outputting the integrated data according to the integrated image and the coding information.
According to some embodiments of the present application, an execution subject integrates the images of the transmitting ends into an integration area and encodes the position information of each image in the integration area to form integrated data, so that even if a plurality of transmitting ends transmit images simultaneously, the images can be output through a limited number of transmission channels.
For ease of understanding, exemplary implementations of the steps of the image transmission method 300 illustrated in fig. 3 are described below.
In some embodiments of the present application, it is observed that a casting chip generally has a relatively large data throughput capability. For example, some casting chips can handle a resolution of 3840×2160 at 30 frames per second (fps), while the resolution of the images transmitted by the transmitting end 210 is generally smaller than that supported by the chip. Based on this, the server 220 can integrate the images received from at least one transmitting end into an integration area to form an integrated image; the region occupied by each image can then be regarded as a channel, so multiple virtual channels are realized through the single integrated image. The integration area may refer to the part (or all) of the casting chip's pixel structure units that is used for integrating images. The process of integrating an image into the integration area is described below by example.
As an example, the process by which the server 220 integrates the images received from at least one transmitting end 210 into the integration area to form an integrated image may include: traversing the integration area in a specified direction from a specified position to determine the remaining area in the integration area; and, upon reaching the remaining area, adding the image at the starting position of the traversed remaining area. For example, if the server 220 adds images from the bottom-left toward the right, then after receiving an image it traverses the last row of the integration area to the right; upon reaching a pixel that has not yet been integrated, it takes that pixel as the integration position of the bottom-left pixel of the image, and integrates the image into the integration area accordingly. In this example, the server 220 adds received images sequentially along a fixed direction according to a specified rule, so the integration logic is simple and integration efficiency can be improved.
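The following Python sketch illustrates one way such a bottom-left traversal could be implemented. It is only an illustration of the placement rule described above, not code from the patent; the occupancy-grid representation and all names are assumptions, and it checks the whole candidate rectangle rather than only the anchor pixel.

```python
import numpy as np

def place_bottom_left(occupied: np.ndarray, img_w: int, img_h: int):
    """Scan the last row of the integration area left to right for the first
    free pixel, take it as the bottom-left corner of the new image, and mark
    the image's rectangle as occupied. Returns the (x, y) of the image's
    top-left pixel, or None if the image does not fit."""
    area_h, area_w = occupied.shape
    y_bottom = area_h - 1
    for x in range(area_w):
        if occupied[y_bottom, x]:
            continue                       # pixel already integrated; keep scanning
        top = y_bottom - img_h + 1
        if top < 0 or x + img_w > area_w:
            return None                    # image cannot fit at or beyond this anchor
        if occupied[top:y_bottom + 1, x:x + img_w].any():
            continue                       # rectangle collides with an earlier image
        occupied[top:y_bottom + 1, x:x + img_w] = True
        return (x, top)
    return None
```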
As another example, the server 220 may sort the images according to the image information of each image and sequentially integrate them into the integration area according to the sorting result to form the integrated image. Here, the image information may include size information. The server 220 may sort the images currently received from the different transmitting ends 210 by size, for example from small to large, so as to fit as many images as possible into one integration. After the ordering is determined, the images are added to the integration area sequentially along the specified direction from the specified pixel. Ordering by size makes the arrangement of the added images more reasonable, so that more images fit into the limited integration area and more paths of image data can be transmitted.
It should be appreciated that, in addition to integrating images sequentially along a specified direction, individual images may be added based on other integration rules derived from their size information without departing from the teachings of the present application. For example, with a rectangular integration area, an integration rule may start from the top-left pixel; after the current image is integrated, the rule checks whether the longitudinal dimension of the next image to be integrated is smaller than the difference between the longitudinal dimension of the integration area and that of the images already integrated, and if so, derives the next initial integration position from the longitudinal dimension of the integrated images. Concretely, with a coordinate system whose origin is the top-left pixel, whose X axis runs along the lateral direction of the integration area and whose Y axis runs along the longitudinal direction, the first image may start at (0, 0); if its longitudinal dimension is k, the next image starts at (0, k+n), where n is an integer greater than 1 (see the sketch below). By adjusting the integration rule, the images sent by each transmitting end 210 can be packed into the integration area more reasonably, so that the final integrated image carries more images; the integrated image is then transmitted to a plurality of receiving ends, each of which decomposes it to obtain the image it needs, thereby transmitting more paths of image data. The present application does not limit the integration rules.
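As an illustration of the coordinate arithmetic above, the following sketch stacks images vertically from the origin; the function name, the (width, height) representation, and the fitting policy are assumptions, not part of the patent.

```python
def stack_vertically(images, area_h: int, n: int = 2):
    """Place each image at (0, y): the first at (0, 0), and after an image of
    longitudinal dimension k is placed at (0, y), the next starts at
    (0, y + k + n), with n an integer greater than 1 as described above.
    `images` is a list of (width, height) pairs; returns start positions."""
    positions, y = [], 0
    for _w, h in images:
        if y + h > area_h:       # next image's longitudinal dimension no longer fits
            break                # remaining images wait for another integration
        positions.append((0, y))
        y += h + n
    return positions
```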
As yet another example, the server 220 may sort the images according to the priority information of the transmitting end corresponding to each image, and sequentially integrate the images into the integration area according to the sorting result to form the integrated image. For example, the server 220 may store priority information for the transmitting ends that hold image transmission authority. If several transmitting ends send images at the same time, the images can be ordered by priority so that the transmission requirements of higher-priority transmitting ends are satisfied first.
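A one-line sketch of the priority ordering; the priority table and its orientation (higher rank first) are assumptions for illustration.

```python
def order_by_priority(pending, priorities):
    """Sort pending (sender_id, image) pairs so that higher-priority
    transmitting ends are integrated first; `priorities` maps a sender
    id to its rank (a hypothetical table kept by the server)."""
    return sorted(pending, key=lambda si: priorities.get(si[0], 0), reverse=True)
```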
It should be understood that the process of adding images after ordering the images based on the priority information may refer to an exemplary description of the process of adding images after ordering the images based on the image information, and will not be described herein.
It should be appreciated that one skilled in the art may also sort images based on other information to meet various needs without departing from the teachings of the present application, which is not limited in this application.
Alternatively, for the image currently to be integrated, the server 220 may determine its size from its image information and compare it with the size of the remaining area in the integrated image. If the comparison result indicates that the image is smaller than or equal to the remaining area, the server integrates the image into the remaining area. Performing the integration only after confirming that the remaining area is large enough reduces cases of image loss.
Optionally, if the comparison result indicates that the image is larger than the remaining area, the server 220 may notify the transmitting end 210 that the transmission failed, or it may scale the image and integrate the scaled image into the remaining area. In the latter case the image is still fully integrated even when the remaining area is insufficient, the integration area accommodates more images, and the server 220 can transmit more paths of images.
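A sketch of the comparison-and-scaling step, using Pillow as an assumed image library (the patent names no library) and preserving the aspect ratio, which the text does not spell out:

```python
from PIL import Image

def fit_into_remaining(img: Image.Image, rem_w: int, rem_h: int) -> Image.Image:
    """Return the image unchanged if it fits the remaining area; otherwise
    shrink it by the smallest factor that makes it fit."""
    w, h = img.size
    if w <= rem_w and h <= rem_h:
        return img                            # size <= remaining area: integrate as-is
    scale = min(rem_w / w, rem_h / h)         # single shrink factor for both axes
    return img.resize((max(1, int(w * scale)), max(1, int(h * scale))))
```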
It should be appreciated that other measures may be taken by the server 220 to integrate more images in a limited integration area without departing from the teachings of the present application, which is not limited in this regard.
In some embodiments of the present application, after determining the position information of the images in the integration area, the server 220 may determine the coding information accordingly.
As one example, the server 220 may determine the coding information directly from the position information of the images in the integration area, so that the receiving end 230 can acquire the images from the integration area and display them.
As another example, the server 220 may determine the coding information according to both the position information and auxiliary information of the image. The auxiliary information may include at least one of: position information of the coding region, identification information of the image, processing instruction information of the image, integration time information of the image, and verification information of the image.
Suitable scenarios for some of the auxiliary information are described below by example.
Position information of the coding region
In some scenarios, when determining the integrated data from the integrated image and the coding information, the server 220 may color-block-code the coding region of the integrated image according to the coding information and take the encoded integrated image as the integrated data. Because the coding region is part of the integrated image, the coding information stays synchronized with the images, which facilitates subsequent processing. In addition, by performing the coding with a color-block coding technique, the server 220 reduces the risk that an error in a single pixel affects functionality, improving system stability.
It should be understood that the color-block coding may be binary, e.g., a black block represents 0 and a white block represents 1; to reduce the occupied coding space, colors other than black and white that are not easily confused may also be used, which is not limited in this application.
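The following sketch shows binary color-block coding in the black-0 / white-1 form mentioned above; the block size, the one-row strip layout, and the threshold decode are assumptions for illustration.

```python
import numpy as np

def encode_color_blocks(bits, block: int = 8) -> np.ndarray:
    """Turn each bit into a block x block patch: black (0) for 0, white (255)
    for 1, so a single corrupted pixel cannot flip a bit."""
    strip = np.zeros((block, block * len(bits)), dtype=np.uint8)
    for i, b in enumerate(bits):
        if b:
            strip[:, i * block:(i + 1) * block] = 255
    return strip

def decode_color_blocks(strip: np.ndarray, block: int = 8):
    """Average each patch and threshold at mid-gray, tolerating pixel errors."""
    n = strip.shape[1] // block
    return [int(strip[:, i * block:(i + 1) * block].mean() > 127) for i in range(n)]
```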
Alternatively, in the above scenario, the server 220 may dynamically determine the coding region of the integrated image according to the position information of the images, and then color-block-code that region according to the coding information. In other words, the server 220 can dynamically adjust the coding region according to the positions of the images to be integrated, so that the integrated image accommodates as much information as possible.
For example, consider a 60 fps casting chip with a transmission capability of 3840×2160. If the server 220 determines that most of the images to be integrated have a lateral dimension larger than their longitudinal dimension, the position of integrated image A 410 may be as shown by integration area 411 in fig. 4a. In this case, the server 220 may determine region 412 in fig. 4a as the coding region, based on the start position information of the coding region and the position of the region left over after the images are added to integrated image A, and determine the position information of the coding region from the lateral and longitudinal dimensions of region 412 (i.e., of the coding region).
For another example, if most of the images to be integrated have a lateral dimension smaller than their longitudinal dimension, the determined position of integrated image B 420 may be as shown by integration area 421 in fig. 4b. In this case, the server 220 may determine region 422 in fig. 4b as the coding region, based on the start position information of the coding region and the position of the region left over after the images are added to integrated image B, and determine the position information of the coding region from the lateral and longitudinal dimensions of region 422 (i.e., of the coding region).
As can be seen, the position of the coding region is determined dynamically according to the images to be integrated, which helps the final integrated image accommodate as much information as possible.
For the above scenario, the server 220 may encode the position information of the coding region, as auxiliary information, into the coding region of the integrated image. The position information of the coding region may include its lateral and longitudinal dimensions. The receiving end 230 may read the start position information of the coding region and the offsets, relative to that start position, of the pixels storing the coding region's position information; determine where in the integrated image the lateral and longitudinal dimensions are stored; and decode the pixels at those positions to obtain the dimensions. The receiving end 230 can then locate the coding region in the integrated image according to its lateral and longitudinal dimensions, read the position information of the integrated images from it, and obtain the required image.
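The lookup described above might be sketched as follows; the offset table, the injected `decode_value` routine, and all names are assumptions, since the patent only fixes the idea of agreed start positions and offsets.

```python
def read_region_dims(integrated, start_xy, dim_offsets, decode_value):
    """Recover the coding region's lateral and longitudinal dimensions from
    the integrated image. `start_xy` is the agreed start position of the
    coding region; `dim_offsets` gives, relative to it, the pixel offsets of
    the width and height fields; `decode_value` reads one encoded value."""
    x0, y0 = start_xy
    (wx, wy), (hx, hy) = dim_offsets
    lateral = decode_value(integrated, x0 + wx, y0 + wy)       # region width
    longitudinal = decode_value(integrated, x0 + hx, y0 + hy)  # region height
    return lateral, longitudinal
```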
It should be understood that, without departing from the teachings of the present application, the start position information of the coding region and the offsets of the pixels storing the coding region's position information relative to that start position may be transmitted to the receiving end 230 through separate data frames, or may be written in advance into the decoding protocol between the receiving end 230 and the server 220; this is not limited in this application.
It should be appreciated that, without departing from the teachings of the present application, instead of encoding the coding information into the integrated image, the server 220 may encode it into a separate encoded image and determine the encoded image together with the integrated image as the integrated data, thereby improving bandwidth utilization. In this case, the position information of the coding region need not be part of the coding information. The method by which the server 220 determines the integrated data is not limited in this application.
It should be understood that the coding region in the integrated image may also be fixed without departing from the teachings of the present application; i.e., the server 220 may designate a region in advance, such as the upper-left corner or another agreed region of the integrated image, as the coding region, without dynamic adjustment. In that case, the position information of the coding region need not be part of the coding information. The method for determining the coding region is not limited in this application.
Identification information of the image
In some scenarios, multiple transmitting ends 210 send images simultaneously. The server 220 may determine identification information for each image according to the identification information of its transmitting end 210 and encode it into the integrated data as part of the coding information. After receiving the integrated data from the server 220, the receiving end 230 can determine which image to acquire from the identification information and crop that image out of the integrated data.
It should be understood that the identification information of the image may also be determined based on other information without departing from the teachings of the present application, which is not limited in this application.
Image processing instruction information
In some scenarios, the server 220 may determine, from an image processing instruction of the transmitting end 210, a processing operation that needs to be performed when the image is displayed, for example adding an avatar to the image. The server 220 may derive processing instruction information from the instruction and encode it into the integrated data as coding information. After receiving the integrated data, the receiving end 230 can read the processing instruction from the encoded information and perform the specified operation, such as the avatar addition above.
In other scenarios, the server 220 may scale an image while integrating it into the integrated image (see above). After scaling, the server 220 may determine, based on the scaling, the image processing the receiving end 230 needs to perform. For example, if the server 220 downsampled the image, an instruction directing the receiving end 230 to upsample may be written into the coding information to preserve the display quality at the receiving end 230.
It should be understood that the server 220 may also issue processing instructions for the image to the receiving end 230 in other scenarios and write the processing instructions into encoded information without departing from the teachings of the present application, which is not limited in this application.
It should be understood that, without departing from the teachings of the present application, the server 220 may further use the integration time of an image as coding information, so that the receiving end 230 can judge the transmission quality of the image transmission channel from it, and may use verification information of an image as coding information, so that the receiving end 230 can verify whether the acquired image is correct. The auxiliary information is not limited in this application.
For example, in some embodiments of the present application, the encoded transmission protocol may define the coding information format shown in table 1. Referring to table 1, the coding information may include: a code start identifier (SOT), the number of sub-channels (SUB-CHANNEL NUM), the sub-channel number (SUB CHANNEL NUM ID), the X coordinate of the sub-channel start position (SUB CHANNEL START POS X), the Y coordinate of the sub-channel start position (SUB CHANNEL START POS Y), the sub-channel activation status (SUB CHANNEL ACTIVE H and SUB CHANNEL ACTIVE V), …, and a code end identifier (END). The number of sub-channels indicates how many images are integrated into the integrated image; the sub-channel number identifies the channel carrying the image a receiving end needs; the X and Y coordinates of the sub-channel start position give that image's coordinates within the integrated image; and the activation status indicates the state of the transmission channel between the server and the receiving end.
TABLE 1

| Field | Meaning |
|---|---|
| SOT | code start identifier |
| SUB-CHANNEL NUM | number of sub-channels (integrated images) in the integrated image |
| SUB CHANNEL NUM ID | number of the sub-channel carrying the image a receiving end needs |
| SUB CHANNEL START POS X | X coordinate of the sub-channel start position in the integrated image |
| SUB CHANNEL START POS Y | Y coordinate of the sub-channel start position in the integrated image |
| SUB CHANNEL ACTIVE H / SUB CHANNEL ACTIVE V | sub-channel activation status |
| … | further protocol fields |
| END | code end identifier |
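One possible byte layout for these fields, before color-block coding, is sketched below; the patent lists the fields but not their widths, so the field sizes, the marker values, and the pack format here are all assumptions.

```python
import struct

SOT, END = 0xA5, 0x5A  # assumed one-byte start/end identifiers

def pack_coding_info(sub_channels) -> bytes:
    """Serialize the Table 1 fields: each entry in `sub_channels` is a dict
    with id, x, y, active_h, active_v (hypothetical key names)."""
    out = bytearray([SOT, len(sub_channels)])          # SOT + SUB-CHANNEL NUM
    for ch in sub_channels:
        out += struct.pack(">BHHBB", ch["id"], ch["x"], ch["y"],
                           ch["active_h"], ch["active_v"])
    out.append(END)                                    # END identifier
    return bytes(out)
```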
In some embodiments of the present application, after the server 220 determines the integrated data, it transmits the data to the receiving end 230, for example through a screen-casting channel. Each image region in the integrated data is separately parsed, cropped, and displayed by the receiving ends, so each region can be regarded as a different sub-channel, and multi-channel data transmission is realized over a single casting channel.
Having described exemplary implementations of the steps of the image transmission method 300 illustrated in fig. 3, the process by which the transmitting end 210 transmits image data is described next.
In some embodiments of the present application, the transmitting end 210 may transmit the image to the server end 220 after detecting the image transmission instruction.
In other embodiments of the present application, the transmitting end 210 may also perform some checks after detecting the image transmission instruction, and send the image after the checks pass, so as to reduce unnecessary resource consumption.
Fig. 5 is a flow chart illustrating an image transmission method suitable for a transmitting end according to some embodiments of the present application. As shown in fig. 5, the image transmission method 500 suitable for the transmitting end may include the following steps:
S51, acquiring size information of a remaining area in the integration area of the server;
S52, comparing the size information of the remaining area with the size information of the image to be transmitted; and
S53, sending an integration request containing the image to the server in response to the comparison result indicating that the size of the remaining area is greater than or equal to the size of the image.
According to some embodiments of the present application, before sending an image, the transmitting end 210 queries the occupancy of the server 220's integrated image (i.e., the sub-channel situation) and sends the image only after determining that the remaining area (i.e., the spare channel capacity) meets the needs of the image to be transmitted, thereby reducing transmission failures caused by an insufficient remaining area and easing decision-making at the server 220.
For example, to improve transmission efficiency, the transmitting end 210 may send an integration verification request to the server 220 before transmitting the image. Upon receiving the request, the server 220 may determine and feed back the size information of the remaining area in the integration area, and the transmitting end 210 then performs the comparison based on the fed-back size information.
Optionally, if the comparison result indicates that the remaining area is smaller than the image, the transmitting end 210 may feed back a zoom prompt asking the user to confirm whether the image may be scaled (for example, reduced) during integration. In other words, if the remaining channel capacity is insufficient, the transmitting end 210 notifies the user, who decides whether image quality may be sacrificed by shrinking the image to fit. If the transmitting end 210 receives a scaling confirmation instruction, i.e., the user allows the image to be reduced, it sends an integration request for the image to the server 220. Upon receiving the request, the server 220 may process the image according to the image transmission method 300 illustrated in fig. 3 to complete the transmission.
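The sender-side flow of method 500 might look like the following sketch; `query_remaining_area`, `send_integration_request`, and `confirm_scaling` are hypothetical interfaces, not APIs defined by the patent, and `image.size` assumes a PIL-style image object.

```python
def try_send(server, image, confirm_scaling) -> bool:
    """Query the remaining area, compare sizes, and send only when the image
    fits; otherwise ask the user whether scaling is acceptable."""
    rem_w, rem_h = server.query_remaining_area()        # integration verification request
    img_w, img_h = image.size
    if img_w <= rem_w and img_h <= rem_h:
        return server.send_integration_request(image)   # fits: send directly
    if confirm_scaling():                                # zoom prompt to the user
        return server.send_integration_request(image)   # server may scale the image
    return False                                         # user declined; do not send
```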
Having described exemplary implementations of the steps of the image transmission method 500 illustrated in fig. 5, the process by which the receiving end 230 receives and processes image data is described next.
In some embodiments of the present application, fig. 6 is a schematic flow chart of an image transmission method suitable for a receiving end. For example, the image transmission method 600 suitable for the receiving end 230 may include the following steps:
S61, acquiring coding information from the acquired integrated data, wherein the coding information comprises the position information of each original image in the integrated image;
S62, acquiring an original image from the integrated image in the integrated data according to the coding information; and
S63, processing each original image.
According to some embodiments of the present application, the server 220 integrates a plurality of images into an integrated image to form integrated data, thereby simulating multi-channel transmission. After receiving the integrated data, the receiving end 230 determines the position of the original image it needs according to the position information of each original image in the integrated image, as indicated by the coding information, and thereby obtains the original image.
Take a screen-casting scenario as an example, with data transmitted between the server 220 and the receiving end 230 through a casting channel. The receiving end 230 may continuously receive integrated data at the casting channel's receiving port and then obtain the coding information from it, for example by identifying the encoded image in the integrated data, or by locating the coding region within the integrated image and decoding it. From the coding information, the receiving end 230 can determine the number of sub-channels in the integrated image (i.e., how many original images were integrated into it) and the position information of each sub-channel, i.e., of each original image. The position information of an original image may include the coordinates of its top-left corner together with its lateral dimension (width) and longitudinal dimension (height). The receiving end 230 then crops the integrated image according to the decoded information to obtain the original image it needs, as sketched below.
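The cut could look like the following; the dictionary layout mirrors the fields described above (sub-channel id, start coordinates, width, height) but is an assumption, as is the numpy-style array indexing.

```python
def extract_original(integrated, coding_info, wanted_id):
    """Look up the wanted sub-channel in the decoded coding information and
    crop the corresponding original image out of the integrated image."""
    for ch in coding_info["sub_channels"]:
        if ch["id"] == wanted_id:
            x, y = ch["x"], ch["y"]               # top-left corner
            w, h = ch["width"], ch["height"]      # lateral / longitudinal size
            return integrated[y:y + h, x:x + w]
    return None                                   # wanted channel not in this frame
```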
It should be appreciated that the receiving end 230 may also process the original image based on the processing instruction information in the encoded information and the like and then display the processed image without departing from the teachings of the present application. The relevant content of the processing instruction information may be referred to the above description of the relevant content, and will not be repeated herein.
Alternatively, image transmission between the server 220 and the receiving end 230 may run at 30 fps to ensure smoothness. It may also run at 60 fps; in that case a scheme similar to time-division multiplexing can be adopted, with adjacent frames carrying different encodings, so that data for more channels is transmitted. For example, with M transmitting ends, the (2t-1)-th integrated frame carries the images of casting ends 1 through Q, and the (2t)-th frame carries those of casting ends Q+1 through M, where t is greater than or equal to 1.
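A sketch of the frame-to-senders mapping in that time-division scheme (the function and parameter names are illustrative only):

```python
def senders_for_frame(frame_idx: int, m: int, q: int) -> range:
    """Odd frames (2t-1) carry casting ends 1..Q; even frames (2t) carry
    casting ends Q+1..M, per the 60 fps scheme described above."""
    return range(1, q + 1) if frame_idx % 2 == 1 else range(q + 1, m + 1)
```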
The above division of the methods into steps is for clarity of description; when implemented, steps may be combined into one or split into several, and as long as the same logical relationship is preserved they fall within the protection scope of this patent. Adding insignificant modifications to an algorithm or flow, or introducing insignificant designs, without altering the core design of the algorithm and flow likewise falls within the protection scope of this patent.
Embodiments of the present application also provide an electronic device. As shown in fig. 7, an electronic device 700 may include at least one processor and a memory communicatively connected to the at least one processor, the memory storing instructions executable by the at least one processor to enable it to perform the image transmission method 300 applicable to a server, the image transmission method 500 applicable to a transmitting end, or the image transmission method 600 applicable to a receiving end, as mentioned in the above embodiments.
As an example, the electronic device may be a transmitting end, such as a smart terminal (a mobile phone, a computer, etc.); in this case the instructions stored in the memory of the electronic device 700 are executed by the processor to implement the image transmission method 500 applicable to the transmitting end as mentioned in the above embodiments.
As an example, the electronic device may be, for example, a server, such as a cluster server, a cloud server, or the like, and the instructions stored in the memory of the electronic device 700 are executed by the processor to implement the image transmission method 300 applicable to the server as mentioned in the above embodiment.
As an example, the electronic device may be, for example, a receiving end, such as an on-board host, a smart tv, or the like, and the instructions stored in the memory of the electronic device 700 are executed by the processor to implement the image transmission method 600 applicable to the receiving end as mentioned in the above embodiment.
One embodiment of the present application also provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the image transmission method 300/500/600.
Fig. 7 is a schematic block diagram of an electronic device 700 according to some embodiments of the present application. As shown in fig. 7, the electronic device 700 includes a processor 701 that can perform various suitable actions and processes in accordance with a computer program stored in a Read Only Memory (ROM) 702 or a computer program loaded from a memory 708 into a Random Access Memory (RAM) 703. In the RAM 703, various programs and data required for the operation of the electronic device 700 may also be stored. The processor 701, the ROM 702, and the RAM 703 are connected to each other through a bus 704. An input/output (I/O) interface 705 is also connected to bus 704.
Various components in the electronic device 700 are connected to the I/O interface 705, including: an input unit 706, such as vehicle head-unit buttons, a touch screen, or the like; an output unit 707, connected to, for example, various types of displays and speakers to output various forms of signals; a memory 708, including any medium for storing computer-executable programs; and a communication unit 709, such as a network card, modem, or wireless communication transceiver. The communication unit 709 allows the electronic device 700 to exchange information/data with other devices through, for example, a local area network or another wireless communication network.
The processor 701 may be a variety of general and/or special purpose processing components having processing and computing capabilities. Some examples of processor 701 include, but are not limited to, a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), various specialized Artificial Intelligence (AI) computing chips, various processors running machine learning model algorithms, digital Signal Processors (DSPs), and any suitable processor, controller, microcontroller, etc. The processor 701 performs the various methods and processes described above, such as the image transmission methods 300/500/600 mentioned in the above embodiments. For example, in some embodiments, the image transmission methods 300/500/600 mentioned in the above embodiments may be implemented as a computer software program, which is tangibly embodied on a computer-readable storage medium, such as the memory 708. In some implementations, part or all of the computer program may be loaded and/or installed onto the electronic device 700 via the ROM 702 and/or the communication unit 709. When the computer program is loaded into the RAM 703 and executed by the processor 701, one or more steps of the image transmission method 300/500/600 mentioned in the above-described embodiment may be performed. Alternatively, in other embodiments, the processor 701 may be configured to perform the image transmission methods 300/500/600 mentioned in the above embodiments in any other suitable way (e.g. by means of firmware).
Various aspects of the present application are described herein with reference to flowchart illustrations and/or timing diagrams of methods, apparatus (systems) and computer program products according to exemplary embodiments of the application. It will be understood that each step of the flowchart and/or timing diagram, and combinations of steps in the flowchart and/or timing diagram, can be implemented by computer readable program instructions.
These computer readable program instructions may be provided to a processor in an electronic device, a processing unit of a general purpose computer, a special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, when executed by the processing unit of the computer or other programmable data processing apparatus, create means for implementing the functions/steps specified in the flowchart and/or sequence diagram block or blocks. These computer-readable program instructions may also be stored in a computer-readable storage medium that can direct a computer, programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer-readable medium having the instructions stored therein includes an article of manufacture including instructions which implement the function/act specified in the flowchart and/or sequence diagram block or blocks.
The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer, other programmable apparatus or other devices implement the functions/steps specified in the flowchart and/or sequence diagram block or blocks.
The flowcharts and time diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of devices, methods and computer program products according to various embodiments of the present application. In this regard, each step in the flowchart or timing diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the steps may occur out of the order noted in the figures. For example, two consecutive steps may in fact be performed substantially in parallel, they may sometimes also be performed in the opposite order, depending on the function involved. It will also be noted that each step of the timing diagrams and/or flowchart illustration, and combinations of steps in the timing diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The above description is merely illustrative of implementations of the application and of the principles of the technology applied. It should be understood by those skilled in the art that the scope of protection referred to in this application is not limited to the specific combination of the above technical features, but also encompasses other technical solutions formed by any combination of the above technical features or their equivalents without departing from the technical concept, for example solutions in which the above features are replaced with technical features of similar functions disclosed in (but not limited to) the present application.
Claims (15)
1. An image transmission method, comprising:
integrating images received from at least one transmitting end into an integration area to form an integrated image;
determining coding information according to the position information of the image in the integration area; and
determining and outputting integrated data according to the integrated image and the coding information.
2. The method of claim 1, wherein determining the integrated data comprises:
performing color block coding on a coding region of the integrated image according to the coding information, and determining the coded integrated image as the integrated data; or
encoding according to the coding information to obtain an encoded image, and determining the encoded image and the integrated image as the integrated data.
3. The method of claim 2, wherein performing color block coding on the coding region of the integrated image according to the coding information comprises:
dynamically determining the coding region of the integrated image according to the position information of the image; and
performing color block coding on the coding region of the integrated image according to the coding information.
4. A method according to any one of claims 1 to 3, wherein integrating the images received from the at least one transmitting end into an integration area to form an integrated image comprises:
sorting the images according to the image information of each image or the priority information of the transmitting end corresponding to the image; and
sequentially integrating the images into the integration area according to the sorting result to form the integrated image.
5. The method of claim 4, wherein sequentially integrating the images into the integration area according to the sorting result to form the integrated image comprises:
for the image currently to be integrated, determining the size information of the image according to the image information of the image, and comparing the size information of the image with the size information of the remaining area in the integrated image; and
integrating the image into the remaining area in response to the comparison result indicating that the size of the image is smaller than or equal to the size of the remaining area.
6. The method of claim 5, wherein the method further comprises:
in response to the comparison result indicating that the size of the image is larger than the size of the remaining area, performing scaling processing on the image, and integrating the scaled image into the remaining area.
7. A method according to any one of claims 1 to 3, wherein the method further comprises:
determining and feeding back the size information of the remaining area in the integration area in response to receiving an integration verification request from the transmitting end.
8. A method according to any one of claims 1 to 3, wherein determining coding information from the position information of the image in the integration area comprises:
determining the coding information according to the position information and the auxiliary information of the image; wherein the auxiliary information includes at least one of: position information of the coding region, identification information of the image, processing instruction information of the image, integration time information of the image, and verification information of the image.
9. A method according to any one of claims 1 to 3, wherein outputting the integrated data comprises:
outputting the integrated data through a screen projection channel.
10. An image transmission method of a transmitting end, comprising:
acquiring size information of a remaining area in an integration area of a server;
comparing the size information of the remaining area with the size information of the image to be transmitted; and
sending an integration request containing the image to the server in response to the comparison result indicating that the size of the remaining area is greater than or equal to the size of the image.
11. The image transmission method according to claim 10, wherein the method further comprises:
in response to the comparison result indicating that the size of the remaining area is smaller than the size of the image, feeding back zoom prompt information, wherein the prompt information is used for prompting a user to confirm whether the image is allowed to be scaled in the integration process; and
sending an integration request containing the image to the server in response to receiving a scaling confirmation instruction.
12. An image transmission method of a receiving end, comprising:
acquiring coding information from the acquired integrated data, wherein the coding information comprises position information of each original image in the integrated image;
acquiring each original image from the integrated image in the integrated data according to the coding information; and
processing each original image.
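On the receiving end, claim 12 reduces to cropping each recorded rectangle back out of the integrated image. The sketch below assumes the coding information arrives as the dict shown after claim 8 and that the integrated image is an H x W x 3 numpy array; both are illustrative assumptions.

```python
# Minimal receiving-end sketch; data layout is assumed as above.
import numpy as np

def extract_originals(integrated: np.ndarray, coding_info: dict) -> dict:
    """Crop each original image out of the integrated image."""
    originals = {}
    for entry in coding_info["images"]:
        p = entry["position"]
        crop = integrated[p["y"]:p["y"] + p["h"], p["x"]:p["x"] + p["w"]]
        originals[entry["image_id"]] = crop  # hand off for per-image processing
    return originals
```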
13. An image transmission system, comprising:
at least one transmitting end configured to transmit an image to be processed;
a server configured to integrate the images sent by the transmitting end into an integration area to form an integrated image, determine coding information according to the position information of the images in the integration area, and determine and output integrated data according to the integrated image and the coding information; and
a receiving end configured to receive the integrated data, acquire the coding information from the integrated data, acquire the images from the integrated image in the integrated data according to the coding information, and process the images.
14. An electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the image transmission method of any one of claims 1 to 9, or the image transmission method of claim 10 or 11, or the image transmission method of claim 12.
15. A computer-readable storage medium storing a computer program, characterized in that the computer program, when executed by a processor, implements the image transmission method according to any one of claims 1 to 9, or the image transmission method according to claim 10 or 11, or the image transmission method according to claim 12.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202311467321.8A | 2023-11-06 | 2023-11-06 | Image transmission method, image transmission system, electronic device, and storage medium |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| CN117528092A (en) | 2024-02-06 |
Family
ID=89746893
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN202311467321.8A | Image transmission method, image transmission system, electronic device, and storage medium | 2023-11-06 | 2023-11-06 |
Country Status (1)
| Country | Link |
|---|---|
| CN (1) | CN117528092A (en) |
Citations (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN104143200A (en) * | 2014-07-16 | 2014-11-12 | South China University of Technology | A Method of Frame Type Coding and Intelligent Recognition of Image Additional Information |
| CN112073731A (en) * | 2020-09-16 | 2020-12-11 | Xiamen Meiya Pico Information Co., Ltd. | Image decoding method, image decoding device, computer-readable storage medium and electronic equipment |
| CN112822516A (en) * | 2020-12-30 | 2021-05-18 | Peking University | Image group transmission method, device, equipment and system based on data block recombination |
| CN113438747A (en) * | 2021-06-07 | 2021-09-24 | Shenzhen Transsion Manufacturing Co., Ltd. | Processing method, processing apparatus, and storage medium |
| CN113595984A (en) * | 2021-06-29 | 2021-11-02 | Beijing Laiye Network Technology Co., Ltd. | Data transmission method and device combining RPA and AI, electronic equipment and storage medium |
| CN116149581A (en) * | 2021-11-19 | 2023-05-23 | Huawei Device Co., Ltd. | A multi-window object projection method, electronic equipment and system |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | PB01 | Publication | |
| | SE01 | Entry into force of request for substantive examination | |