WO2018120519A1 - Method and Device for Image Processing (Procédé et dispositif de traitement d'image) - Google Patents
- Publication number
- WO2018120519A1 (PCT/CN2017/080645)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- region
- resolution
- area
- image
- video image
- Prior art date
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/2343—Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements
- H04N21/234345—Processing of video elementary streams involving reformatting operations of video signals, the reformatting operation being performed only on part of the stream, e.g. a region of the image or a time segment
- H04N21/4728—End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for selecting a Region Of Interest [ROI], e.g. for requesting a higher resolution version of a selected region
Definitions
- the present invention relates to the field of image processing, and more particularly to a method and apparatus for image processing.
- The Content Delivery Network (CDN) is a network built on top of the existing transport network.
- A CDN relies on edge servers deployed in various locations and on the load balancing, content distribution, and scheduling modules of a central platform, so that users obtain the content they need nearby, network congestion is reduced, and access response speed and hit rate are improved.
- The technical architecture for transmitting video over a CDN is shown in FIG. 1. First, the network server uploads the video file to the file server; the file server then distributes the video file to multiple content delivery networks; finally, the user obtains or watches the video from the nearest delivery network through a playback terminal.
- Although the CDN approach solves the transmission and distribution of video, when the data volume of the video is too large and the transmission bandwidth is small or limited, the user may experience problems such as stuttering while watching the video and cannot obtain a good viewing experience.
- Embodiments of the present invention provide a method and apparatus for image processing, which can process the video images of a target video under a limited transmission bandwidth, reducing the data volume of the target video while providing a better viewing experience for the user, so that the more useful video data is transmitted to the user's playback terminal.
- A first aspect of the embodiments of the present invention provides a method for image processing, including: processing a first region of a target video image at a first resolution, where the first region includes a hotspot region of the target video image; processing a second region of the target video image at a gradient resolution smaller than the first resolution, where, within the second region, the closer an image position is to the hotspot region, the higher its corresponding resolution, the second region is located outside the first region, and the second region and the first region together constitute the target video image; and sending the processed target video image to the playback terminal.
- the hotspot area of the target video image is a focused area of the user's line of sight, that is, a certain area that the user pays attention to or is interested in.
- the hotspot area may be, for example, a ball in a ball game, specifically, soccer, volleyball, basketball, etc. It can be a singer in a concert, specifically a singer's face, a participant in a meeting, and so on.
- The target video image may be divided into the first region and the second region according to the hotspot region, so that the hotspot region lies within the first region; the first region is then processed at the first resolution and the second region at a gradient resolution, and finally the processed target video image is sent to the user's playback terminal, where the first resolution is greater than every resolution in the second region and the resolution of the second region decreases as the distance from the hotspot region increases.
- By processing the video image in this way, the resolution of the hotspot region is higher while the resolution of the other regions is lower, and the data volume of the processed video image is smaller than that of the original video image, so the useful video data, that is, the video data the user wants to watch, is transmitted to the user's playback terminal, ensuring the user's viewing experience while saving bandwidth cost.
- Before the first region of the target video image is processed at the first resolution, the method further includes: determining the hotspot region of the target video image; and determining the first region and the second region of the target video image according to the hotspot region.
- Determining the first region of the target video image according to the hotspot region includes: determining the center of the hotspot region; and determining, as the first region of the target video image, an area centered on that center with a radius equal to a first preset distance.
- Alternatively, determining the first region of the target video image according to the hotspot region includes: determining the minimum regular graphic area containing the hotspot region as the first region of the target video image. The minimum regular graphic area includes, but is not limited to, a circular area, a rectangular area, or a diamond-shaped area.
- Before the first region of the target video image is processed at the first resolution, the method further includes: determining the transmission bandwidth to the playback terminal; and determining the first resolution and the gradient resolution according to the area of the first region, the area of the second region, and the transmission bandwidth.
- The transmission bandwidth to the playback terminal may be determined according to a target image acquisition request sent by the user through the playback terminal, for example with the bandwidth data carried in that request; it may be determined according to the access bandwidth of the playback terminal, where the access bandwidth is the user's actual network bandwidth; or it may be determined according to the effective bandwidth that the transmission network can provide.
- Determining the gradient resolution includes determining the resolution of a target image position in the second region according to the distance between that target image position and the center of the hotspot region.
- Determining the resolution of a target image position in the second region according to the distance between that target image position and the center of the hotspot region includes: obtaining the distance between the target image position in the second region and the center of the hotspot region; and determining the resolution of the target image position in the second region according to a gradient function, where the gradient function characterizes the correspondence between resolution and distance.
- the distance may be in units of centimeters, inches, pixels, and the like.
- The second region may include a third region and a fourth region, where the shortest distance between the third region and the center of the hotspot region is smaller than the shortest distance between the fourth region and the center of the hotspot region. In this case, determining the resolution of a target image position in the second region according to its distance from the center of the hotspot region includes: obtaining the distance between a target image position in the third region and the center of the hotspot region; determining the resolution of the target image position in the third region according to a gradient function, where the gradient function characterizes the correspondence between resolution and distance; and determining a second resolution as the resolution of target image positions in the fourth region, where the second resolution is less than or equal to the minimum resolution of the target image positions in the third region and is a fixed resolution.
- The second resolution may be 1/N of the first resolution, where N is an integer greater than 1; or the second resolution may be 1/P of the minimum resolution of the target image positions in the third region, where P is an integer greater than or equal to 1.
- The gradient function may be a parabolic function, an elliptic function, a linearly decreasing function, or any other gradual function that conforms to the characteristics of human vision.
- Alternatively, the second region includes K sub-regions, the gradient resolution is a discretely varying resolution, and the resolution corresponding to each sub-region is fixed, where the shortest distance between the (K-1)th sub-region and the center of the hotspot region is smaller than the shortest distance between the Kth sub-region and the center of the hotspot region, and K is a positive integer greater than or equal to 2.
- In this case, determining the resolution of a target image position in the second region according to its distance from the center of the hotspot region includes: obtaining the distance between the target image position in the second region and the center of the hotspot region; determining, according to that distance, the sub-region in which the target image position is located; and determining the resolution of the target image position according to the correspondence between resolutions and sub-regions.
- The shape of a sub-region may be a regular shape such as a ring or a rectangle.
- The correspondence between resolutions and sub-regions includes: the resolution corresponding to the first sub-region of the target video image is 1/L of the first resolution, or the difference between the first resolution and a first preset value, where L is an integer greater than 1.
- The correspondence further includes: the resolution corresponding to the Kth sub-region of the target video image is 1/M of the resolution corresponding to the (K-1)th sub-region, where M is an integer greater than 1; or the resolution corresponding to the Kth sub-region is the difference between the resolution corresponding to the (K-1)th sub-region of the target video image and a second preset value.
- The method further includes: determining the target bandwidth required for transmitting the target video image and the transmission bandwidth to the playback terminal; and, when the target bandwidth is greater than the transmission bandwidth, performing the step of processing the first region of the target video image at the first resolution.
- the target video image is an original target video image, that is, a target video image before processing.
- the resolution of the gradation comprises a resolution that varies continuously or a resolution that varies discretely.
- a second aspect of the embodiments of the present invention provides an apparatus for image processing, including:
- a first area processing module configured to process, in a first resolution, a first area of the target video image, where the first area includes a hotspot area of the target video image;
- a second area processing module configured to process the second area of the target video image at a gradient resolution smaller than the first resolution, wherein, within the second area, the closer an image position is to the hotspot area, the higher its corresponding resolution; the second area is located outside the first area, and the second area and the first area together constitute the target video image;
- a sending module configured to send the processed target video image to the playing terminal.
- the apparatus for image processing provided by the second aspect of the present invention further includes other program modules for performing the image processing method provided by the first aspect of the embodiments of the present invention, and details are not described herein again.
- A third aspect of the embodiments of the present invention provides an apparatus for image processing, including a processor, a memory, and a communication interface that are connected to one another, wherein the communication interface is configured to receive and transmit data, the memory is configured to store application program code supporting the apparatus in performing the above method, and the processor is configured to perform the methods of the first aspect described above.
- a fourth aspect of the embodiments of the present invention provides a computer storage medium for storing computer program instructions for use in the apparatus for image processing described above, comprising a program for performing the above first aspect.
- a fifth aspect of the embodiments of the present invention provides a computer program for performing the various methods provided by the above first aspect.
- FIG. 1 is a technical architecture diagram of content distribution using a CDN in the prior art
- FIG. 2 is a schematic structural diagram of a system for performing live video broadcast according to an embodiment of the present invention
- FIG. 3 is a schematic flowchart of a method for image processing according to an embodiment of the present invention.
- FIG. 4a is a target video image, provided by an embodiment of the present invention, in which the hotspot area and the non-hotspot area are not distinguished
- FIG. 4b is a target video image after distinguishing a hot spot area according to an embodiment of the present invention.
- FIG. 5a is a schematic diagram of a relationship between a hot spot area, a first area, and a second area according to an embodiment of the present invention
- FIG. 5b is a schematic diagram of another hotspot area, a first area, and a second area according to an embodiment of the present invention.
- FIG. 5c is a schematic diagram of another relationship between a hot spot area, a first area, and a second area according to an embodiment of the present invention.
- FIG. 6 is a distribution diagram of a first area and a second area according to an embodiment of the present invention.
- FIG. 7a is still another distribution diagram of a first area and a second area according to an embodiment of the present invention.
- FIG. 7b is still another distribution diagram of the first area and the second area provided by the embodiment of the present invention.
- FIG. 8a is still another distribution diagram of a first area and a second area according to an embodiment of the present invention.
- FIG. 8b is still another distribution diagram of the first area and the second area provided by the embodiment of the present invention.
- FIG. 9a is a target video image before processing according to an embodiment of the present invention.
- FIG. 9b is a processed target video image according to an embodiment of the present invention.
- FIG. 10 is a schematic structural diagram of an apparatus for image processing according to an embodiment of the present invention.
- FIG. 11 is a schematic structural diagram of another apparatus for image processing according to an embodiment of the present invention.
- As shown in FIG. 2, a system structure for performing live video broadcast includes a video collection end 101, a video splicing server 102, a network server 103, a transmission distribution network 104, and playback terminals 105.
- The video collection end 101 is configured to capture various initial video images, such as live video images, and send the collected video images to the video splicing server 102.
- The video collection end 101 may be, for example, a video camera or a still camera; the video splicing server 102 is configured to splice the video images collected by the video collection end 101 into the target video.
- the transmission distribution network 104 includes a plurality of cache servers/streaming servers, and the transmission distribution network 104 distributes the target video to the respective playback terminals 105.
- The transmission distribution network 104 may be, for example, a CDN, and the playback terminal may be, for example, a mobile phone, a virtual reality (English: Virtual Reality, VR) device, a tablet, or another device that can be used to play video.
- the video may be a VR video or a normal video.
- The system structure shown in FIG. 2 can transmit the original target video to the user's playback terminal, and the image processing method provided by the embodiments of the present invention can be implemented in it. The video splicing server may perform the step of processing the original target video image, where the original target video image may be any video image of the target video; the step specifically includes processing the first region of the target video image at the first resolution and processing the second region of the target video image at a gradient resolution. Alternatively, the processing of the target video image may be performed by a cache server/streaming server in the transmission distribution network, or an image processing node may be added to the system structure to perform the processing, for example by adding an image processing server to the transmission distribution network. The processed target video image is then sent through the transmission distribution network to the user's playback terminal, which displays the video data useful to the user.
- image processing method provided by the embodiment of the present invention may also be performed by two or more servers in the foregoing system structure, and no further enumeration is made herein.
- the system structure shown in FIG. 2 may also be a schematic structural diagram of a system for performing video on demand.
- In video on demand, the video splicing server 102 sends the target video to the network server 103, and after the network server 103 distributes the target video to the plurality of transmission distribution networks 104, the target video may be cached or stored in a cache server/streaming server for a period of time. When the user needs to view the target video, the playback terminal initiates a target video acquisition request to the network server; the network server returns to the playback terminal the address of the cache server/streaming server closest to it, and the playback terminal accesses that cache server/streaming server according to the address to view the target video.
- In this case, the step of processing the target video image may be performed by a cache server/streaming server in the transmission distribution network, or an image processing server may be added to the transmission distribution network to perform the processing of the target video image.
- FIG. 3 is a schematic flowchart diagram of a method for image processing according to an embodiment of the present invention.
- The method of the present invention may be implemented on the servers mentioned above, specifically on at least one of a video splicing server, a cache server/streaming media server, and an image processing server. The method includes, but is not limited to, the following steps:
- Step S201 processing a first area of the target video image with a first resolution, where the first area includes a hotspot area of the target video image.
- the hotspot area is the focus area of the user's line of sight, that is, compared with other areas of the target video image, the hotspot area is an area where the user is more concerned or interested, and the area that the user desires to see more clearly.
- Before processing, the hotspot region of the target video image may first be determined, and the first region and the second region of the target video image may then be determined according to the hotspot region.
- For example, if the target video image is a video image of a live soccer game, the area where the soccer ball is located, or the face area of a particular player, may be determined as the hotspot region; if the target video image is a video image of a live concert, the area where the singer's face or body is located may be determined as the hotspot region; if the target video image is a video image of a large conference, such as a national conference, the area where the speaker's face or body is located may be determined as the hotspot region.
- FIG. 4a is a target video image in which the hotspot region and the non-hotspot region are not distinguished, and FIG. 4b is the same target video image after the hotspot region has been distinguished. FIG. 4a is a video image from a basketball game video containing players, the court, the basket, the basketball, and so on. The image area delimited by the white region, that is, the area corresponding to the basketball, is the hotspot region, and the black region, corresponding to the area outside the basketball, is the non-hotspot region.
- The hotspot region of the target video image may be determined by an image-based target recognition method, where the image-based target recognition method may be a feature-based target detection method, or a target detection method based on texture and boundary features.
- For example, Haar-like features may be adopted. A Haar-like feature is a simple rectangular feature, named after the Haar wavelet, that reflects the grayscale changes of the local features of the detected object. The Haar-like features of the to-be-identified region of the target video image (which includes both the hotspot region and the non-hotspot region) are extracted, the feature values of that region are obtained, and the AdaBoost (Adaptive Boosting) algorithm is then used to classify these feature values so as to determine the hotspot region. Alternatively, features of pixel points may be extracted using a Harris operator, a Förstner operator, a Moravec operator, or the like, and clustering operations may then be performed to form different categories of features.
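- As a minimal illustration of this kind of feature-based hotspot detection, the sketch below uses OpenCV's bundled Haar frontal-face cascade as a stand-in for a detector trained on the actual hotspot object (ball, singer's face, speaker); the cascade file and helper name are assumptions, not part of the patent.

```python
# Minimal sketch: locate a hotspot region with a Haar-cascade detector (OpenCV).
# The bundled frontal-face cascade stands in for a detector trained on the real
# hotspot object (ball, speaker, singer's face, ...).
import cv2

def detect_hotspot(frame_bgr):
    """Return (cx, cy, w, h) of the most prominent detection, or None."""
    cascade_path = cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
    detector = cv2.CascadeClassifier(cascade_path)
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    boxes = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(boxes) == 0:
        return None
    # Keep the largest detection as the hotspot.
    x, y, w, h = max(boxes, key=lambda b: b[2] * b[3])
    return (x + w // 2, y + h // 2, w, h)
```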
- the center of the first area may coincide with the center of the hot spot area, so that the hot spot area is located at an intermediate position of the first area.
- The first region and the second region may be determined in at least two ways: 1) the center of the hotspot region is determined, and the area centered on that center with a radius equal to a first preset distance is determined as the first region of the target video image, where the first preset distance is greater than the maximum distance between the center of the hotspot region and the edge of the hotspot region, and the area outside the first region is correspondingly the second region; 2) the minimum regular graphic area containing the hotspot region is determined as the first region of the target video image, where the minimum regular graphic may be a circle, a rectangle, a triangle, a diamond, or another shape, and the area outside the first region is correspondingly the second region.
- The unit of the first preset distance may be an image distance unit such as centimeters, inches, or pixels. The first preset distance may be set according to the length and width of the total area of the target video image, for example 1/2 or 1/3 of the length of the total area; it may also be set according to the longest distance between the center of the hotspot region and the edge of the hotspot region, for example the sum of that longest distance and a preset threshold, where the preset threshold may be, for example, a distance of 1 unit.
- The size of the first region may be fixed or set by the user; for example, if the user sets the first preset distance to 2 cm, the first region is determined to be a circular area centered on the center of the hotspot region with a radius of 2 cm, and the area of the first region is 4π square centimeters.
- In addition to determining the first region and the second region in the above two ways, the hotspot region may be directly determined as the first region, with the area outside the first region being the second region.
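- A sketch of these two ways of deriving the first region and the complementary second region, assuming the hotspot is given as a pixel-unit bounding box (the helper name and inputs are illustrative):

```python
# Sketch: derive the first region from a hotspot bounding box, in pixel units.
# Way 1: a circle around the hotspot center whose radius is a first preset
#        distance (larger than the hotspot's own extent).
# Way 2: the minimum regular graphic (here: axis-aligned rectangle) containing
#        the hotspot. Everything outside the first region is the second region.
import numpy as np

def first_region_masks(frame_shape, hotspot_box, preset_radius=None):
    h, w = frame_shape[:2]
    x, y, bw, bh = hotspot_box                      # hotspot bounding box
    cx, cy = x + bw / 2.0, y + bh / 2.0             # hotspot center
    ys, xs = np.mgrid[0:h, 0:w]
    if preset_radius is not None:
        # Way 1: circular first region of radius `preset_radius` around the center.
        dist = np.hypot(xs - cx, ys - cy)
        first = dist <= preset_radius
    else:
        # Way 2: minimal enclosing rectangle of the hotspot as the first region.
        first = (xs >= x) & (xs < x + bw) & (ys >= y) & (ys < y + bh)
    second = ~first                                  # second region = complement
    return first, second
```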
- FIG. 5a is a schematic diagram of a relationship between a hot spot area, a first area, and a second area according to an embodiment of the present invention.
- In FIG. 5a, the first region is a circular area centered on the center of the hotspot region with the first preset distance as its radius; the hotspot region lies within the first region, and the area outside the first region is the second region. FIG. 5b is a schematic diagram of another relationship between the hotspot region, the first region, and the second region; in FIG. 5b, the first region is the smallest square area containing the hotspot region, the center of the first region coincides with the center of the hotspot region, and the area outside the first region is the second region. FIG. 5c is a schematic diagram of yet another relationship between the hotspot region, the first region, and the second region provided by an embodiment of the present invention.
- Before processing, the transmission bandwidth to the playback terminal and the target bandwidth required for transmitting the target video image may be determined. If the target bandwidth is greater than the transmission bandwidth, step S201 is performed; if the target bandwidth is smaller than the transmission bandwidth, the target video image is not processed and the unprocessed/original target video image is transmitted directly to the playback terminal.
- The transmission bandwidth can be determined in at least the following ways (see the sketch after this list):
- 1) The access bandwidth of the playback terminal is determined as the transmission bandwidth, that is, the user's actual bandwidth is used as the transmission bandwidth.
- 2) When the bandwidth resources of the server are not limited, a bandwidth indication sent by the user through the playback terminal is received, and the transmission bandwidth is determined according to that indication, that is, the bandwidth specified by the user is used as the transmission bandwidth. For example, suppose the user's current access bandwidth is 20M but the user does not want all of it to be used to transmit the target video image and only wants to use 10M of bandwidth; the user can set the playback bandwidth to 10M on the playback terminal, the playback terminal sets the bandwidth indication to 10M, and after receiving the bandwidth indication the server determines 10M as the transmission bandwidth.
- 3) When the bandwidth resources of the server are limited, the current idle bandwidth of the server is determined according to the server's bandwidth resource allocation, and part of that idle bandwidth is then assigned, according to a certain bandwidth allocation rule, as the transmission bandwidth to the playback terminal; in this case, the transmission bandwidth is less than or equal to the access bandwidth of the playback terminal.
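- A minimal sketch of this bandwidth selection and of the decision whether to process, with hypothetical placeholder values for the access bandwidth, the user-set cap, and the server's idle bandwidth (all names are illustrative, not from the patent):

```python
# Sketch: pick a transmission bandwidth, then decide whether to process the frame.
# All inputs are hypothetical placeholders for values the server would obtain
# from the playback terminal's request and from its own resource accounting.
def choose_transmission_bandwidth(access_bw, user_cap=None, server_idle_bw=None):
    bw = access_bw                       # way 1: user's actual access bandwidth
    if user_cap is not None:
        bw = min(bw, user_cap)           # way 2: user-specified bandwidth indication
    if server_idle_bw is not None:
        bw = min(bw, server_idle_bw)     # way 3: share of the server's idle bandwidth
    return bw

def needs_processing(target_bw, transmission_bw):
    # Process only when the original image would not fit the available bandwidth.
    return target_bw > transmission_bw

# Example: 20M access link, user caps playback at 10M, original stream needs 25M.
bw = choose_transmission_bandwidth(access_bw=20, user_cap=10)
print(needs_processing(target_bw=25, transmission_bw=bw))   # True -> run S201/S202
```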
- Step S202 processing a second region of the target video image with a gradient resolution, the resolution of the gradient being smaller than the first resolution, wherein the second region is closer to the hotspot region The higher the resolution corresponding to the image at the location, the second region is outside the first region, and the second region and the first region constitute the target video image.
- The first resolution and the gradient resolution may be determined according to the transmission bandwidth to the playback terminal. Specifically, the first resolution may be determined according to the transmission bandwidth, the area of the first region, and the area of the second region; the resolution of each target image position in the second region is then determined according to the correspondence between resolution and the distance between that position and the center of the hotspot region. The sizes of the first region and the second region may be measured by area or by the total number of pixels.
- Because the resolution in the second region must satisfy the condition that the closer an image position is to the hotspot region, the higher its resolution, at least the following resolution change rules may be set for the second region: 1) the resolution of the second region changes continuously, with a functional correspondence between resolution and distance, where the distance refers to the distance between a target image position in the second region and the center of the hotspot region, and the larger the distance, the smaller the function value; 2) the resolution of the second region varies discretely, where the second region is divided into a plurality of sub-regions, the resolution within each sub-region is fixed, different sub-regions correspond to different resolutions, and the farther a sub-region is from the center of the hotspot region, the lower its resolution; 3) the resolution changes continuously in one part of the second region and discretely in another part.
- The gradient function between resolution and distance may be, for example, a parabolic function, an elliptic function, or a linearly decreasing function, that is, any functional relationship that conforms to the characteristics of human vision; a sketch of such functions is given below.
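- A minimal sketch of such candidate gradient functions (the specific coefficients are illustrative assumptions, not values prescribed by the patent):

```python
# Sketch: candidate gradient functions mapping distance d (from the hotspot
# center) to a resolution. Each is decreasing in d, as required; the concrete
# coefficients below are illustrative only.
import math

def linear_gradient(d, r1, d1, slope=1.0):
    """Linearly decreasing: resolution R1 at the first-region edge d1, then drops."""
    return max(r1 - slope * (d - d1), 0.0)

def parabolic_gradient(d, r1, d1, k=0.05):
    """Parabolic decrease: falls off quadratically with distance beyond d1."""
    return max(r1 - k * (d - d1) ** 2, 0.0)

def elliptic_gradient(d, r1, d1, d_max):
    """Elliptic decrease: R1 at d1, reaching 0 at the outermost distance d_max."""
    t = min(max((d - d1) / (d_max - d1), 0.0), 1.0)
    return r1 * math.sqrt(1.0 - t ** 2)

# Example: resolution per unit area at a few distances, with R1=64, d1=10, d_max=60.
for d in (10, 30, 50):
    print(d,
          linear_gradient(d, 64, 10),
          parabolic_gradient(d, 64, 10),
          elliptic_gradient(d, 64, 10, 60))
```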
- The first resolution and gradient resolution determined in the following scenarios refer to resolutions per unit area; if the total resolution of a certain area of the target video image is needed, the resolution per unit area is multiplied by the corresponding area to obtain the total resolution of that area.
- In the following scenarios, the center of the first region is taken to be the center of the hotspot region; in other implementations, the center of the first region may not coincide with the center of the hotspot region.
- Implementation scenario 1: the resolution of the second region changes continuously, with a functional correspondence between resolution and distance. The distance is the independent variable of the gradient function and the resolution is the dependent variable, and they satisfy a decreasing correspondence, that is, the dependent variable decreases as the independent variable increases; any decreasing function may therefore be used as the gradient function between resolution and distance.
- The size of the first region may first be determined, together with the maximum radius of the second region, where the maximum radius refers to the maximum distance between the center of the hotspot region and the edge of the second region. The resolution corresponding to the transmission bandwidth is then determined; the first resolution is determined according to the gradient function and the principle that the total number of pixels remains constant; and the resolution at each distance is determined from the gradient function of distance and resolution. Here, "the total number of pixels remains constant" means that, for a target video image of the same size, the total number of pixels contained in the target video image before the resolution change equals the total number of pixels contained in it after the resolution change. The resolution R0 described in this embodiment refers to the resolution corresponding to the transmission bandwidth, that is, the resolution per unit area of the transmitted target video image that the transmission bandwidth can support without stuttering playback; R1, R0, and R2 all refer to resolutions per unit area.
- For example, the second region is the ring between two concentric circles centered on the center of the hotspot region, as shown in FIG. 6. In this case, the radius d1 of circle 1 is the distance between the center of the hotspot region and the inner boundary of the second region (circle 1 bounds the first region), and circle 2 is the smallest circle containing the second region, whose radius d2 is the maximum distance between the center of the hotspot region and the outer boundary of the second region. According to the principle that the total number of pixels is constant, that is, that the total number of pixels of circle 1 plus the total number of pixels of the ring equals the total number of pixels of circle 2 at resolution R0, the following formula is obtained:
- R1 = (R0·d2² + (1/3)·d1³ - (2/3)·d2³ - d2²·d1) / d2²
- d1, d2 may be obtained after determining the first area and the second area.
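- As a sketch of the constant-pixel-count constraint, assuming for illustration a unit-slope linear gradient R(r) = R1 - (r - d1) inside the ring and the availability of SciPy, R1 can also be found numerically; the closed form above corresponds to one particular choice of gradient:

```python
# Sketch: solve for R1 under the constant-pixel-count principle in scenario 1.
# Assumes an illustrative unit-slope linear gradient R(r) = R1 - (r - d1) inside
# the ring d1 <= r <= d2; pixels(circle 1) + pixels(ring) = pixels(circle 2)@R0.
import math
from scipy.integrate import quad
from scipy.optimize import brentq

def pixel_budget_mismatch(r1, r0, d1, d2):
    circle1 = math.pi * d1 ** 2 * r1                       # first region at R1
    ring, _ = quad(lambda r: 2 * math.pi * r * max(r1 - (r - d1), 0.0), d1, d2)
    circle2 = math.pi * d2 ** 2 * r0                       # whole area at R0
    return circle1 + ring - circle2

def solve_r1(r0, d1, d2):
    # R1 must exceed R0 (the hotspot gets more than the average pixel budget).
    return brentq(lambda r1: pixel_budget_mismatch(r1, r0, d1, d2), r0, 100 * r0)

print(solve_r1(r0=32.0, d1=10.0, d2=60.0))
```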
- Implementation scenario 2: the resolution of the second region varies discretely and the second region has K sub-regions, where the shortest distance between the (K-1)th sub-region and the center of the hotspot region is smaller than the shortest distance between the Kth sub-region and the center of the hotspot region; visually, the Kth sub-region lies outside of, or surrounds, the (K-1)th sub-region.
- The resolutions of the sub-regions may be related to one another so that the resolution changes regularly between sub-regions; for example, the resolution of the Kth sub-region is 1/M of the resolution of the (K-1)th sub-region, where M is an integer greater than 1, or the resolution of the Kth sub-region is the resolution of the (K-1)th sub-region minus a second preset value. The resolution of the sub-region closest to the center of the hotspot region, that is, the first sub-region, is then determined: it is 1/L of the first resolution, or the first resolution minus a first preset value, where L is an integer greater than 1. L and M may be equal or different, and the first preset value and second preset value may be equal or different.
- Alternatively, the resolutions of the sub-regions may not be related to one another, that is, the resolution may change irregularly from sub-region to sub-region, with each sub-region related only to the first resolution. For example, the resolution of the first sub-region is the first resolution minus the first preset value, the resolution of the second sub-region is the first resolution minus the second preset value, the resolution of the third sub-region is 1/L of the first resolution, ..., and the resolution of the Kth sub-region is 1/M of the first resolution; this is only an example, and in addition every sub-region must satisfy the condition that "the farther away from the hotspot region, the lower the corresponding resolution". The first resolution here may also be replaced by a preset fixed resolution smaller than the first resolution. The following examples determine the resolution relationships between sub-regions and the resolution of each region with reference to specific divisions of the first region and the second region.
- As shown in FIG. 7a, in another distribution of the first region and the second region, the first region is a circle of radius d1 whose resolution is the first resolution R1; the second region is the gray area outside the first region and has three sub-regions. The first sub-region, closest to the first region, is a ring with inner radius d1 and outer radius d2; the second sub-region, adjacent to the first sub-region, is also a ring, with inner radius d2 and outer radius d3; the third sub-region lies outside the second sub-region. The area of the target video image is S, and the resolution corresponding to the transmission bandwidth is R0.
- Suppose the resolution of the first sub-region is 1/L of the first resolution and the resolution of the Kth sub-region is 1/M of the resolution of the (K-1)th sub-region, with both L and M taken to be 2. According to the principle that the total number of pixels is constant, that is, that the total number of pixels of the first region plus that of the second region equals the total number of pixels of the video image, and that the sum of the total number of pixels of the sub-regions equals the total number of pixels of the second region, the following formula is obtained:
- R1 = (R0·S) / ((1/2)·π·d1² + (1/4)·π·d2² + (1/8)·π·d3² + (1/8)·S)
- Thus the first resolution R1 can be determined from R0, d1, d2, d3, and S, and the resolution of each sub-region follows from its relationship to R1: the resolution of the first sub-region is (1/2)R1, the resolution of the second sub-region is (1/4)R1, and the resolution of the third sub-region is (1/8)R1.
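- A minimal sketch that directly encodes the formula above for this halving layout (the numeric inputs are illustrative):

```python
# Sketch: scenario 2 with three concentric ring sub-regions whose resolutions
# halve from one sub-region to the next (L = M = 2). Encodes the formula above.
import math

def first_resolution_halving(r0, s, d1, d2, d3):
    denom = (0.5 * math.pi * d1 ** 2
             + 0.25 * math.pi * d2 ** 2
             + 0.125 * math.pi * d3 ** 2
             + 0.125 * s)
    return r0 * s / denom

# Illustrative numbers: image area S, radii d1 < d2 < d3, bandwidth resolution R0.
r1 = first_resolution_halving(r0=32.0, s=2_000_000.0, d1=200.0, d2=400.0, d3=600.0)
print(r1, r1 / 2, r1 / 4, r1 / 8)   # R1 and the three sub-region resolutions
```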
- Alternatively, suppose the resolution of the first sub-region is the first resolution minus a first preset value and the resolution of the Kth sub-region is the resolution of the (K-1)th sub-region minus a second preset value, with both preset values taken to be r. The principle that the total number of pixels does not change again means that the total number of pixels of the first region plus that of the second region equals the total number of pixels of the video image, and that the sum of the total number of pixels of the sub-regions equals the total number of pixels of the second region. In this case:
- π·d1²·R1 is the total number of pixels in the first region;
- (π·d2² - π·d1²)·(R1 - r) is the total number of pixels in the first sub-region;
- (π·d3² - π·d2²)·(R1 - 2r) is the total number of pixels in the second sub-region;
- (S - π·d3²)·(R1 - 3r) is the total number of pixels in the third sub-region;
- R0·S is the total number of pixels of the entire target video image, so that:
- R1 = (R0·S + 3r·S - π·d1²·r - π·d2²·r - π·d3²·r) / S
- Thus the first resolution R1 can be determined from R0, d1, d2, d3, S, and r, and the resolution of each sub-region follows from its relationship to R1: the resolution of the first sub-region is R1 - r, the resolution of the second sub-region is R1 - 2r, and the resolution of the third sub-region is R1 - 3r.
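- A corresponding sketch for this fixed-decrement variant, again with illustrative numbers:

```python
# Sketch: scenario 2 variant where each ring's resolution drops by a fixed
# decrement r relative to the previous one. Encodes the formula above.
import math

def first_resolution_decrement(r0, s, d1, d2, d3, r):
    return (r0 * s + 3 * r * s
            - math.pi * d1 ** 2 * r
            - math.pi * d2 ** 2 * r
            - math.pi * d3 ** 2 * r) / s

r1 = first_resolution_decrement(r0=32.0, s=2_000_000.0,
                                d1=200.0, d2=400.0, d3=600.0, r=4.0)
print(r1, r1 - 4.0, r1 - 8.0, r1 - 12.0)   # R1 and the three sub-region resolutions
```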
- As shown in FIG. 7b, in yet another distribution of the first region and the second region, the first region is a rectangle of area S1 whose resolution is the first resolution R1; the second region is the gray area outside the first region and has three sub-regions. The first sub-region, closest to the first region, is a rectangle of area S2; the second sub-region, adjacent to the first sub-region, is also rectangular, with area S3; the third sub-region lies outside the second sub-region. The area of the target video image is S, and the resolution corresponding to the transmission bandwidth is R0.
- Suppose the resolution of the first sub-region is 1/2 of the first resolution, the resolution of the second sub-region is 1/3 of the first resolution, and the resolution of the third sub-region is 1/4 of the first resolution. The principle that the total number of pixels does not change again means that the total number of pixels of the first region plus that of the second region equals the total number of pixels of the entire video image, and that the sum of the total number of pixels of the sub-regions equals the total number of pixels of the second region; specifically:
- S1·R1 is the total number of pixels in the first region;
- S2·(1/2)R1 is the total number of pixels in the first sub-region;
- S3·(1/3)R1 is the total number of pixels in the second sub-region;
- (S - S3 - S2 - S1)·(1/4)R1 is the total number of pixels in the third sub-region;
- R0·S is the total number of pixels of the entire target video image, so that:
- R1 = (R0·S) / ((3/4)·S1 + (1/4)·S2 + (1/12)·S3 + (1/4)·S)
- Thus the first resolution R1 can be determined from R0, S1, S2, S3, S4, and S, and further the resolution (1/2)R1 of the first sub-region, the resolution (1/3)R1 of the second sub-region, and the resolution (1/4)R1 of the third sub-region are determined.
- S1, S2, S3, and S4 may be set as defaults, or may be set by a user, and will not be discussed here.
- The second region may also include more sub-regions, for example 4 or 5 sub-regions; the shape of each sub-region is not limited to the circular and rectangular shapes above, and the resolution of each sub-region may vary in other ways.
- To determine the resolution of a target image position, the distance between the target image position in the second region and the center of the hotspot region is first obtained, the sub-region in which the target image position is located is determined according to that distance, and the resolution of the target image position is then determined according to the correspondence between resolutions and sub-regions; for example, if the target image position is determined to lie in the first sub-region, the resolution of the first sub-region is the resolution of that target image position.
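- A minimal sketch of this per-position lookup for the concentric-ring layout of FIG. 7a (radii and resolutions are illustrative):

```python
# Sketch: map a target image position to its sub-region and resolution,
# for the concentric-ring layout of FIG. 7a. Radii/resolutions are illustrative.
import math

def resolution_at(pos, hotspot_center, r1, d1, d2, d3):
    dist = math.hypot(pos[0] - hotspot_center[0], pos[1] - hotspot_center[1])
    if dist <= d1:
        return r1            # first region
    if dist <= d2:
        return r1 / 2        # first sub-region
    if dist <= d3:
        return r1 / 4        # second sub-region
    return r1 / 8            # third sub-region

print(resolution_at((950, 540), hotspot_center=(640, 360), r1=64.0,
                    d1=200.0, d2=400.0, d3=600.0))
```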
- Implementation scenario 3: the resolution of the second region changes continuously in one portion of the target video image and discretely in another portion. In the continuously varying portion, the resolution of each target image position may be determined by referring to the solution in implementation scenario 1; in the discretely varying portion, the solution in implementation scenario 2 determines the resolution of each sub-region and thus the resolution of each target image position, that is, whichever sub-region a target image position lies in, the resolution of that sub-region is the resolution of the target image position.
- As shown in FIG. 8a, in another distribution of the first region and the second region, the first region is a circle of radius d1 whose resolution is the first resolution R1; the second region is the gray area outside the first region and comprises a third region and a fourth region. The third region is a ring with inner radius d1 and outer radius d2, and its resolution changes continuously. The fourth region lies outside the third region and includes two sub-regions, region A and region B: region A is adjacent to the third region and is a ring with inner radius d2 and outer radius d3, and region B lies outside region A. Suppose the resolution of region A is 1/2 of the minimum resolution of the third region and the resolution of region B is 1/3 of the minimum resolution of the third region. The area of the target video image is S, and the resolution corresponding to the transmission bandwidth is R0.
- The principle that the total number of pixels does not change means that the total number of pixels of the first region, the third region, and the fourth region together equals the total number of pixels of the entire video image, and that the total number of pixels of region A plus that of region B equals the total number of pixels of the fourth region; specifically:
- π·d1²·R1 is the total number of pixels in the first region;
- (π·d3² - π·d2²)·(1/2)·(R1 - d2 + d1) is the total number of pixels in region A;
- (S - π·d3²)·(1/3)·(R1 - d2 + d1) is the total number of pixels in region B;
- R0·S is the total number of pixels of the entire target video image, so that:
- R1 = (R0·S + (1/6)·π·d2³ + (1/3)·π·d1³ - (1/2)·π·d2²·d1 - ((1/6)·π·d3² + (1/3)·S)·(d1 - d2)) / ((1/2)·π·d2² + (1/6)·π·d3² + (1/3)·S - π·d1²)
- the first resolution R1 can be determined according to R0, d1, d2, d3, and S, thereby determining the resolution function of the third region, the resolution of the A region, and the resolution of the B region.
- As shown in FIG. 8b, in yet another distribution of the first region and the second region, the first region is a circle of radius d1 whose resolution is the first resolution R1; the second region is the gray area outside the first region and comprises a fifth region and a sixth region. The fifth region includes two sub-regions, region A and region B: region A is a ring with inner radius d1 and outer radius d2, and region B is a ring with inner radius d2 and outer radius d3. The resolution of the fifth region varies discretely, the resolution of region A being 1/2 of the minimum resolution of the fifth region and the resolution of region B being 1/3 of the minimum resolution of the fifth region; the sixth region lies outside the fifth region.
- The resolution function of the second region may also be other functions, the shape of the first region may be other shapes, and the second region may be divided in other ways.
- In summary, the resolution corresponding to the transmission bandwidth can be determined, and the first resolution and the correspondence between the gradient resolution and distance can be obtained according to the principle that the total number of pixels is constant; then, for each target image position in the second region, the distance between the target image position and the center of the hotspot region is obtained, the target resolution at that distance is determined, and the image content at the target image position is processed at that target resolution.
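- One possible way to realize the per-position processing is blockwise down/upsampling driven by the resolution at each block's distance from the hotspot center; the sketch below assumes OpenCV, and the block size, radii, and simple piecewise resolution model are illustrative stand-ins for the schemes above:

```python
# Sketch: apply a distance-dependent resolution to a frame by blockwise
# down/upsampling. OpenCV is assumed; block size, radii and the simple
# piecewise resolution model are illustrative stand-ins for the schemes above.
import cv2
import numpy as np

def resolution_fraction(dist, d1, d2, d3):
    """Fraction of the first resolution kept at a given distance from the hotspot."""
    if dist <= d1:
        return 1.0        # first region: full first resolution
    if dist <= d2:
        return 0.5        # first sub-region
    if dist <= d3:
        return 0.25       # second sub-region
    return 0.125          # third sub-region

def process_frame(frame, hotspot_center, d1, d2, d3, block=32):
    out = frame.copy()
    h, w = frame.shape[:2]
    for y in range(0, h, block):
        for x in range(0, w, block):
            cy, cx = y + block / 2, x + block / 2
            dist = np.hypot(cx - hotspot_center[0], cy - hotspot_center[1])
            f = resolution_fraction(dist, d1, d2, d3)
            if f >= 1.0:
                continue                     # leave the first region untouched
            patch = frame[y:y + block, x:x + block]
            new_w = max(int(round(np.sqrt(f) * patch.shape[1])), 1)
            new_h = max(int(round(np.sqrt(f) * patch.shape[0])), 1)
            small = cv2.resize(patch, (new_w, new_h), interpolation=cv2.INTER_AREA)
            out[y:y + block, x:x + block] = cv2.resize(
                small, (patch.shape[1], patch.shape[0]),
                interpolation=cv2.INTER_LINEAR)
    return out
```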
- Step S203 Send the processed target video image to the playing terminal.
- The processed target video image is visually different from the original target video image: in the original target video image, the sharpness of the picture is the same in every region, whereas in the processed target video image the sharpness of the first region is higher than that of the second region, and within the second region the picture farther from the first region has lower resolution. Because the resolution varies across the processed target video image, its visual appearance also varies, with the picture closer to the center of the hotspot region being clearer.
- the following is an example of the target video image before processing and the processed target video image.
- FIG. 9a is a target video image before processing, and the sharpness of the image screen at each position of the target video image before processing is the same.
- FIG. 9b is the processed target video image, assuming the adopted resolution change rule corresponds to implementation scenario 2 of step S202 above. The processed target video image has four levels: the innermost layer, the picture of the first region, is the clearest; the second layer, that is, the picture of the first sub-region, has lower sharpness than the innermost layer; the third layer, that is, the picture of the second sub-region, has lower sharpness than the second layer; and the outermost layer, that is, the third sub-region, has lower sharpness than the third layer.
- The above processing may be performed on each frame of the target video, so that the picture the user pays attention to or is interested in is sufficiently clear while the resolution requirements imposed by the transmission bandwidth are satisfied.
- In the embodiments of the present invention, the target video image is divided into the first region and the second region according to the hotspot region of the target video image, so that the hotspot region lies within the first region; the first region is then processed at the first resolution and the second region at a gradient resolution that decreases with distance from the hotspot region, and the processed target video image is sent to the user's playback terminal. By processing the video image in this way, the resolution of the hotspot region is higher while the resolution of the other regions is lower; at the same time, the data volume of the processed video image is smaller than that of the original video image, so the useful video data, that is, the video data the user wants to watch, can be transmitted to the user's playback terminal, ensuring the user's viewing experience while saving bandwidth cost.
- FIG. 10 is a schematic structural diagram of an apparatus for image processing according to an embodiment of the present invention.
- the apparatus includes at least a first area processing module 310, a second area processing module 320, and a sending module 330.
- the detailed description is as follows:
- a first area processing module 310 configured to process, in a first resolution, a first area of the target video image, where the first area includes a hotspot area of the target video image;
- a second area processing module 320 configured to process the second area of the target video image at a gradient resolution smaller than the first resolution, wherein, within the second area, the closer an image position is to the hotspot area, the higher its corresponding resolution; the second area is located outside the first area, and the second area and the first area together constitute the target video image;
- the sending module 330 is configured to send the processed target video image to the playing terminal.
- the device further includes:
- a hotspot area determining module 340 configured to determine a hotspot area of the target video image
- the area dividing module 350 is configured to determine the first area and the second area of the target video image according to the hot spot area.
- the area dividing module 350 includes:
- a central determination sub-module 351, configured to determine a center of the hotspot area
- the first area determining sub-module 352 is configured to determine an area having a center of the center and a radius of the first preset distance as the first area of the target video image.
- the area dividing module 350 is specifically configured to:
- a minimum regular graphics area containing the hotspot area is determined as the first area of the target video image.
- the device further includes:
- a transmission bandwidth determining module 360 configured to determine a transmission bandwidth with the playing terminal
- the resolution determining module 370 is configured to determine the resolution of the first resolution and the gradation according to an area size of the first area, an area size of the second area, and the transmission bandwidth.
- the resolution determining module 370 includes:
- a first resolution determining submodule 370 configured to determine the first resolution according to an area size of the first area, an area size of the second area, and the transmission bandwidth;
- the gradient resolution determining sub-module 371 is configured to determine a resolution of the target image position in the second region according to a distance between a target image position in the second region and a center of the hot spot region.
- the gradient resolution determining submodule 371 includes:
- a first distance obtaining unit 3711 configured to acquire a distance between a target image location in the second region and a center of the hotspot region
- a first resolution determining unit 3712 configured to determine the resolution of a target image position in the second region according to a gradient function, wherein the gradient function is a function representing the correspondence between the resolution and the distance.
- the second area includes a third area and a fourth area, wherein the shortest distance between the third area and the center of the hotspot area is smaller than the shortest distance between the fourth area and the center of the hotspot area; the gradient resolution determining sub-module 371 includes:
- a second distance acquiring unit 3713 configured to acquire a distance between a target image location in the third region and a center of the hotspot region
- a second resolution determining unit 3714 configured to determine the resolution of a target image position in the third region according to a gradient function, wherein the gradient function is a function representing the correspondence between the resolution and the distance;
- a third resolution determining unit 3715 configured to determine the second resolution as the resolution of target image positions in the fourth region, wherein the second resolution is less than or equal to the minimum resolution of the target image positions in the third region, the second resolution being a fixed resolution.
- the second resolution is 1/N of the first resolution, where N is an integer greater than 1; or the second resolution is 1/P of the minimum resolution of the target image positions in the third region, where P is an integer greater than or equal to 1.
- the gradation function comprises: a parabolic function, an elliptic function, and a declining function.
- the second area includes K sub-areas, the gradient resolution is a resolution that varies discretely, and the resolution corresponding to each sub-area is a fixed resolution, where the shortest distance between the (K-1)th sub-region and the center of the hotspot region is smaller than the shortest distance between the Kth sub-region and the center of the hotspot region, and K is a positive integer greater than or equal to 2; the gradient resolution determining sub-module 371 includes:
- a third distance obtaining unit 3716 configured to acquire the distance between a target image position in the second region and the center of the hotspot region;
- a sub-area determining unit 3717 configured to determine, according to the distance, a sub-area where the target image position is located;
- the fourth resolution determining unit 3718 is configured to determine a resolution of the target image position according to a correspondence between the resolution and the sub-region.
- the correspondence between the resolution and the sub-regions includes: the resolution corresponding to the first sub-region of the target video image is 1/L of the first resolution, or the difference between the first resolution and the first preset value, where L is an integer greater than or equal to 1.
- the correspondence between the resolution and the sub-regions further includes: the resolution corresponding to the K-th sub-region of the target video image is 1/M of the resolution corresponding to the (K-1)-th sub-region, where M is an integer greater than 1; or the resolution corresponding to the K-th sub-region of the target video image is the difference between the resolution corresponding to the (K-1)-th sub-region of the target video image and a second preset value.
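- a hedged sketch of this discrete correspondence, assuming the K sub-regions are concentric rings of equal width around the hotspot center (the embodiment does not fix their shape); only the 1/L rule for the first sub-region and the 1/M rule for each following sub-region are taken from the text above:

```python
def subregion_resolutions(first_resolution, k, L=1, M=2):
    """Resolutions for K sub-regions: the first sub-region gets
    first_resolution / L, and each following sub-region gets 1/M of the
    previous one (the 'divide' variant of the rule above)."""
    resolutions = [first_resolution / L]
    for _ in range(1, k):
        resolutions.append(resolutions[-1] / M)
    return resolutions

def resolution_for_position(distance, ring_width, resolutions):
    """Fixed resolution of the sub-region containing a target position,
    assuming the sub-regions are concentric rings of equal width."""
    index = min(int(distance // ring_width), len(resolutions) - 1)
    return resolutions[index]
```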
- for the implementation of each module, reference may also be made to the corresponding description of the method embodiment shown in FIG. 3.
- the image processing server may divide the target video image into the first area and the second area according to the hotspot area of the target video image, such that the hotspot area lies within the first area; it then processes the first area at the first resolution, processes the second area with a gradually changing resolution, and finally sends the processed target video image to the user's playing terminal, wherein the first resolution is greater than every resolution used in the second area and the resolution of the second area decreases with increasing distance from the hotspot area. Under the same bandwidth, this processing gives the hotspot area a higher resolution and the other areas a lower resolution, so the data amount of the processed video image is smaller than that of the original video image while the useful video data, that is, the video data the user wants to watch, is still transmitted to the user's playing terminal, thereby ensuring the user's viewing experience and saving bandwidth cost.
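- the end-to-end flow can be illustrated with the following minimal sketch, which keeps the first region at full resolution and downsamples everything else by one factor standing in for the gradation; the circular first region, the single downsampling factor, and the function name are assumptions made for the example, not the claimed method:

```python
import numpy as np

def process_target_image(frame, hot_center, first_radius, down_factor=4):
    """Keep the first (hotspot) region at full resolution and downsample the
    rest of the frame by a single factor; a real gradation would use several
    factors that grow with the distance to the hotspot center.
    Assumes hot_center = (x, y) lies inside the frame."""
    h, w = frame.shape[:2]
    ys, xs = np.ogrid[:h, :w]
    dist = np.hypot(ys - hot_center[1], xs - hot_center[0])
    first_mask = dist <= first_radius                     # first region contains the hotspot

    # Tight bounding box of the first region, kept at the original resolution.
    rows, cols = np.where(first_mask)
    y0, y1 = rows.min(), rows.max() + 1
    x0, x1 = cols.min(), cols.max() + 1
    first_patch = frame[y0:y1, x0:x1].copy()

    # Coarse version of the whole frame stands in for the lower-resolution second region.
    coarse_frame = frame[::down_factor, ::down_factor].copy()
    return first_patch, (y0, x0), coarse_frame
```

- a playing terminal could then upsample the coarse frame and paste the full-resolution patch back at the returned offset to reconstruct a displayable image.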
- FIG. 11 is a schematic structural diagram of another apparatus for image processing according to an embodiment of the present invention.
- the apparatus includes a processor 41, a memory 42, and a communication interface 43.
- the processor 41 is connected to the memory 42 and the communication interface 43, for example, the processor 41 can be connected to the memory 42 and the communication interface 43 via a bus.
- the processor 41 is configured to support the corresponding functions of the image processing method performed by the image processing apparatus described in FIG.
- the processor 41 can be a central processing unit (CPU), a network processor (NP), a hardware chip, or any combination thereof.
- the hardware chip may be an application-specific integrated circuit (ASIC), a programmable logic device (PLD), or a combination thereof.
- the above PLD may be a complex programmable logic device (CPLD), a field-programmable gate array (FPGA), a generic array logic (GAL), or any combination thereof.
- the memory 42 is used to store program codes and the like.
- the memory 42 may include a volatile memory, such as a random access memory (RAM); the memory 42 may also include a non-volatile memory, such as a read-only memory (ROM), a flash memory, a hard disk drive (HDD), or a solid-state drive (SSD).
- the memory 42 may also include a combination of the above types of memories.
- Communication interface 43 is used to receive and transmit data.
- the processor 41 can invoke the program code to perform the following operations:
- a first area of the target video image is processed at a first resolution, the first area containing the hotspot area of the target video image;
- a second area of the target video image is processed with a gradation resolution, each gradation resolution being smaller than the first resolution, wherein the second area lies outside the first area, and the first area and the second area together constitute the target video image;
- the processed target video image is transmitted to the playback terminal through the communication interface 43.
- before processing the first area of the target video image at the first resolution, the processor 41 is further configured to determine the first area of the target video image according to the hotspot area of the target video image.
- the determining, by the processor 41, of the first area of the target video image according to the hotspot area specifically includes:
- an area whose center is the center of the hotspot area and whose radius is a first predetermined distance is determined as the first area of the target video image.
- alternatively, the determining, by the processor 41, of the first area of the target video image according to the hotspot area specifically includes:
- a minimum regular graphic area containing the hotspot area is determined as the first area of the target video image.
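- the two determination variants above can be sketched as follows; treating the "minimum regular graphic area" as an axis-aligned bounding rectangle is an assumption, since the embodiment does not fix which regular shape is used:

```python
from dataclasses import dataclass

@dataclass
class Circle:
    cx: float
    cy: float
    radius: float

@dataclass
class Rect:
    x0: int
    y0: int
    x1: int
    y1: int

def first_area_as_circle(hot_center, first_predetermined_distance):
    """Variant (a): the disc centered on the hotspot center whose radius is
    the first predetermined distance."""
    cx, cy = hot_center
    return Circle(cx, cy, first_predetermined_distance)

def first_area_as_bounding_rect(hotspot_pixels):
    """Variant (b): a minimum regular area containing the hotspot area,
    taken here to be the axis-aligned bounding rectangle of its pixels."""
    xs = [x for x, _ in hotspot_pixels]
    ys = [y for _, y in hotspot_pixels]
    return Rect(min(xs), min(ys), max(xs) + 1, max(ys) + 1)
```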
- before the processor 41 processes the first area of the target video image at the first resolution, the processor 41 is further configured to: determine the first resolution and the gradation resolution according to the area size of the first area, the area size of the second area, and the transmission bandwidth.
- the determining, by the processor 41, of the first resolution and the gradation resolution according to the area size of the first area, the area size of the second area, and the transmission bandwidth specifically includes:
- the resolution of a target image position in the second region is determined according to the distance between the target image position and the center of the hotspot region.
- the determining, by the processor 41, of the resolution of the target image location in the second region according to the distance between the target image location in the second region and the center of the hotspot region specifically includes:
- a resolution of a target image position in the second region is determined according to a gradation function, wherein the gradation function is a function that characterizes a correspondence between the resolution and the distance.
- the second area includes a third area and a fourth area, wherein the shortest distance between the third area and the center of the hotspot area is smaller than the shortest distance between the fourth area and the center of the hotspot area; the determining, by the processor 41, of the resolution of the target image position in the second region according to the distance between the target image position and the center of the hotspot region specifically includes:
- the resolution of a target image position in the third region is determined according to a gradation function, wherein the gradation function is a function characterizing a correspondence between the resolution and the distance;
- a second resolution is determined as the resolution of a target image position in the fourth region, wherein the second resolution is less than or equal to the minimum resolution of a target image position in the third region, the second resolution being a fixed resolution.
- the second resolution is 1/N of the first resolution, where N is an integer greater than 1; or the second resolution is 1/P of the minimum resolution of a target image location in the third region, where P is an integer greater than or equal to 1.
- the gradation function comprises: a parabolic function, an elliptic function, and a decreasing function.
- the second area includes K sub-areas, the gradation resolution varies discretely, and the resolution corresponding to each sub-area is a fixed resolution, where the shortest distance between the (K-1)-th sub-area and the center of the hotspot area is smaller than the shortest distance between the K-th sub-area and the center of the hotspot area, and K is a positive integer greater than or equal to 2;
- the determining, by the processor 41, of the resolution of the target image location in the second region according to the distance between the target image location in the second area and the center of the hotspot region specifically includes:
- a resolution of the target image position is determined according to a correspondence between the resolution and the sub-region.
- the correspondence between the resolution and the sub-regions includes: the resolution corresponding to the first sub-region of the target video image is 1/L of the first resolution, or is the difference between the first resolution and a first preset value, where L is an integer greater than one.
- the correspondence between the resolution and the sub-regions further includes: the resolution corresponding to the K-th sub-region of the target video image is 1/M of the resolution corresponding to the (K-1)-th sub-region, where M is an integer greater than 1; or the resolution corresponding to the K-th sub-region of the target video image is the difference between the resolution corresponding to the (K-1)-th sub-region of the target video image and a second preset value.
- the processor 41 is further configured to perform the step of processing the first region of the target video image at the first resolution.
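- as a hedged illustration of determining the first resolution and the gradation resolution from the area sizes and the transmission bandwidth (the linear bit model, the gradation_ratio parameter, and the function name are assumptions for illustration only, not part of the embodiment):

```python
def choose_first_resolution(bandwidth_bps, fps, bits_per_pixel,
                            first_area_px, second_area_px, gradation_ratio=4.0):
    """Pick a pixel-density scale for the first region so that one processed
    frame fits the per-frame bit budget.  Rough linear model: the second
    region is assumed to carry, on average, 1/gradation_ratio of the first
    region's pixel density."""
    budget_bits = bandwidth_bps / fps                      # bits available per frame
    effective_px = first_area_px + second_area_px / gradation_ratio
    scale = budget_bits / (bits_per_pixel * effective_px)  # density that exactly fills the budget
    first_scale = min(1.0, scale)                          # never exceed the source resolution
    second_scale = first_scale / gradation_ratio           # average density outside the hotspot
    return first_scale, second_scale
```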
- for each operation, reference may also be made to the corresponding description of the method embodiment shown in FIG. 3.
- the embodiment of the present invention further provides a computer storage medium storing a computer program, the computer program comprising program instructions that, when executed by the image processing apparatus, cause the apparatus to perform the method described in the foregoing embodiments.
- the embodiment of the present invention further provides a computer program, including program instructions, which are used to execute the method as described in the foregoing embodiments when executed by an image processing apparatus.
- the storage medium may be a magnetic disk, an optical disk, a read-only memory (ROM), or a random access memory (RAM).
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Databases & Information Systems (AREA)
- Human Computer Interaction (AREA)
- Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
Abstract
According to an embodiment, the present invention relates to an image processing method and device. The image processing method consists of: processing a first region of a target video image at a first resolution, the first region comprising a hotspot region of the target video image; processing a second region of the target video image with gradually changing resolutions, the gradually changing resolutions being lower than the first resolution, and the closer an image portion in the second region is to the hotspot region, the higher the corresponding resolution of that image portion, the second region being located outside the first region, and the second region and the first region together constituting the target video image; and transmitting the processed target video image to a playing terminal. The embodiment of the present invention enables more useful video data to be transmitted to a user's playing terminal.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201780062040.0A CN109804409A (zh) | 2016-12-26 | 2017-04-14 | 图像处理的方法和装置 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201611220993 | 2016-12-26 | ||
CN201611220993.9 | 2016-12-26 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2018120519A1 true WO2018120519A1 (fr) | 2018-07-05 |
Family
ID=62706644
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2017/080645 WO2018120519A1 (fr) | 2016-12-26 | 2017-04-14 | Procédé et dispositif de traitement d'image |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN109804409A (fr) |
WO (1) | WO2018120519A1 (fr) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111951343A (zh) * | 2019-05-16 | 2020-11-17 | 阿里巴巴集团控股有限公司 | 图像生成方法和装置、图像展示方法和装置 |
CN113408440A (zh) * | 2021-06-24 | 2021-09-17 | 展讯通信(上海)有限公司 | 一种视频数据卡顿检测方法、装置、设备及存储介质 |
US11477426B2 (en) * | 2020-10-26 | 2022-10-18 | Avaya Management L.P. | Selective image broadcasting in a video conference |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111667411A (zh) * | 2020-06-12 | 2020-09-15 | 深圳天度物联信息技术有限公司 | 一种图像传输方法、装置、电子设备及存储介质 |
CN112911191B (zh) * | 2021-01-28 | 2023-03-24 | 联想(北京)有限公司 | 一种视频通话质量调整方法、装置、电子设备和存储介质 |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101655984A (zh) * | 2008-05-12 | 2010-02-24 | 美国西门子医疗解决公司 | 医学图像数据的自适应处理系统 |
US20100128983A1 (en) * | 2008-11-25 | 2010-05-27 | Canon Kabushiki Kaisha | Imaging system and imaging method |
CN102915521A (zh) * | 2012-08-30 | 2013-02-06 | 中兴通讯股份有限公司 | 一种移动终端图像处理方法及装置 |
US20130219012A1 (en) * | 2012-02-22 | 2013-08-22 | Citrix Systems, Inc. | Hierarchical Display |
US20140136686A1 (en) * | 2012-11-09 | 2014-05-15 | Institute For Information Industry | Dynamic resolution regulating system and dynamic resolution regulating method |
CN105635624A (zh) * | 2014-10-27 | 2016-06-01 | 华为技术有限公司 | 视频图像的处理方法、设备及系统 |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6829391B2 (en) * | 2000-09-08 | 2004-12-07 | Siemens Corporate Research, Inc. | Adaptive resolution system and method for providing efficient low bit rate transmission of image data for distributed applications |
CN104159129B (zh) * | 2014-08-08 | 2015-06-17 | 北京大学 | 一种遥感数据有限带宽下剖分分块渐进传输方法 |
- 2017
- 2017-04-14 CN CN201780062040.0A patent/CN109804409A/zh active Pending
- 2017-04-14 WO PCT/CN2017/080645 patent/WO2018120519A1/fr active Application Filing
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101655984A (zh) * | 2008-05-12 | 2010-02-24 | 美国西门子医疗解决公司 | 医学图像数据的自适应处理系统 |
US20100128983A1 (en) * | 2008-11-25 | 2010-05-27 | Canon Kabushiki Kaisha | Imaging system and imaging method |
US20130219012A1 (en) * | 2012-02-22 | 2013-08-22 | Citrix Systems, Inc. | Hierarchical Display |
CN102915521A (zh) * | 2012-08-30 | 2013-02-06 | 中兴通讯股份有限公司 | 一种移动终端图像处理方法及装置 |
US20140136686A1 (en) * | 2012-11-09 | 2014-05-15 | Institute For Information Industry | Dynamic resolution regulating system and dynamic resolution regulating method |
CN105635624A (zh) * | 2014-10-27 | 2016-06-01 | 华为技术有限公司 | 视频图像的处理方法、设备及系统 |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111951343A (zh) * | 2019-05-16 | 2020-11-17 | 阿里巴巴集团控股有限公司 | 图像生成方法和装置、图像展示方法和装置 |
WO2020228676A1 (fr) * | 2019-05-16 | 2020-11-19 | 阿里巴巴集团控股有限公司 | Procédé et dispositif de génération d'image, et procédé et dispositif d'affichage d'image |
US11477426B2 (en) * | 2020-10-26 | 2022-10-18 | Avaya Management L.P. | Selective image broadcasting in a video conference |
CN113408440A (zh) * | 2021-06-24 | 2021-09-17 | 展讯通信(上海)有限公司 | 一种视频数据卡顿检测方法、装置、设备及存储介质 |
Also Published As
Publication number | Publication date |
---|---|
CN109804409A (zh) | 2019-05-24 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20220248088A1 (en) | Digital media system | |
US11601699B2 (en) | Predictive content delivery for video streaming services | |
US11418755B2 (en) | Adaptive resolution in software applications based on dynamic eye tracking | |
WO2018120519A1 (fr) | Procédé et dispositif de traitement d'image | |
TWI619088B (zh) | 圖像資料處理系統和相關方法以及相關圖像融合方法 | |
US8933927B2 (en) | Display system with image conversion mechanism and method of operation thereof | |
US10529122B2 (en) | Information processing apparatus, information processing method, and storage medium | |
CN109302619A (zh) | 一种信息处理方法及装置 | |
AU2017317839B2 (en) | Panoramic video compression method and device | |
WO2019128667A1 (fr) | Terminal et procédé de lecture de vidéo, serveur et support d'informations | |
JP6224516B2 (ja) | エンコード方法およびエンコードプログラム | |
KR102389335B1 (ko) | 복수의 방송 채널의 영상을 표시하는 장치 및 방법 | |
US9946957B2 (en) | Method, apparatus, computer program and system for image analysis | |
CN111629146B (zh) | 拍摄参数的调整方法、调整装置、调整设备及存储介质 | |
CN104469398B (zh) | 一种网络视频画面处理方法及装置 | |
Seo et al. | Real-time panoramic video streaming system with overlaid interface concept for social media | |
CN111885417B (zh) | Vr视频播放方法、装置、设备以及存储介质 | |
US11134236B2 (en) | Image processing device and system | |
CN116506585A (zh) | 一种全景视频中的多目标追踪及其显示方法 | |
CN116703788A (zh) | 全景图处理方法、装置、电子设备及可读存储介质 | |
CN105872591A (zh) | 图标覆盖方法及装置 | |
CN115760584A (zh) | 一种图像处理方法及相关设备 | |
CN117392244A (zh) | 图像处理方法、装置、电子设备和存储介质 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 17885807 Country of ref document: EP Kind code of ref document: A1 |
NENP | Non-entry into the national phase |
Ref country code: DE |
122 | Ep: pct application non-entry in european phase |
Ref document number: 17885807 Country of ref document: EP Kind code of ref document: A1 |