
WO2018176734A1 - Data processing method and terminal - Google Patents


Info

Publication number
WO2018176734A1
WO2018176734A1 (PCT/CN2017/100080, CN2017100080W)
Authority
WO
WIPO (PCT)
Prior art keywords
format
video
new
surface view
frame data
Prior art date
Application number
PCT/CN2017/100080
Other languages
English (en)
Chinese (zh)
Inventor
仇建斌
Original Assignee
华为技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority to CN201780033551.XA (CN109196865B)
Publication of WO2018176734A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/40Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using video transcoding, i.e. partial or full decoding of a coded input stream followed by re-encoding of the decoded output stream

Definitions

  • the present application relates to the field of terminal technologies, and in particular, to a data processing method and a terminal.
  • the system decoder of the Android system can decode video source data in common formats; the decoded video frame data is then sent to an image processing unit such as an MDP for scaling, and then sent to the display unit for display.
  • some video playback applications may encode video source data in formats other than the common video formats.
  • in that case, the video playing application decodes through a decoder other than the system decoder, and the format of the decoded video frame data usually does not match the mobile phone hardware, so the data cannot be directly sent to the image processing unit and the display unit of the mobile phone for processing.
  • in the prior art, a graphics processing unit (GPU) converts the format of the video frame data into a target format that can be directly used for display, scales the video layer picture represented by the video frame data to a size matching the display unit, and then sends it to the image processing unit and the display unit for processing.
  • converting the format of the video frame data into a directly displayable target format with the GPU, and scaling the video layer picture to a size matching the display unit, consumes more power of the mobile phone and reduces the user experience.
  • the embodiment of the present invention provides a data processing method and a terminal, which can reduce power consumption during video playback and improve user experience.
  • the embodiment of the present application provides a data processing method applied to a terminal, where the terminal includes a central processing unit (CPU), an image processing unit, and a display unit; the method includes: when the first format corresponding to the decoded video frame data is not a preset type format, the CPU converts the first format into a second format.
  • the video frame data is used to represent the video layer picture
  • the second format is a preset type format
  • the preset type format is a format that the image processing unit can recognize.
  • the image processing unit converts the second format into a target format, which is a data frame format used by the display unit for display.
  • the image processing unit scales the video layer picture represented by the video frame data of the target format so that the size of the scaled video layer picture matches the size of the display unit.
  • the method further includes: the display unit displaying the scaled video layer picture.
  • the display unit can display the video layer screen after the image processing unit is scaled.
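The two-stage pipeline described above can be sketched as plain functions. This is only an illustrative sketch, not the patented implementation; the function names, the dictionary frame representation, and the format strings are all assumptions made here for clarity.

```python
def cpu_convert(frame, preset_formats):
    """CPU step: map a decoder-specific first format into a preset type format."""
    if frame["format"] in preset_formats:
        return frame  # already recognizable by the image processing unit
    # Assumed: the first format is a simple variant of a preset format,
    # so conversion is cheap compared with a full GPU transcode.
    return {**frame, "format": "YUV420"}  # second format (preset type)

def ipu_convert_and_scale(frame, target_format, screen_size):
    """Image processing unit step: convert to the display's target format
    and scale the video layer picture to the screen size."""
    return {**frame, "format": target_format, "size": screen_size}

frame = {"format": "vendor-yuv", "size": (1280, 720), "data": b"..."}
frame = cpu_convert(frame, {"YUV420", "RGB888"})
frame = ipu_convert_and_scale(frame, "RGB888", (1080, 2340))
print(frame["format"], frame["size"])
```

The point of the split is that each stage only does the cheap part it is suited to: the CPU does a near-identity format fix-up, and the small image processing unit does the target conversion and scaling instead of the power-hungry GPU.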
  • the first format is corresponding to the first video playing application
  • the video frame data of the first format is data obtained by decoding video source data with a decoder corresponding to the first video playing application.
  • the first format processed by the CPU is the format of the video frame data obtained by decoding with the decoder corresponding to the first video playing application.
  • the CPU converting the first format into the second format includes: the CPU converting the first format into the second format according to the first video playing application corresponding to the first format.
  • the CPU may convert the first format into the second format according to the first video playing application.
  • the terminal saves a correspondence between the first video playing application and a third format, where the third format is a preset type format, and the second format is the third format corresponding to the first video playing application.
  • the CPU may convert the first format corresponding to the first video playing application into the third format according to the pre-saved correspondence between the first video playing application and the third format.
  • the video layer is displayed through a surface view (SurfaceView)
  • the method further includes: before converting the first format into the second format, the CPU creates a new video layer above the corresponding position of the original video layer corresponding to the video frame data; the new video layer is used to cover the original video layer, and the new video layer corresponds to a new surface view (new SurfaceView).
  • the CPU acquires video frame data through an Open Graphics Library (OpenGL) interface, and fills the new SurfaceView with the video frame data acquired through the OpenGL interface.
  • the original surface view is not easy for the terminal side to modify and control, so the terminal can replace the original surface view and the original video layer by creating a new SurfaceView and a new video layer, and then operate on the new SurfaceView and the new video layer.
  • the replacement is not perceived by the user, and the resolution, frame rate, playback speed, and displayed content are not affected, so the user experience is not reduced.
  • the CPU converting the first format into the second format includes: the CPU converting the first format corresponding to the video frame data in the new SurfaceView into the second format.
  • the CPU can convert the first format corresponding to the video frame data in the new SurfaceView corresponding to the new video layer into the second format.
  • the CPU converting the first format corresponding to the video frame data in the new SurfaceView into the second format includes: the CPU converting the first format into the second format through the native surface module (nativeSurface) of the new SurfaceView.
  • the CPU specifically converts the first format into the second format by using this underlying module.
  • the image processing unit converting the second format into the target format includes: the image processing unit converting the second format corresponding to the video frame data in the new SurfaceView into the target format.
  • the image processing unit scaling the video layer picture represented by the video frame data of the target format includes: the image processing unit scaling the new video layer picture represented by the video frame data in the new SurfaceView that has been converted into the target format.
  • the image processing unit may specifically convert, in the new SurfaceView, the second format corresponding to the video frame data into the target format, and scale the new video layer picture represented by the video frame data that has been converted into the target format.
  • the original video layer corresponds to an original surface view (original SurfaceView)
  • the method further includes: the display unit stopping display of the original video layer picture corresponding to the original SurfaceView.
  • the CPU inserts the new SurfaceView into the view hierarchy (ViewHierarchy) in which the original SurfaceView is located.
  • the display unit displaying the scaled video layer picture includes: the display unit displaying the scaled target video layer picture, where the target video layer picture is the new video layer picture represented by the video frame data in the new SurfaceView that has been converted into the target format.
  • the terminal can then stop displaying the original video layer picture and instead display the new video layer picture.
  • the second format is a YUV type format.
  • the CPU can convert the first format to the second format of the YUV type.
  • the CPU creating a new video layer above the corresponding position of the original video layer corresponding to the video frame data includes: the CPU creating the new video layer above the corresponding position of the original video layer by setting the Z-order of the new video layer.
  • the CPU can set the stacking order of the new video layer and the original video layer through the Z-order.
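The Z-order idea can be shown with a minimal sketch: layers are composited in ascending Z-order, so giving the new layer a larger Z value than the original layer makes it cover the original at the same position. The layer names and dictionary representation below are illustrative, not part of the patent.

```python
# Layers with higher "z" are composited on top of layers with lower "z".
layers = [
    {"name": "original video layer", "z": 0},
    {"name": "new video layer", "z": 1},  # covers the original layer
]

def top_layer(layers):
    """Return the name of the layer the user actually sees."""
    return max(layers, key=lambda layer: layer["z"])["name"]

print(top_layer(layers))
```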
  • the method further includes: when the first format is a preset type format, the image processing unit scaling the video layer picture represented by the video frame data of the first format so that the size of the scaled video layer picture matches the size of the display unit, and the display unit displaying the scaled video layer picture.
  • the image processing unit can directly process the video frame data of the first format.
  • an embodiment of the present application provides a terminal, including at least one processor, an image processing unit, a screen, a memory, and a bus.
  • the processor, the image processing unit, the screen, and the memory are connected by a bus.
  • the memory is used to store instructions.
  • the processor is configured to execute the instructions to: when the first format corresponding to the decoded video frame data is not a preset type format, convert the first format into a second format, where the video frame data is used to represent the video layer picture, the second format is a preset type format, and the preset type format is a format that the image processing unit can recognize.
  • the image processing unit is configured to execute the instructions to: convert the second format into a target format, the target format being a data frame format used by the display unit for display, and scale the video layer picture represented by the video frame data of the target format so that the size of the scaled video layer picture matches the size of the display unit.
  • the display unit is configured to execute the instructions to: display the scaled video layer picture.
  • the first format corresponds to the first video playing application
  • the video frame data of the first format is data obtained by decoding video source data with the decoder corresponding to the first video playing application.
  • the processor is specifically configured to: convert the first format into the second format according to the first video playing application corresponding to the first format.
  • the terminal saves a correspondence between the first video playing application and a third format, where the third format is a preset type format, and the second format is the third format corresponding to the first video playing application.
  • the video layer is displayed by a surface view (SurfaceView)
  • the processor is further configured to: before converting the first format into the second format, create a new video layer above the corresponding position of the original video layer corresponding to the video frame data, where the new video layer is used to cover the original video layer and corresponds to a new surface view (new SurfaceView).
  • the processor is specifically configured to: convert the first format corresponding to the video frame data in the new SurfaceView into the second format.
  • the processor is specifically configured to: convert the first format corresponding to the video frame data in the new SurfaceView into the second format by using the native surface module (nativeSurface) of the new SurfaceView.
  • the image processing unit is specifically configured to: convert, in the new SurfaceView, the second format corresponding to the video frame data into the target format, and scale the new video layer picture represented by the video frame data that has been converted into the target format.
  • the original video layer corresponds to an original surface view (original SurfaceView)
  • the display unit is configured to stop displaying the original video layer picture corresponding to the original SurfaceView.
  • the processor is further configured to insert the new SurfaceView into the view hierarchy (ViewHierarchy) where the original SurfaceView is located.
  • the display unit is specifically configured to: display the scaled target video layer picture, where the target video layer picture is the new video layer picture represented by the video frame data in the new SurfaceView that has been converted into the target format.
  • the second format is a YUV type format.
  • an embodiment of the present application provides an apparatus in the form of a chip product.
  • the apparatus includes a processor and a memory, the memory being coupled to the processor and storing the program instructions and data necessary for the apparatus; the processor is configured to execute the program instructions stored in the memory so that the apparatus performs the data processing functions in the above method.
  • an embodiment of the present application provides a computer readable storage medium, where the storage medium includes instructions that, when run on a terminal, cause the terminal to perform the data processing method according to the first aspect or any implementation of the first aspect.
  • the embodiment of the present application provides a computer program product including instructions that, when run on a terminal, cause the terminal to perform the data processing method according to the first aspect or any implementation of the first aspect.
  • FIG. 2 is a schematic structural diagram of a mobile phone according to an embodiment of the present application.
  • FIG. 3 is a flowchart of a data processing method according to an embodiment of the present application.
  • FIG. 5 is a flowchart of another data processing method according to an embodiment of the present application.
  • FIG. 6 is a flowchart of another data processing method according to an embodiment of the present application.
  • FIG. 7 is a flowchart of another data processing method according to an embodiment of the present application.
  • FIG. 8 is a schematic structural diagram of a terminal according to an embodiment of the present application.
  • FIG. 9 is a schematic structural diagram of another terminal according to an embodiment of the present application.
  • in the prior art, the format of the video frame data is converted by the GPU into a target format that can be directly used for display on the display unit, and the video layer picture is scaled to a size matching the display unit; this consumes more power and reduces the user experience.
  • the data processing method provided by the embodiment of the present application can reduce the power consumption during video playback.
  • the main principle is: using a simpler, lower-power image processing unit in place of the GPU for part of the format conversion operation and the scaling operation.
  • the data processing method provided by the embodiment of the present application can be applied to a terminal capable of video playback.
  • the terminal here may be a mobile phone, a tablet computer, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a personal digital assistant (PDA), or the like.
  • the data processing method provided by the embodiment of the present application is introduced by using the terminal as a mobile phone as an example.
  • the components of the mobile phone 100 will be specifically described below with reference to FIG. 2:
  • the mobile phone 100 may include: a screen 101, a processor 102, a memory 103, a power source 104, a radio frequency (RF) circuit 105, a gravity sensor 106, an audio circuit 107, a video card 108, a GPU 109, and an image processing unit 110; these components may be connected by a bus or directly connected.
  • the screen 101 may specifically be a touch display screen or a non-touch display screen, and may be used for interface display, video play, picture browsing, and the like.
  • the processor 102 is the control center of the mobile phone 100; it connects the various parts of the entire phone using various interfaces and lines, and performs the various functions of the mobile phone 100 and processes data by running or executing software programs and/or modules stored in the memory 103 and recalling data stored in the memory 103, thereby monitoring the mobile phone 100 as a whole.
  • processor 102 can include one or more processing units; processor 102 can integrate an application processor and a modem processor.
  • the application processor mainly processes an operating system, a user interface, an application, and the like, and the modem processor mainly processes wireless communication. It will be appreciated that the above described modem processor may also not be integrated into the processor 102.
  • the memory 103 can be used to store data, software programs, and modules, and can be a volatile memory such as a random-access memory (RAM); a non-volatile memory such as a read-only memory (ROM), a flash memory, a hard disk drive (HDD), or a solid-state drive (SSD); or a combination of the above types of memories.
  • program code is stored in the memory 103, and the processor 102 executes the program code to perform the data processing method provided by the embodiment of the present application.
  • the power source 104 which can be a battery, is logically coupled to the processor 102 through a power management system to manage functions such as charging, discharging, and power management through the power management system.
  • the RF circuit 105 can be used for receiving and transmitting signals while transmitting and receiving information or during a call; in particular, it delivers received information to the processor 102 for processing and transmits signals generated by the processor 102.
  • the RF circuit includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier (LNA), a duplexer, and the like.
  • RF circuit 105 can also communicate with the network and other devices via wireless communication.
  • the gravity sensor 106 can collect the acceleration of the mobile phone in various directions (usually on three axes) and the magnitude and direction of gravity when stationary, and can be used for functions that recognize the phone's attitude (such as horizontal/vertical screen switching, related games, and magnetometer attitude calibration) and vibration-recognition-related functions (such as a pedometer or tap detection). It should be noted that the mobile phone 100 may further include other sensors, such as a pressure sensor, a light sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, and the like, and details are not described herein.
  • the audio circuit 107 is used to provide an audio interface between the user and the handset 100.
  • the video card 108 is a device for digital-to-analog signal conversion in the mobile phone and has image processing capability; it can assist the processor 102 and improve the overall running speed of the mobile phone 100.
  • the GPU 109 is the processor of the graphics card; it is a complex arithmetic unit containing a large number of logic arrays, converts and drives the display information required by the mobile phone 100, provides line scan signals to the screen 101 to control its correct display, and can perform the complex mathematical and geometric calculations required for graphics rendering.
  • the image processing unit 110 is a simple hardware logic unit that contains fewer logic arrays, is small in size and low in power consumption, and is generally used to scale an image to a size that matches the size of the screen 101; examples include an MDP unit (Qualcomm) or a DSS unit (HiSilicon). In one implementation, the image processing unit 110 may also be integrated into the processor 102 or other chips of the mobile phone 100.
  • the audio circuit 107, the video card 108, the GPU 109, and the image processing unit 110 can be used to perform video playback in conjunction with the screen 101 and the processor 102.
  • the mobile phone 100 may further include a wireless fidelity (WiFi) module, a Bluetooth module, a camera, and the like, and details are not described herein.
  • in order to make the purpose, technical solution, and advantages of the embodiment of the present application clearer, the data processing method provided by the embodiment of the present application is described in detail below with reference to the specific components in the mobile phone 100 shown in FIG. 2.
  • the steps shown below can also be performed in any terminal other than the mobile phone shown in FIG. 2.
  • although the logical order of the data processing methods provided by the embodiments of the present application is shown in the method flowcharts, in some cases the steps shown or described may be performed in an order different from that herein.
  • an embodiment of the present application provides a data processing method, which may be applied to a terminal, where the terminal may include a central processing unit CPU, an image processing unit, and a display unit.
  • the method may include:
  • 201. When the first format corresponding to the decoded video frame data is not a preset type format, the CPU converts the first format into a second format, where the video frame data is used to represent the video layer picture, the second format is a preset type format, and the preset type format is a format that the image processing unit can recognize.
  • video refers to a continuous sequence of pictures displayed at more than 24 frames per second, and specifically consists of frame-by-frame video data.
  • the video frame data is the video data obtained after decoding the video source data, may be referred to as raw data, and each frame of data may correspond to one video layer picture.
  • the preset type format here is a video frame format that the image processing unit can recognize and directly process, and is usually a general standard video frame format, for example a YUV, CMYK, YCbCr, or RGB type format; each type may specifically include multiple formats.
  • some of the preset video frame formats can be found among the common standard video formats listed in Table 1 below.
  • the CPU may convert the first format.
  • the first format is usually close to a preset type format and is generally a simple variant of it; for example, the first format may add, change, or delete some characters in the video data stream relative to a preset YUV-type format.
  • the CPU converts the first format into the second format quickly and with small delay, for example a few milliseconds.
  • the converting the first format to the second format by the CPU specifically means that the CPU converts the video frame data of the first format into the video frame data of the second format.
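To make the "simple variant" idea concrete: if, hypothetically, the first format were a preset YUV format with an extra vendor marker prepended to each frame, the CPU's conversion would reduce to stripping those bytes, which is far cheaper than a GPU color-space conversion. The `VND0` marker and the byte layout below are invented for illustration only; the patent does not specify any particular variant.

```python
VENDOR_MAGIC = b"VND0"  # hypothetical 4-byte vendor marker (assumption)

def to_second_format(frame_bytes):
    """CPU-side conversion of a hypothetical vendor variant into plain YUV:
    strip the marker if present, otherwise pass the frame through."""
    if frame_bytes.startswith(VENDOR_MAGIC):
        return frame_bytes[len(VENDOR_MAGIC):]  # plain YUV payload remains
    return frame_bytes

print(to_second_format(b"VND0" + b"\x10\x80\x80"))
```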
  • the CPU may determine, by using a framework-layer module, whether the first format can be recognized by the image processing unit.
  • the image processing unit converts the second format into a target format, where the target format is a data frame format that the display unit can use for display.
  • the target format is a data frame format that the display unit can directly use for display, and can usually be an RGB type format.
  • the image processing unit may specifically convert the second format into a target format that is adapted to the characteristics of the display unit according to the characteristics of the display unit.
  • for example, the target format may be the RGB888 format; when the display unit is an enhanced color screen, the target format may be the RGB555 format.
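The difference between RGB888 and RGB555 is just per-channel bit depth (8 bits vs 5 bits per channel). A minimal sketch of packing an RGB888 pixel into a 15-bit RGB555 word, shown only to illustrate what such a target-format conversion involves (the patent does not give this procedure):

```python
def rgb888_to_rgb555(r, g, b):
    """Pack an 8-bit-per-channel pixel into a 15-bit RGB555 word
    by keeping the top 5 bits of each channel."""
    return ((r >> 3) << 10) | ((g >> 3) << 5) | (b >> 3)

print(hex(rgb888_to_rgb555(255, 255, 255)))  # white -> 0x7fff
```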
  • the image processing unit scales the video layer picture represented by the video frame data of the target format, so that the size of the scaled video layer picture matches the size of the display unit.
  • the video layer picture represented by the video frame data may also be scaled so that its size matches the size of the display unit, in order for the display unit to display the scaled video layer picture.
  • in the embodiment of the present application, the CPU may convert the first format into a second format that is close to the first format, and the image processing unit may convert the second format into the target format that can be directly used for display and scale the video layer picture to the screen size to facilitate display by the display unit.
  • when adopting the prior-art solution, the terminal converts the first format into the target format through the GPU. Since the first format and the target format usually differ greatly, the first format is usually not a simple variant of the target format, and thus the GPU usually needs to perform complex processing to convert the first format into the target format. The GPU also needs to scale the video layer picture to the screen size. Since the GPU is a complex arithmetic unit containing a large number of logic arrays, running the GPU consumes more power.
  • the process of converting the first format into the target format may be divided into two parts, that is, the CPU converts the first format into the second format, and the image processing unit converts the second format into the target format.
  • the CPU converts the first format into the second format
  • the image processing unit converts the second format into the target format.
  • regarding the first part: compared with the GPU, although the CPU is also a complex arithmetic unit, the CPU is only used to convert the first format into a second format similar to the first format, so the conversion can be realized with simple processing, runs fast, and has small delay.
  • regarding the second part: compared with running the GPU, running the image processing unit to convert the second format into the target format saves power. It can be seen that the CPU-plus-image-processing-unit method provided by the embodiment of the present application for converting the first format into the target format can save power.
  • the prior art is implemented by running the complex arithmetic unit (the GPU), while the present application is implemented by running the simpler image processing unit, thereby saving power.
  • in summary, the embodiment of the present application uses a smaller and simpler image processing unit in place of the GPU to perform part of the format conversion operation and the scaling operation, thereby reducing power consumption and improving the user experience.
  • it should be noted that in the present application the image processing unit scales the video layer picture to the screen size, that is, the video layer picture entering the image processing unit has its original size; in the prior art, the GPU scales the video layer picture to the screen size, that is, the video layer picture entering the image processing unit from the GPU has already been scaled to the screen size, and the image processing unit only performs 1:1 scaling (which can also be understood as no scaling).
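One plausible way to compute the scaled size that "matches the size of the display unit" is an aspect-preserving fit; the patent does not specify the scaling rule, so this sketch is an assumption made here for illustration:

```python
def fit_to_screen(src_w, src_h, scr_w, scr_h):
    """Aspect-preserving scale of a video layer picture to the screen.
    Returns the scaled (width, height)."""
    scale = min(scr_w / src_w, scr_h / src_h)
    return round(src_w * scale), round(src_h * scale)

# A 720p picture scaled to a 1080p screen fills it exactly;
# a picture already at screen size comes out unchanged (the 1:1 case).
print(fit_to_screen(1280, 720, 1920, 1080))
print(fit_to_screen(1920, 1080, 1920, 1080))
```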
  • the method provided by the embodiment of the present application may further include:
  • the display unit displays the scaled video layer screen.
  • the video frame data corresponding to the video layer picture herein is a target format.
  • the display unit here may be a screen of the terminal, such as a liquid crystal display (LCD) screen or an organic light-emitting diode (OLED) screen.
  • in the embodiment of the present application, "display unit" and "screen" have the same meaning and are used interchangeably.
  • the display unit can display the scaled video layer picture, and it displays the video layer pictures at a frequency greater than 24 frames per second for video playback.
  • for ease of description, the process of scaling the video layer picture so that its size matches the size of the display unit is referred to as scaling the video layer picture to the screen size or to the matching size.
  • the first format in step 201 may correspond to the first video playing application, and the video frame data in the first format is data obtained by decoding the video source data with a decoder corresponding to the first video playing application.
  • the video frame data here is the video data obtained by decoding with the decoder corresponding to the video playing application, not the system decoder of the Android system; the decoder may specifically be a software decoder or a hardware decoder.
  • the decoders corresponding to different video playing applications may be different.
  • the format of the video frame data obtained by decoding corresponding to different video playing applications may also be different, that is, the format of the decoded video frame data corresponds to the video playing application.
  • the video playback application 1 corresponds to a video frame format 1 (first format)
  • the video playback application 2 corresponds to a video frame format 2 (first format).
  • the CPU converting the first format into the second format may include:
  • the CPU converts the first format into the second format according to the first video playing application corresponding to the first format.
  • the first format is the format of the video data obtained by decoding with the decoder corresponding to the first video playing application, and thus the CPU may convert the first format into the second format according to the first video playing application corresponding to the first format.
  • the terminal may save the correspondence between the first video playing application and the third format, where the third format is a preset type format and the second format is the third format corresponding to the first video playing application. Therefore, step 2011 may specifically include: the CPU converting the first format into the third format corresponding to the first video playing application, the third format being the second format.
  • the video playback application 1 corresponds to video frame format 3 (third format)
  • the video playback application 2 corresponds to video frame format 4 (third format).
  • the first format is the video frame format 2
  • the first video playing application corresponding to the video frame format 2 is the video playing application 2
  • the third format corresponding to the video playing application 2 is the video frame format 4, and thus the CPU can convert video frame format 2 (the first format) into video frame format 4 (the second format).
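The two correspondence tables in the example above (decoded first format → video playing application, application → saved third format) amount to a pair of lookups. The sketch below illustrates this; the application and format names are hypothetical placeholders, not values from this document:

```python
# saved on the terminal: video playing application -> third format
# (a format in the preset class that the image processing unit recognizes)
APP_TO_THIRD_FORMAT = {
    "video_app_1": "video_frame_format_3",
    "video_app_2": "video_frame_format_4",
}

# decoded first format -> video playing application that produced it
FIRST_FORMAT_TO_APP = {
    "video_frame_format_1": "video_app_1",
    "video_frame_format_2": "video_app_2",
}

def choose_second_format(first_format: str) -> str:
    """Map a decoded first format to the second (conversion target) format."""
    app = FIRST_FORMAT_TO_APP[first_format]   # identify the source application
    return APP_TO_THIRD_FORMAT[app]           # its third format is the second format

print(choose_second_format("video_frame_format_2"))  # video_frame_format_4
```

In this scheme, updating the terminal's saved table (e.g. through a software upgrade, as described below) only changes the dictionary contents, not the lookup logic.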
  • the terminal may also store correspondences between other video playback applications and video frame formats in the preset class, and the terminal may also update and modify the correspondences between video playback applications and video frame formats saved in the terminal through software upgrades and the like.
  • the correspondence between the first video playing application and the third format may be determined in a testing phase.
  • the format in the preset class format may be referred to as a fourth format.
  • the first format of the video frame data obtained by decoding by the decoder corresponding to the first video playing application may be converted into a preset type format.
  • if the image processing unit can convert the fourth format into the target format, and the display unit can display the video layer picture represented by the video frame data of the target format such that the screen display condition is met, it can be determined that the fourth format belongs to the preset class format. The third format may then be determined as the format in the preset class whose corresponding video layer picture has the highest resolution; or, the third format may be determined as any format in the preset class. Furthermore, the correspondence between the first video playing application corresponding to the first format and the determined third format may be saved.
  • the screen display conditions can be set according to actual needs. For example, a pixel on the display unit that can normally display an image is greater than or equal to a preset value.
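As an illustration of such a screen display condition, a check that the count of normally displayed pixels reaches a preset value might look as follows; the threshold and the pixel counts are assumptions for illustration, not values from this document:

```python
def meets_display_condition(normal_pixels: int, preset_value: int) -> bool:
    """Screen display condition: the number of pixels on the display unit
    that can normally display the image must be >= a preset value."""
    return normal_pixels >= preset_value

# e.g. on a 1080x1920 panel, require at least 99% of pixels to display normally
total = 1080 * 1920
print(meets_display_condition(total, int(total * 0.99)))  # True
```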
  • the first format may be converted into a fourth format in the preset class; if the video frame data converted into this fourth format satisfies the screen display condition, the fourth format is determined to belong to the preset class format.
  • otherwise, the saved correspondence between the video playback application corresponding to the first format and the third format may be incorrect, and that correspondence may be deleted at this time.
  • the video layer may be displayed through a surface view surfaceview.
  • surfaceview has a separate drawing surface, which does not share the same drawing surface with its host window.
  • the surfaceview interface can be drawn in a separate thread without taking up the main thread resources, enabling a complex and efficient interface.
  • the method may further include:
  • the CPU creates a new video layer on top of the corresponding position of the original video layer corresponding to the video frame data, where the new video layer is used to cover the original video layer, and the new video layer corresponds to a new surface view new surfaceview.
  • the CPU may create the new video layer at the corresponding position of the original video layer by setting the width and height of the new video layer to be the same as the width and height of the original video layer. Moreover, the CPU can place the new video layer above the original video layer by setting the z-order (zorder) of the new video layer. In this way, the new video layer can completely cover the original video layer.
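The covering rule described above — same position and size, higher z-order — can be sketched as follows; the field names and coordinate values are assumptions for illustration:

```python
from dataclasses import dataclass

@dataclass
class Layer:
    x: int
    y: int
    width: int
    height: int
    zorder: int  # larger zorder is drawn on top

def make_covering_layer(original: Layer) -> Layer:
    """Create a new layer with the same geometry, one z-level above the original."""
    return Layer(original.x, original.y, original.width,
                 original.height, original.zorder + 1)

def fully_covers(new: Layer, old: Layer) -> bool:
    """The new layer completely covers the old one: identical position
    and size, and drawn above it."""
    return (new.x == old.x and new.y == old.y and
            new.width == old.width and new.height == old.height and
            new.zorder > old.zorder)
```

For example, `fully_covers(make_covering_layer(orig), orig)` holds for any original layer, matching the requirement that the new video layer completely cover the original one.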
  • the CPU acquires video frame data through an open graphics library opengl interface.
  • the CPU fills the new surface view new surfaceview with the video frame data obtained through the open graphics library opengl interface.
  • the video frame data in the new surface view is the video frame data corresponding to the new video layer
  • the video frame data corresponding to the new video layer is the video frame data corresponding to the original video layer
  • since the original video layer and the original surfaceview corresponding to the original video layer are created by the video playing application, the terminal side cannot easily modify or control them; therefore, the CPU on the terminal side can create a new surfaceview and a new video layer to replace the original surfaceview and the original video layer, so that the terminal components can perform the format conversion and scaling by controlling the new surfaceview and the new video layer, to achieve the purpose of reducing power consumption.
  • the replacement is not perceived by the user, and the resolution, the frame rate, the playback speed, and the displayed content are not affected, and thus the user experience is not reduced.
  • the CPU may convert the first format into the second format, which may include:
  • the CPU converts the first format corresponding to the video frame data in the new surface view new surfaceview into the second format.
  • that is, the CPU can convert the first format corresponding to the video frame data of the new video layer in the new surface view new surfaceview into the second format.
  • the converting, by the CPU, the first format corresponding to the video frame data in the new surface view new surfaceview into the second format may include:
  • the CPU converts the first format corresponding to the video frame data in the new surface view new surfaceview into the second format through the underlying module nativesurface of the new surface view new surfaceview.
  • the underlying module nativesurface is used to support the function of the new surfaceview.
  • the CPU may first obtain the video frame data from the opengl interface through the nativesurface (data catch), then copy the acquired video frame data (data share), and then convert the first format corresponding to the video frame data into the second format (data adjust).
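The data catch → data share → data adjust sequence can be sketched as three small steps; the dict-based frame representation and function names are assumptions for illustration, not the patent's actual interfaces:

```python
def data_catch(opengl_source):
    """Obtain the latest video frame data from the (simulated) opengl interface."""
    return opengl_source()

def data_share(frame: dict) -> dict:
    """Copy the acquired frame so that conversion does not touch the
    original buffer owned by the video playing application."""
    return dict(frame)

def data_adjust(frame: dict, second_format: str) -> dict:
    """Convert the frame's first format into the second format."""
    frame["format"] = second_format
    return frame

def process_frame(opengl_source, second_format: str) -> dict:
    """Run catch -> share -> adjust on one frame."""
    return data_adjust(data_share(data_catch(opengl_source)), second_format)
```

Copying before adjusting is what keeps the conversion invisible to the source: the original frame retains its first format while the shared copy carries the second format onward.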
  • in step 2010, the image processing unit can process the video data in the new surface view new surfaceview.
  • the foregoing step 202 may specifically include:
  • the image processing unit converts the second format corresponding to the video frame data in the new surface view new surfaceview into the target format.
  • the foregoing step 203 may specifically include:
  • the image processing unit scales a new video layer image represented by the video frame data that has been converted into the target format in the new surface view new surfaceview.
  • the original video layer may correspond to the original surface view.
  • the method may further include:
  • the display unit stops displaying the original video layer picture corresponding to the original surface view original surfaceview.
  • originally, the CPU sends the video frame data corresponding to the original video layer to the GPU, the image processing unit, and the display unit for processing. After the new video layer is created, the original video layer picture no longer needs to be displayed, so the CPU may stop sending the video frame data in the original surfaceview to the GPU, the image processing unit, and the display unit for processing; the display unit may then stop displaying the original video layer picture and display the new video layer picture in the new surface view new surfaceview.
  • the CPU can stop displaying the original video layer by setting the transparency of the new video layer to be non-transparent.
  • when the transparency of the new video layer is non-transparent, the original video layer is completely occluded; the terminal will not continue the display processing flow for the original video layer, and the display unit will no longer display the original video layer picture.
  • the CPU does not need to set the transparency of the new video layer to be non-transparent.
  • the CPU inserts the new surface view new surfaceview into the view hierarchy viewhierarchy where the original surfaceview is located.
  • a viewhierarchy is a tree structure of views, including all views and the attributes of each view, such as the view's width, height, and z-axis coordinate information. Based on this information, the layout of the entire screen can be obtained. A viewhierarchy can correspond to a video playback application.
  • the CPU can insert the new surface view new surfaceview into the view hierarchy viewhierarchy where the original surfaceview is located, so that the new surfaceview can be merged with the other views on the screen; thus, when the new surfaceview replaces the original surfaceview corresponding to the original video layer, it does not affect the overall layout of the screen and does not change the normal display of the screen.
  • step 209 may be performed after step 205 and before step 2040.
  • the above step 204 may specifically include:
  • the display unit displays the scaled target video layer picture, and the target video layer picture is a new video layer picture represented by the video frame data of the target format in the new surface view new surfaceview.
  • the method may further include:
  • the image processing unit scales the video layer picture represented by the video frame data, so that the size of the scaled video layer picture matches the size of the display unit.
  • the display unit displays the scaled video layer screen.
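The scaling step above can be sketched as a fit of the video layer picture to the display size. Preserving the aspect ratio is an assumption here (the text only requires that the scaled picture match the display), as are the example dimensions:

```python
def scale_to_display(video_w: int, video_h: int,
                     disp_w: int, disp_h: int) -> tuple:
    """Return the scaled (width, height) of the video layer picture so
    that it fits the display unit while preserving aspect ratio."""
    ratio = min(disp_w / video_w, disp_h / video_h)
    return round(video_w * ratio), round(video_h * ratio)

# a 1920x1080 video on a 1080x2340 portrait display scales to 1080x608
print(scale_to_display(1920, 1080, 1080, 2340))  # (1080, 608)
```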
  • the image processing unit can directly process the video frame data of the first format.
  • the video frame data obtained by decoding by the system decoder may be directly sent to the image processing unit for processing.
  • the method can also include:
  • the display unit displays a bullet-screen (danmaku) layer and/or a logo layer.
  • the terminal includes corresponding hardware structures and/or software modules for performing various functions.
  • the present application can be implemented in hardware, or in a combination of hardware and computer software, in combination with the algorithm steps of the examples described in the embodiments disclosed herein. Whether a function is implemented in hardware or in computer software driving hardware depends on the specific application and the design constraints of the solution. A person skilled in the art may use different methods to implement the described functions for each particular application, but such implementation should not be considered beyond the scope of the present application.
  • the embodiment of the present application may divide the function module into the terminal according to the foregoing method example.
  • each function module may be divided according to each function, or two or more functions may be integrated into one processing module.
  • the above integrated modules can be implemented in the form of hardware or in the form of software functional modules. It should be noted that the division of the module in the embodiment of the present application is schematic, and is only a logical function division, and the actual implementation may have another division manner.
  • FIG. 8 is a schematic diagram showing a possible composition of the terminal 30 involved in the foregoing embodiment.
  • the terminal 30 may include: at least one processor 31, an image processing unit 32, a screen 33, a memory 34, and a bus 35.
  • the processor 31, the image processing unit 32, the screen 33, and the memory 34 are connected by a bus 35.
  • the memory 34 is used to store instructions.
  • the processor 31 is configured to execute an instruction to: convert the first format into the second format when the first format corresponding to the decoded video frame data is not the preset class format.
  • the video frame data is used to represent the video layer picture
  • the second format is a preset type format
  • the preset class format is a format that the image processing unit can recognize.
  • the image processing unit 32 is configured to execute an instruction to: convert the second format into a target format, the target format is a data frame format used by the display unit for display, and scale the video layer image represented by the video frame data of the target format, so that The size of the scaled video layer picture matches the size of the display unit.
  • the screen 33 is for supporting the terminal to execute step 204 in the data processing method shown in FIG. 5, step 208 and step 2040 in FIG.
  • the processor 31 is further configured to support the terminal to perform steps 205-207, step 2010, and step 209 in the data processing method shown in FIG. 6.
  • the image processing unit 32 can also be used to support the terminal to perform steps 2020 and 2030 in the data processing method shown in FIG. 6.
  • the terminal provided in the embodiment of the present application is configured to execute the foregoing data processing method, so that the same effect as the data processing method described above can be achieved.
  • FIG. 9 shows another possible composition diagram of the terminal 40 involved in the above embodiment.
  • the terminal 40 includes a processing module 41 and a communication module 42.
  • the processing module 41 is configured to control and manage the actions of the terminal 40.
  • the processing module 41 is configured to support the terminal 40 in performing the operations of the processor 31, the image processing unit 32, and the screen 33 shown in FIG. 8, and/or other processes of the techniques described in this document.
  • the communication module 42 is used to support communication between the terminal and other network entities. The terminal may also include a storage module 43 for performing the operations of the memory 34 shown in FIG. 8 and storing the program code and data of the terminal 40.
  • the processing module 41 can be a processor or a controller, which can implement or execute the various illustrative logical blocks, modules, and circuits described in connection with the present disclosure.
  • the processor can also be a combination of computing functions, for example, including one or more microprocessor combinations, a combination of a digital signal processor (DSP) and a microprocessor, and the like.
  • the communication module 42 can be a transceiver, a transceiver circuit, a communication interface, or the like.
  • the disclosed apparatus and method may be implemented in other manners.
  • the device embodiments described above are merely illustrative.
  • the division of modules or units is only a logical function division.
  • in actual implementation there may be another division manner; for example, multiple units or components may be combined or integrated into another device, or some features may be ignored or not executed.
  • the mutual coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection through some interface, device or unit, and may be in an electrical, mechanical or other form.
  • the units described as separate components may or may not be physically separated, and the components displayed as units may be one physical unit or multiple physical units, that is, may be located in one place, or may be distributed to a plurality of different places. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of the embodiment.
  • each functional unit in each embodiment of the present application may be integrated into one processing unit, or each unit may exist physically separately, or two or more units may be integrated into one unit.
  • the above integrated unit can be implemented in the form of hardware or in the form of a software functional unit.
  • An integrated unit can be stored in a readable storage medium if it is implemented as a software functional unit and sold or used as a standalone product.
  • the technical solution of the embodiments of the present application, in essence or in the part contributing to the prior art, may be embodied in the form of a software product, and the software product is stored in a storage medium.
  • a number of instructions are included to cause a device (which may be a microcontroller, chip, etc.) or a processor to perform all or part of the steps of the various embodiments of the present application.
  • the foregoing storage medium includes various media that can store program codes, such as a USB flash drive, a mobile hard disk, a ROM, a RAM, a magnetic disk, or an optical disk.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

Disclosed are a data processing method and a terminal, relating to the technical field of terminals and capable of saving terminal power. The specific solution involves: a terminal comprising a central processing unit (CPU), an image processing unit, and a display unit; when a first format corresponding to decoded video frame data is not a preset class format, the CPU converts the first format into a second format, the video frame data being used to represent a video layer picture, the second format being the preset class format, and the preset class format being a format that the image processing unit can recognize; the image processing unit converts the second format into a target format, the target format being a data frame format used by the display unit for display; and the image processing unit scales the video layer picture represented by the video frame data of the target format, so that the size of the scaled video layer picture matches the size of the display unit. The embodiments of the present application are used for playing video.
PCT/CN2017/100080 2017-03-27 2017-08-31 Data processing method and terminal WO2018176734A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201780033551.XA 2017-03-27 2017-08-31 Data processing method, terminal, and storage medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201710189121.9 2017-03-27
CN201710189121 2017-03-27

Publications (1)

Publication Number Publication Date
WO2018176734A1 true WO2018176734A1 (fr) 2018-10-04

Family

ID=63674060

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/100080 WO2018176734A1 (fr) Data processing method and terminal

Country Status (2)

Country Link
CN (1) CN109196865B (fr)
WO (1) WO2018176734A1 (fr)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111831713A * 2019-04-18 2025-01-28 Alibaba Group Holding Limited Data processing method, apparatus, and device
CN110427094B * 2019-07-17 2021-08-17 Oppo Guangdong Mobile Telecommunications Co., Ltd. Display method and apparatus, electronic device, and computer-readable medium
CN112839252B * 2019-11-25 2023-03-21 Qingdao Hisense Electric Co., Ltd. Display device
CN111083496A * 2019-12-25 2020-04-28 Oppo Guangdong Mobile Telecommunications Co., Ltd. Data processing method and related product
CN112788431A * 2020-12-24 2021-05-11 Sichuan Yuncong Tianfu Artificial Intelligence Technology Co., Ltd. HTML5-based video playing method, apparatus, system, medium, and browser
CN114489556B 2021-05-21 2022-12-09 Honor Device Co., Ltd. Method and device for playing sound
CN113613071B * 2021-07-30 2023-10-20 Shanghai SenseTime Lingang Intelligent Technology Co., Ltd. Image processing method and apparatus, computer device, and storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030001964A1 (en) * 2001-06-29 2003-01-02 Koichi Masukura Method of converting format of encoded video data and apparatus therefor
CN101059797A (zh) * 2006-04-20 2007-10-24 风网科技(北京)有限公司 视频文件自动转换的系统及其方法
CN101073263A (zh) * 2004-06-14 2007-11-14 Rok产品有限公司 媒体播放器
CN105049931A (zh) * 2015-08-10 2015-11-11 合一网络技术(北京)有限公司 对移动终端中非支持格式的视频进行转换的方法及系统
CN105430236A (zh) * 2015-12-22 2016-03-23 北京天诚盛业科技有限公司 图像输出格式快速转换的方法

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8031197B1 (en) * 2006-02-03 2011-10-04 Nvidia Corporation Preprocessor for formatting video into graphics processing unit (“GPU”)-formatted data for transit directly to a graphics memory
US8233527B2 (en) * 2007-05-11 2012-07-31 Advanced Micro Devices, Inc. Software video transcoder with GPU acceleration
CN103841451B * 2012-11-28 2017-09-29 Tencent Technology (Shenzhen) Co., Ltd. Multimedia playing method, apparatus, and terminal
CN103607581A * 2013-08-01 2014-02-26 Guangdong Benzhi Digital Technology Co., Ltd. Three-dimensional-image-based video surveillance image display method
CN103841389B * 2014-04-02 2015-10-21 Beijing QIYI Century Science & Technology Co., Ltd. Video playing method and player

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030001964A1 (en) * 2001-06-29 2003-01-02 Koichi Masukura Method of converting format of encoded video data and apparatus therefor
CN101073263A (zh) * 2004-06-14 2007-11-14 Rok产品有限公司 媒体播放器
CN101059797A (zh) * 2006-04-20 2007-10-24 风网科技(北京)有限公司 视频文件自动转换的系统及其方法
CN105049931A (zh) * 2015-08-10 2015-11-11 合一网络技术(北京)有限公司 对移动终端中非支持格式的视频进行转换的方法及系统
CN105430236A (zh) * 2015-12-22 2016-03-23 北京天诚盛业科技有限公司 图像输出格式快速转换的方法

Also Published As

Publication number Publication date
CN109196865B (zh) 2021-03-30
CN109196865A (zh) 2019-01-11

Similar Documents

Publication Publication Date Title
WO2018176734A1 Data processing method and terminal
US12244820B2 (en) Adaptive transfer function for video encoding and decoding
US20210132779A1 (en) Electronic device and method for configuring display thereof
US11106307B2 (en) Method for low power driving of display and electronic device for performing same
US11050968B2 (en) Method for driving display including curved display area, display driving circuit supporting the same, and electronic device including the same
US9491367B2 (en) Image data processing method and electronic device supporting the same
CN113553014A Application interface display method in multi-window screen projection scenario, and electronic device
WO2021008427A1 Image synthesis method and apparatus, electronic device, and storage medium
US20180220068A1 (en) Foveated camera for video augmented reality and head mounted display
WO2019076274A1 Dynamic image display method and terminal
US9489883B2 (en) Electronic apparatus and method of displaying image thereof
US11051042B2 (en) Image processing device and method
US9781380B2 (en) Method, apparatus and terminal for playing multimedia content
US20160077659A1 (en) Method and apparatus for processing display data in electronic device
CN106933329B Method and apparatus for adapting an energy-saving level on a mobile terminal, and mobile terminal
US12353708B2 (en) Display method, terminal device and non-transitory storage medium
CN114157867B Image processing method and apparatus, electronic device, and storage medium
CN117812274A Image compression method, apparatus, and device
CN107040810A Multimedia projection method, terminal, and display device
CN106168889A Video source encoding method and electronic device
US20130176289A1 (en) Display switch method and portable device thereof
EP4343674A1 Call processing method and related device
CN116957933A Image processing method and apparatus, and computer-readable storage medium
WO2024198633A1 Video switching method and electronic device
CN120475217A Frame rate control method and apparatus, device, storage medium, and vehicle

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17903245

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17903245

Country of ref document: EP

Kind code of ref document: A1